| Interface | Description |
|---|---|
| AnalyserOptions | |
| AnalyserOptions.Builder | |
| AudioBufferOptions | |
| AudioBufferOptions.Builder | |
| AudioBufferSourceOptions | |
| AudioBufferSourceOptions.Builder | |
| AudioContextLatencyCategoryOrDoubleUnion | |
| AudioContextOptions | The AudioContextOptions dictionary is used to specify configuration options when constructing a new AudioContext. |
| AudioContextOptions.Builder | The AudioContextOptions dictionary is used to specify configuration options when constructing a new AudioContext. |
| AudioNodeOptions | The AudioNodeOptions dictionary specifies options that can be used when creating new AudioNode instances. |
| AudioNodeOptions.Builder | The AudioNodeOptions dictionary specifies options that can be used when creating new AudioNode instances. |
| AudioParamDescriptor | The AudioParamDescriptor dictionary of the Web Audio API specifies properties for AudioParam objects. |
| AudioParamDescriptor.Builder | The AudioParamDescriptor dictionary of the Web Audio API specifies properties for AudioParam objects. |
| AudioParamMap.ForEachCallback | |
| AudioParamMap.ForEachCallback2 | |
| AudioParamMap.ForEachCallback3 | |
| AudioProcessingEventHandler | Handles events of type AudioProcessingEvent. |
| AudioProcessingEventInit | |
| AudioProcessingEventInit.Builder | |
| AudioProcessingEventListener | Listener for events of type AudioProcessingEvent. |
| AudioTimestamp | |
| AudioTimestamp.Builder | |
| AudioWorkletNodeOptions | The AudioWorkletNodeOptions dictionary is used to specify configuration options when constructing a new AudioWorkletNode. |
| AudioWorkletNodeOptions.Builder | The AudioWorkletNodeOptions dictionary is used to specify configuration options when constructing a new AudioWorkletNode. |
| AudioWorkletProcessCallback | |
| AudioWorkletProcessorConstructor | |
| BiquadFilterOptions | |
| BiquadFilterOptions.Builder | |
| ChannelMergerOptions | |
| ChannelMergerOptions.Builder | |
| ChannelSplitterOptions | |
| ChannelSplitterOptions.Builder | |
| ConstantSourceOptions | |
| ConstantSourceOptions.Builder | |
| ConvolverOptions | |
| ConvolverOptions.Builder | |
| DecodeErrorCallback | |
| DecodeSuccessCallback | |
| DelayOptions | |
| DelayOptions.Builder | |
| DynamicsCompressorOptions | |
| DynamicsCompressorOptions.Builder | |
| GainOptions | |
| GainOptions.Builder | |
| IIRFilterOptions | |
| IIRFilterOptions.Builder | |
| MediaElementAudioSourceOptions | |
| MediaElementAudioSourceOptions.Builder | |
| MediaStreamAudioSourceOptions | The MediaStreamAudioSourceOptions dictionary provides configuration options used when creating a MediaStreamAudioSourceNode using its constructor. |
| MediaStreamAudioSourceOptions.Builder | The MediaStreamAudioSourceOptions dictionary provides configuration options used when creating a MediaStreamAudioSourceNode using its constructor. |
| MediaStreamTrackAudioSourceOptions | The MediaStreamTrackAudioSourceOptions dictionary is used when specifying options to the MediaStreamTrackAudioSourceNode() constructor. |
| MediaStreamTrackAudioSourceOptions.Builder | The MediaStreamTrackAudioSourceOptions dictionary is used when specifying options to the MediaStreamTrackAudioSourceNode() constructor. |
| OfflineAudioCompletionEventHandler | Handles events of type OfflineAudioCompletionEvent. |
| OfflineAudioCompletionEventInit | |
| OfflineAudioCompletionEventInit.Builder | |
| OfflineAudioCompletionEventListener | Listener for events of type OfflineAudioCompletionEvent. |
| OfflineAudioContextOptions | |
| OfflineAudioContextOptions.Builder | |
| OscillatorOptions | |
| OscillatorOptions.Builder | |
| PannerOptions | |
| PannerOptions.Builder | |
| PeriodicWaveConstraints | |
| PeriodicWaveConstraints.Builder | |
| PeriodicWaveOptions | |
| PeriodicWaveOptions.Builder | |
| StereoPannerOptions | |
| StereoPannerOptions.Builder | |
| WaveShaperOptions | |
| WaveShaperOptions.Builder | |

| Class | Description |
|---|---|
| AnalyserNode | The AnalyserNode interface represents a node able to provide real-time frequency and time-domain analysis information. |
| AudioBuffer | The AudioBuffer interface represents a short audio asset residing in memory, created from an audio file using the AudioContext.decodeAudioData() method, or from raw data using AudioContext.createBuffer(). |
| AudioBufferSourceNode | The AudioBufferSourceNode interface is an AudioScheduledSourceNode which represents an audio source consisting of in-memory audio data, stored in an AudioBuffer. |
| AudioContext | The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. |
| AudioContextLatencyCategory.Util | |
| AudioContextState.Util | |
| AudioDestinationNode | The AudioDestinationNode interface represents the end destination of an audio graph in a given context, usually the speakers of your device. |
| AudioListener | The AudioListener interface represents the position and orientation of the unique person listening to the audio scene, and is used in audio spatialization. |
| AudioNode | The AudioNode interface is a generic interface for representing an audio processing module. |
| AudioParam | The Web Audio API's AudioParam interface represents an audio-related parameter, usually a parameter of an AudioNode (such as GainNode.gain). |
| AudioParamMap | The Web Audio API interface AudioParamMap represents a set of multiple audio parameters, each described as a mapping of a DOMString identifying the parameter to the AudioParam object representing its value. |
| AudioParamMap.Entry | |
| AudioProcessingEvent | The Web Audio API AudioProcessingEvent represents events that occur when a ScriptProcessorNode input buffer is ready to be processed. |
| AudioScheduledSourceNode | The AudioScheduledSourceNode interface, part of the Web Audio API, is a parent interface for several types of audio source node interfaces which share the ability to be started and stopped, optionally at specified times. |
| AudioWorklet | The AudioWorklet interface of the Web Audio API is used to supply custom audio processing scripts that execute in a separate thread to provide very low latency audio processing. |
| AudioWorkletGlobal | The AudioWorkletGlobalScope interface of the Web Audio API represents a global execution context for user-supplied code, which defines custom AudioWorkletProcessor-derived classes. |
| AudioWorkletGlobalScope | The AudioWorkletGlobalScope interface of the Web Audio API represents a global execution context for user-supplied code, which defines custom AudioWorkletProcessor-derived classes. |
| AudioWorkletNode | The AudioWorkletNode interface of the Web Audio API represents a base class for a user-defined AudioNode, which can be connected to an audio routing graph along with other nodes. |
| AudioWorkletProcessor | The AudioWorkletProcessor interface of the Web Audio API represents the audio processing code behind a custom AudioWorkletNode. |
| AutomationRate.Util | |
| BaseAudioContext | The BaseAudioContext interface of the Web Audio API acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively. |
| BiquadFilterNode | The BiquadFilterNode interface represents a simple low-order filter, and is created using the BaseAudioContext.createBiquadFilter() method. |
| BiquadFilterType.Util | |
| ChannelCountMode.Util | |
| ChannelInterpretation.Util | |
| ChannelMergerNode | The ChannelMergerNode interface, often used in conjunction with its opposite, ChannelSplitterNode, reunites different mono inputs into a single output. |
| ChannelSplitterNode | The ChannelSplitterNode interface, often used in conjunction with its opposite, ChannelMergerNode, separates the different channels of an audio source into a set of mono outputs. |
| ConstantSourceNode | The ConstantSourceNode interface, part of the Web Audio API, represents an audio source (based upon AudioScheduledSourceNode) whose output is a single, unchanging value. |
| ConvolverNode | The ConvolverNode interface is an AudioNode that performs a linear convolution on a given AudioBuffer, often used to achieve a reverb effect. |
| DelayNode | The DelayNode interface represents a delay line: an AudioNode audio-processing module that causes a delay between the arrival of input data and its propagation to the output. |
| DistanceModelType.Util | |
| DynamicsCompressorNode | The DynamicsCompressorNode interface provides a compression effect, which lowers the volume of the loudest parts of the signal in order to help prevent the clipping and distortion that can occur when multiple sounds are played and multiplexed together at once. |
| GainNode | The GainNode interface represents a change in volume. |
| IIRFilterNode | The IIRFilterNode interface of the Web Audio API is an AudioNode processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone control devices and graphic equalizers. |
| MediaElementAudioSourceNode | The MediaElementAudioSourceNode interface represents an audio source consisting of an HTML5 <audio> or <video> element. |
| MediaStreamAudioDestinationNode | The MediaStreamAudioDestinationNode interface represents an audio destination consisting of a WebRTC MediaStream with a single AudioMediaStreamTrack, which can be used in a similar way to a MediaStream obtained from Navigator.getUserMedia(). |
| MediaStreamAudioSourceNode | The MediaStreamAudioSourceNode interface is a type of AudioNode which operates as an audio source whose media is received from a MediaStream obtained using the WebRTC or Media Capture and Streams APIs. |
| MediaStreamTrackAudioSourceNode | The MediaStreamTrackAudioSourceNode interface is a type of AudioNode which represents a source of audio data taken from a specific MediaStreamTrack obtained through the WebRTC or Media Capture and Streams APIs. |
| OfflineAudioCompletionEvent | The Web Audio API OfflineAudioCompletionEvent interface represents events that occur when the processing of an OfflineAudioContext is terminated. |
| OfflineAudioContext | The OfflineAudioContext interface is an AudioContext representing an audio-processing graph built from AudioNodes linked together. |
| OscillatorNode | The OscillatorNode interface represents a periodic waveform, such as a sine wave. |
| OscillatorType.Util | |
| OverSampleType.Util | |
| PannerNode | The PannerNode interface represents the position and behavior of an audio source signal in space. |
| PanningModelType.Util | |
| PeriodicWave | The PeriodicWave interface defines a periodic waveform that can be used to shape the output of an OscillatorNode. |
| ScriptProcessorNode | The ScriptProcessorNode interface allows the generation, processing, or analysis of audio using JavaScript. |
| StereoPannerNode | The StereoPannerNode interface of the Web Audio API represents a simple stereo panner node that can be used to pan an audio stream left or right. |
| WaveShaperNode | The WaveShaperNode interface represents a non-linear distorter. |

| Annotation Type | Description |
|---|---|
| AudioContextLatencyCategory | |
| AudioContextState | |
| AutomationRate | |
| BiquadFilterType | |
| ChannelCountMode | |
| ChannelInterpretation | |
| DistanceModelType | |
| OscillatorType | |
| OverSampleType | |
| PanningModelType | |
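The node classes listed above follow the Web Audio API's processing-graph model: source nodes feed processing nodes, which feed a destination. As a conceptual sketch only (plain Python illustrating the signal flow, not this binding's API), a chain resembling OscillatorNode → GainNode → AudioDestinationNode can be modeled as:

```python
import math

# Conceptual model of a Web Audio graph, not this binding's API:
# an oscillator source feeds a gain stage before reaching the destination.

SAMPLE_RATE = 44100  # samples per second, analogous to AudioContext.sampleRate

def oscillator(frequency, n_samples):
    """Generate a sine wave, like OscillatorNode with type 'sine'."""
    return [math.sin(2 * math.pi * frequency * i / SAMPLE_RATE)
            for i in range(n_samples)]

def gain(samples, value):
    """Scale every sample, like GainNode with gain.value = value."""
    return [s * value for s in samples]

# A 440 Hz tone attenuated to half volume before "reaching the destination".
tone = oscillator(440.0, 128)
output = gain(tone, 0.5)
assert max(abs(s) for s in output) <= 0.5
```

In the actual API each stage is an AudioNode connected with connect(), and parameters such as the gain value are AudioParam objects that can be automated over time rather than plain numbers.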