@Generated(value="org.realityforge.webtack")
@JsType(isNative=true,
namespace="<global>",
name="BaseAudioContext")
public class BaseAudioContext
extends EventTarget
| Modifier and Type | Field and Description |
|---|---|
| @JsNullable EventHandler | onstatechange |
| Modifier | Constructor and Description |
|---|---|
| protected | BaseAudioContext() |
| Modifier and Type | Method and Description |
|---|---|
| void | addStatechangeListener(EventListener callback) |
| void | addStatechangeListener(EventListener callback, AddEventListenerOptions options) |
| void | addStatechangeListener(EventListener callback, boolean useCapture) |
| AudioWorklet | audioWorklet()<br>The audioWorklet read-only property of the BaseAudioContext interface returns an instance of AudioWorklet that can be used to register AudioWorkletProcessor-derived classes implementing custom audio processing. |
| @JsNonNull AnalyserNode | createAnalyser()<br>Creates an AnalyserNode, which exposes audio time- and frequency-domain data for building visualisations. |
| @JsNonNull BiquadFilterNode | createBiquadFilter()<br>Creates a BiquadFilterNode, a second-order filter configurable as several common filter types. |
| @JsNonNull AudioBuffer | createBuffer(int numberOfChannels, int length, float sampleRate)<br>Creates an AudioBuffer with the given channel count, length in sample frames, and sample rate. |
| @JsNonNull AudioBufferSourceNode | createBufferSource()<br>Creates an AudioBufferSourceNode, which plays audio data held in an AudioBuffer. |
| @JsNonNull ChannelMergerNode | createChannelMerger()<br>Creates a ChannelMergerNode, which combines several mono inputs into one multi-channel output. |
| @JsNonNull ChannelMergerNode | createChannelMerger(int numberOfInputs)<br>Creates a ChannelMergerNode with the given number of inputs. |
| @JsNonNull ChannelSplitterNode | createChannelSplitter()<br>Creates a ChannelSplitterNode, which separates the channels of its input into individual mono outputs. |
| @JsNonNull ChannelSplitterNode | createChannelSplitter(int numberOfOutputs)<br>Creates a ChannelSplitterNode with the given number of outputs. |
| @JsNonNull ConstantSourceNode | createConstantSource()<br>The createConstantSource() method of the BaseAudioContext interface creates a ConstantSourceNode, an audio source that continuously outputs a monaural (one-channel) signal whose samples all have the same value. |
| @JsNonNull ConvolverNode | createConvolver()<br>Creates a ConvolverNode, commonly used to apply reverb by convolving the signal with an impulse response. |
| @JsNonNull DelayNode | createDelay()<br>Creates a DelayNode, which delays the incoming audio signal by a configurable amount of time. |
| @JsNonNull DelayNode | createDelay(double maxDelayTime)<br>Creates a DelayNode whose delay time can be set to at most maxDelayTime seconds. |
| @JsNonNull DynamicsCompressorNode | createDynamicsCompressor()<br>Creates a DynamicsCompressorNode; compression lowers the volume of the loudest parts of the signal and raises the volume of the softest parts. |
| @JsNonNull GainNode | createGain()<br>Creates a GainNode, which scales the volume of its input by the value of its a-rate GainNode.gain parameter. |
| @JsNonNull IIRFilterNode | createIIRFilter(double[] feedforward, double... feedback)<br>Creates an IIRFilterNode, a general infinite impulse response (IIR) filter that can be configured as various filter types. |
| @JsNonNull IIRFilterNode | createIIRFilter(JsArray<java.lang.Double> feedforward, JsArray<java.lang.Double> feedback)<br>Creates an IIRFilterNode, a general infinite impulse response (IIR) filter that can be configured as various filter types. |
| @JsNonNull OscillatorNode | createOscillator()<br>Creates an OscillatorNode, a source representing a periodic waveform. |
| @JsNonNull PannerNode | createPanner()<br>Creates a PannerNode, which spatializes its input relative to the context's AudioListener (the AudioContext.listener attribute), representing the position and orientation of the person listening to the audio. |
| @JsNonNull PeriodicWave | createPeriodicWave(double[] real, double... imag)<br>Creates a PeriodicWave, which defines a periodic waveform that can shape the output of an OscillatorNode. |
| @JsNonNull PeriodicWave | createPeriodicWave(double[] real, double[] imag, PeriodicWaveConstraints constraints)<br>Creates a PeriodicWave with the given normalization constraints. |
| @JsNonNull PeriodicWave | createPeriodicWave(JsArray<java.lang.Double> real, JsArray<java.lang.Double> imag)<br>Creates a PeriodicWave, which defines a periodic waveform that can shape the output of an OscillatorNode. |
| @JsNonNull PeriodicWave | createPeriodicWave(JsArray<java.lang.Double> real, JsArray<java.lang.Double> imag, PeriodicWaveConstraints constraints)<br>Creates a PeriodicWave with the given normalization constraints. |
| @JsNonNull ScriptProcessorNode | createScriptProcessor()<br>Creates a ScriptProcessorNode for script-based audio processing; deprecated in favour of AudioWorklet. |
| @JsNonNull ScriptProcessorNode | createScriptProcessor(int bufferSize)<br>Creates a ScriptProcessorNode with the given buffer size. |
| @JsNonNull ScriptProcessorNode | createScriptProcessor(int bufferSize, int numberOfInputChannels)<br>Creates a ScriptProcessorNode with the given buffer size and input channel count. |
| @JsNonNull ScriptProcessorNode | createScriptProcessor(int bufferSize, int numberOfInputChannels, int numberOfOutputChannels)<br>Creates a ScriptProcessorNode with the given buffer size and channel counts. |
| @JsNonNull StereoPannerNode | createStereoPanner()<br>Creates a StereoPannerNode, which pans its input left or right using a simple equal-power algorithm. |
| @JsNonNull WaveShaperNode | createWaveShaper()<br>Creates a WaveShaperNode, which applies a non-linear distortion curve to the signal. |
| double | currentTime()<br>The currentTime read-only property returns a double representing an ever-increasing hardware timestamp in seconds that can be used for scheduling audio playback, visualizing timelines, and similar tasks. |
| @JsNonNull Promise<AudioBuffer> | decodeAudioData(ArrayBuffer audioData)<br>Asynchronously decodes audio file data contained in an ArrayBuffer. |
| @JsNonNull Promise<AudioBuffer> | decodeAudioData(ArrayBuffer audioData, DecodeSuccessCallback successCallback)<br>Asynchronously decodes audio file data, invoking successCallback with the result. |
| @JsNonNull Promise<AudioBuffer> | decodeAudioData(ArrayBuffer audioData, DecodeSuccessCallback successCallback, DecodeErrorCallback errorCallback)<br>Asynchronously decodes audio file data, invoking the appropriate callback on success or failure. |
| AudioDestinationNode | destination()<br>Returns the AudioDestinationNode representing the final output of all audio in the context. |
| AudioListener | listener()<br>Returns the AudioListener object used for 3D spatialization. |
| void | removeStatechangeListener(EventListener callback) |
| void | removeStatechangeListener(EventListener callback, boolean useCapture) |
| void | removeStatechangeListener(EventListener callback, EventListenerOptions options) |
| float | sampleRate()<br>Returns a floating point number representing the sample rate, in samples per second, used by all nodes in this audio context. |
| java.lang.String | state()<br>Returns the current state of the context: "suspended", "running", or "closed". |
Methods inherited from class EventTarget:
addEventListener, dispatchEvent, removeEventListener

Methods inherited from class JsObject:
assign, create, defineProperties, defineProperty, entries, freeze, fromEntries, getOwnPropertyDescriptor, getOwnPropertyDescriptors, getOwnPropertyNames, getOwnPropertySymbols, getPrototypeOf, hasOwnProperty, is, isExtensible, isFrozen, isPrototypeOf, isSealed, keys, preventExtensions, propertyIsEnumerable, seal, setPrototypeOf, toString_, valueOf_, values

public @JsNullable EventHandler onstatechange
@JsProperty(name="audioWorklet") @Nonnull public AudioWorklet audioWorklet()
@JsProperty(name="currentTime") public double currentTime()
@JsProperty(name="destination") @Nonnull public AudioDestinationNode destination()
@JsProperty(name="listener") @Nonnull public AudioListener listener()
@JsProperty(name="sampleRate") public float sampleRate()
@JsProperty(name="state") @Nonnull @AudioContextState public java.lang.String state()
public @JsNonNull AnalyserNode createAnalyser()
public @JsNonNull BiquadFilterNode createBiquadFilter()
public @JsNonNull AudioBuffer createBuffer(int numberOfChannels, int length, float sampleRate)
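Note that the length parameter is a count of sample frames, not a duration. A small hypothetical helper (not part of this binding) sketches the usual conversion from seconds:

```java
public class BufferMath {
    // Frames needed to hold `seconds` of audio at `sampleRate` samples per second.
    static int frameCount(double seconds, float sampleRate) {
        return (int) Math.round(seconds * sampleRate);
    }

    public static void main(String[] args) {
        // Half a second at 44.1 kHz needs 22050 frames, regardless of channel count.
        System.out.println(frameCount(0.5, 44100f)); // 22050
    }
}
```

The result would then be passed as the length argument, e.g. createBuffer(2, frameCount(0.5, 44100f), 44100f).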
public @JsNonNull AudioBufferSourceNode createBufferSource()
public @JsNonNull ChannelMergerNode createChannelMerger(int numberOfInputs)
public @JsNonNull ChannelMergerNode createChannelMerger()
public @JsNonNull ChannelSplitterNode createChannelSplitter(int numberOfOutputs)
public @JsNonNull ChannelSplitterNode createChannelSplitter()
public @JsNonNull ConstantSourceNode createConstantSource()
public @JsNonNull ConvolverNode createConvolver()
public @JsNonNull DelayNode createDelay(double maxDelayTime)
public @JsNonNull DelayNode createDelay()
public @JsNonNull DynamicsCompressorNode createDynamicsCompressor()
public @JsNonNull GainNode createGain()
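The GainNode.gain parameter is a linear multiplier, not a decibel value. A hypothetical conversion helper, using the standard audio relation gain = 10^(dB/20) (the math is generic, not part of the binding):

```java
public class GainMath {
    // Convert a level in decibels to the linear multiplier used by GainNode.gain.
    static double dbToLinear(double db) {
        return Math.pow(10.0, db / 20.0);
    }

    public static void main(String[] args) {
        System.out.println(dbToLinear(0.0));   // 1.0 (unity gain)
        System.out.println(dbToLinear(-20.0)); // 0.1
    }
}
```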
public @JsNonNull IIRFilterNode createIIRFilter(@Nonnull JsArray<java.lang.Double> feedforward, @Nonnull JsArray<java.lang.Double> feedback)
@JsOverlay public final @JsNonNull IIRFilterNode createIIRFilter(@Nonnull double[] feedforward, @Nonnull double... feedback)
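The feedforward and feedback arrays are the numerator and denominator coefficients of the filter's transfer function. As an illustration only (standard DSP, not part of the binding), a one-pole lowpass H(z) = (1 - p) / (1 - p·z^-1) with pole p = exp(-2π·fc/fs) could be derived like this:

```java
public class OnePoleLowpass {
    // Returns { feedforward, feedback } for a one-pole lowpass filter.
    // H(z) = (1 - p) / (1 - p z^-1): feedforward = {1-p}, feedback = {1, -p}.
    static double[][] coefficients(double cutoffHz, double sampleRate) {
        double p = Math.exp(-2.0 * Math.PI * cutoffHz / sampleRate);
        return new double[][] { { 1.0 - p }, { 1.0, -p } };
    }

    public static void main(String[] args) {
        double[][] c = coefficients(1000.0, 44100.0);
        // DC gain (sum of feedforward over sum of feedback) is 1.0:
        System.out.println(c[0][0] / (c[1][0] + c[1][1]));
    }
}
```

The two arrays would then be passed as the feedforward and feedback arguments of createIIRFilter.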
public @JsNonNull OscillatorNode createOscillator()
public @JsNonNull PannerNode createPanner()
public @JsNonNull PeriodicWave createPeriodicWave(@Nonnull JsArray<java.lang.Double> real, @Nonnull JsArray<java.lang.Double> imag, @Nonnull PeriodicWaveConstraints constraints)
public @JsNonNull PeriodicWave createPeriodicWave(@Nonnull double[] real, @Nonnull double[] imag, @Nonnull PeriodicWaveConstraints constraints)
public @JsNonNull PeriodicWave createPeriodicWave(@Nonnull JsArray<java.lang.Double> real, @Nonnull JsArray<java.lang.Double> imag)
@JsOverlay public final @JsNonNull PeriodicWave createPeriodicWave(@Nonnull double[] real, @Nonnull double... imag)
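The real and imag arrays hold cosine and sine Fourier coefficients respectively, with index 0 (the DC term) ignored. As an illustration, the classic band-limited square wave uses sine terms of 4/(πk) for odd harmonics k (standard Fourier analysis, not specific to this binding):

```java
public class SquareWaveCoefficients {
    // Sine-term (imag) coefficients of a square wave up to `harmonics`.
    // Odd harmonics get 4/(pi*k); even harmonics and the DC slot stay 0.
    static double[] imag(int harmonics) {
        double[] out = new double[harmonics + 1];
        for (int k = 1; k <= harmonics; k++) {
            if (k % 2 == 1) {
                out[k] = 4.0 / (Math.PI * k);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Fundamental is 4/pi; second harmonic is absent.
        double[] im = imag(4);
        System.out.println(im[1]);
        System.out.println(im[2]); // 0.0
    }
}
```

Paired with an all-zero real array of the same length, this array would be passed to createPeriodicWave(real, imag).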
public @JsNonNull ScriptProcessorNode createScriptProcessor(int bufferSize, int numberOfInputChannels, int numberOfOutputChannels)
public @JsNonNull ScriptProcessorNode createScriptProcessor(int bufferSize, int numberOfInputChannels)
public @JsNonNull ScriptProcessorNode createScriptProcessor(int bufferSize)
public @JsNonNull ScriptProcessorNode createScriptProcessor()
public @JsNonNull StereoPannerNode createStereoPanner()
public @JsNonNull WaveShaperNode createWaveShaper()
public @JsNonNull Promise<AudioBuffer> decodeAudioData(@Nonnull ArrayBuffer audioData, @Nullable DecodeSuccessCallback successCallback, @Nullable DecodeErrorCallback errorCallback)
public @JsNonNull Promise<AudioBuffer> decodeAudioData(@Nonnull ArrayBuffer audioData, @Nullable DecodeSuccessCallback successCallback)
public @JsNonNull Promise<AudioBuffer> decodeAudioData(@Nonnull ArrayBuffer audioData)
@JsOverlay public final void addStatechangeListener(@Nonnull EventListener callback, @Nonnull AddEventListenerOptions options)
@JsOverlay public final void addStatechangeListener(@Nonnull EventListener callback, boolean useCapture)
@JsOverlay public final void addStatechangeListener(@Nonnull EventListener callback)
@JsOverlay public final void removeStatechangeListener(@Nonnull EventListener callback, @Nonnull EventListenerOptions options)
@JsOverlay public final void removeStatechangeListener(@Nonnull EventListener callback, boolean useCapture)
@JsOverlay public final void removeStatechangeListener(@Nonnull EventListener callback)