The BaseAudioContext interface acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively. You wouldn't use BaseAudioContext directly; instead, you'd use its features via one of these two inheriting interfaces.

A BaseAudioContext can be a target of events; it therefore implements the EventTarget interface.
BaseAudioContext.audioWorklet Read only
Returns the AudioWorklet object, which can be used to create and manage AudioNodes in which JavaScript code implementing the AudioWorkletProcessor interface is run in the background to process audio data.

BaseAudioContext.currentTime Read only
Returns a double representing an ever-increasing hardware time in seconds, used for scheduling. It starts at 0.

BaseAudioContext.destination Read only
Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.

BaseAudioContext.listener Read only
Returns the AudioListener object, used for 3D spatialization.

BaseAudioContext.sampleRate Read only
Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample rate of an AudioContext cannot be changed.

BaseAudioContext.state Read only
Returns the current state of the AudioContext.

BaseAudioContext.onstatechange
An event handler that runs when an event of type statechange has fired. This occurs when the AudioContext's state changes, due to the calling of one of the state change methods (AudioContext.suspend, AudioContext.resume, or AudioContext.close).

Also implements methods from the interface EventTarget.
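The state and onstatechange members above work together: the handler fires whenever suspend(), resume(), or close() changes the context's state. A minimal sketch of such a handler follows; describeState is a hypothetical helper name, not part of the Web Audio API.

```javascript
// Map a BaseAudioContext state string to a human-readable description.
// describeState is a hypothetical helper, not part of the Web Audio API.
function describeState(state) {
  var descriptions = {
    suspended: "clock stopped; call resume() to continue",
    running: "audio is being processed",
    closed: "context released; create a new one to play audio",
  };
  return descriptions[state] || "unknown state: " + state;
}

// In a browser, attach it to a real context's statechange event:
if (typeof AudioContext !== "undefined") {
  var ctx = new AudioContext();
  ctx.onstatechange = function () {
    console.log(ctx.state + ": " + describeState(ctx.state));
  };
}
```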
BaseAudioContext.createAnalyser()
Creates an AnalyserNode, which can be used to expose audio time and frequency data and, for example, to create data visualisations.

BaseAudioContext.createBiquadFilter()
Creates a BiquadFilterNode, which represents a second-order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.

BaseAudioContext.createBuffer()
Creates a new, empty AudioBuffer object, which can then be populated by data and played via an AudioBufferSourceNode.

BaseAudioContext.createBufferSource()
Creates an AudioBufferSourceNode, which can be used to play and manipulate audio data contained within an AudioBuffer object. AudioBuffers are created using AudioContext.createBuffer or returned by AudioContext.decodeAudioData when it successfully decodes an audio track.

BaseAudioContext.createConstantSource()
Creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.

BaseAudioContext.createChannelMerger()
Creates a ChannelMergerNode, which is used to combine channels from multiple audio streams into a single audio stream.

BaseAudioContext.createChannelSplitter()
Creates a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.

BaseAudioContext.createConvolver()
Creates a ConvolverNode, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.

BaseAudioContext.createDelay()
Creates a DelayNode, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.

BaseAudioContext.createDynamicsCompressor()
Creates a DynamicsCompressorNode, which can be used to apply acoustic compression to an audio signal.

BaseAudioContext.createGain()
Creates a GainNode, which can be used to control the overall volume of the audio graph.

BaseAudioContext.createIIRFilter()
Creates an IIRFilterNode, which represents a general infinite impulse response (IIR) filter that can be configured to serve as various types of filter.

BaseAudioContext.createOscillator()
Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.

BaseAudioContext.createPanner()
Creates a PannerNode, which is used to spatialise an incoming audio stream in 3D space.

BaseAudioContext.createPeriodicWave()
Creates a PeriodicWave, used to define a periodic waveform that can be used to determine the output of an OscillatorNode.

BaseAudioContext.createScriptProcessor()
Creates a ScriptProcessorNode, which can be used for direct audio processing via JavaScript.

BaseAudioContext.createStereoPanner()
Creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.

BaseAudioContext.createWaveShaper()
Creates a WaveShaperNode, which is used to implement non-linear distortion effects.

BaseAudioContext.decodeAudioData()
Asynchronously decodes audio file data contained in an ArrayBuffer. In this case, the ArrayBuffer is usually loaded from an XMLHttpRequest's response attribute after setting the responseType to arraybuffer. This method only works on complete files, not fragments of audio files.

BaseAudioContext.resume()
Resumes the progression of time in an audio context that has previously been suspended.
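decodeAudioData can also be fed from fetch() rather than XMLHttpRequest, since both yield an ArrayBuffer. The sketch below assumes a complete audio file is reachable at the hypothetical URL "viper.mp3"; swap in your own asset.

```javascript
// Fetch an audio file, decode it, and play it from the start.
// "viper.mp3" is a hypothetical URL used for illustration only.
function loadAndPlay(ctx, url) {
  return fetch(url)
    .then(function (response) { return response.arrayBuffer(); })
    .then(function (arrayBuffer) { return ctx.decodeAudioData(arrayBuffer); })
    .then(function (audioBuffer) {
      var source = ctx.createBufferSource(); // one-shot playback node
      source.buffer = audioBuffer;
      source.connect(ctx.destination);
      source.start(0);
      return audioBuffer.duration; // length of the decoded audio in seconds
    });
}

// In a browser:
if (typeof AudioContext !== "undefined") {
  loadAndPlay(new AudioContext(), "viper.mp3");
}
```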
Basic audio context declaration:

```js
var audioCtx = new AudioContext();
```

Cross browser variant:

```js
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
```
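Continuing the cross-browser snippet above, the nodes returned by the factory methods are wired together with connect(), and their AudioParams are set against the context's currentTime clock. A minimal sketch follows; the midiToFrequency helper is hypothetical, not part of the API.

```javascript
// Convert a MIDI note number to a frequency in Hz (A4 = note 69 = 440 Hz).
// This helper is illustrative, not part of the Web Audio API.
function midiToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

if (typeof AudioContext !== "undefined") {
  var audioCtx = new AudioContext();
  var oscillatorNode = audioCtx.createOscillator();
  var gainNode = audioCtx.createGain();

  // Wire the graph: oscillator -> gain -> speakers.
  oscillatorNode.connect(gainNode);
  gainNode.connect(audioCtx.destination);

  oscillatorNode.frequency.value = midiToFrequency(69); // 440 Hz
  gainNode.gain.setValueAtTime(0.5, audioCtx.currentTime); // half volume

  oscillatorNode.start();
  oscillatorNode.stop(audioCtx.currentTime + 1); // play for one second
}
```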
| Specification | Status | Comment |
| --- | --- | --- |
| Web Audio API: The definition of 'BaseAudioContext' in that specification. | Working Draft | |
Desktop

| | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari |
| --- | --- | --- | --- | --- | --- | --- |
| Basic support | Yes | Yes | Yes | No | 22 | 6 |
| audioWorklet | ? | ? | ? | ? | ? | ? |
| createAnalyser | 10 | Yes | 25 | No | 22 | 6 |
| createBiquadFilter | 10 | Yes | 25 | No | 22 | 6 |
| createBuffer | 10 | Yes | 25 | No | 22 | 6 |
| createBufferSource | 10 | Yes | 25 | No | 22 | 6 |
| createChannelMerger | 10 | Yes | 25 | No | 22 | 6 |
| createChannelSplitter | 10 | Yes | 25 | No | 22 | 6 |
| createConstantSource | 56 | ? | Yes | No | 43 | ? |
| createConvolver | 10 | Yes | 25 | No | 22 | 6 |
| createDelay | 10 | Yes | 25 | No | 22 | 6 |
| createDynamicsCompressor | 10 | Yes | 25 | No | 22 | 6 |
| createGain | 10 | Yes | 25 | No | 22 | 6 |
| createIIRFilter | 49 | Yes | 50 | ? | ? | ? |
| createOscillator | 10 | Yes | 25 | No | 22 | 6 |
| createPanner | 10 | Yes | 25 | No | 22 | 6 |
| createPeriodicWave | 59 | Yes | 25 | No | 22 | 6 |
| createScriptProcessor | 10 | Yes | 25 | No | 22 | 6 |
| createStereoPanner | 42 | Yes | 37 | No | No | No |
| createWaveShaper | 10 | Yes | 25 | No | 22 | 6 |
| currentTime | 10 | Yes | 25 | No | 22 | 6 |
| decodeAudioData | 10 | Yes | 25 | No | 22 | 6 |
| destination | 10 | Yes | 25 | No | 22 | 6 |
| listener | 10 | Yes | 25 | No | 22 | 6 |
| onstatechange | 43 | ? | 40 | No | ? | ? |
| resume | 41 | ? | 40 | No | ? | ? |
| sampleRate | 10 | Yes | 25 | No | 22 | 6 |
| state | 43 | ? | 40 | No | ? | ? |
Mobile

| | Android webview | Chrome for Android | Edge Mobile | Firefox for Android | Opera for Android | iOS Safari | Samsung Internet |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Basic support | Yes | Yes | Yes | Yes | Yes | No | ? |
| audioWorklet | ? | ? | ? | ? | ? | ? | ? |
| createAnalyser | Yes | 33 | Yes | 26 | Yes | No | ? |
| createBiquadFilter | Yes | 33 | Yes | 26 | Yes | No | ? |
| createBuffer | Yes | 33 | Yes | 26 | Yes | No | ? |
| createBufferSource | Yes | 33 | Yes | 26 | Yes | No | ? |
| createChannelMerger | Yes | 33 | Yes | 26 | Yes | No | ? |
| createChannelSplitter | Yes | 33 | Yes | 26 | Yes | No | ? |
| createConstantSource | 56 | 56 | ? | Yes | 43 | No | ? |
| createConvolver | Yes | 33 | Yes | 26 | Yes | No | ? |
| createDelay | Yes | 33 | Yes | 26 | Yes | No | ? |
| createDynamicsCompressor | Yes | 33 | Yes | 26 | Yes | No | ? |
| createGain | Yes | 33 | Yes | 26 | Yes | No | ? |
| createIIRFilter | 49 | 49 | Yes | 50 | ? | ? | ? |
| createOscillator | Yes | 33 | Yes | 26 | Yes | No | ? |
| createPanner | Yes | 33 | Yes | 26 | Yes | No | ? |
| createPeriodicWave | 59 | 59 | Yes | 26 | Yes | No | ? |
| createScriptProcessor | Yes | 33 | Yes | 26 | Yes | No | ? |
| createStereoPanner | Yes | Yes | Yes | 37 | No | No | ? |
| createWaveShaper | Yes | 33 | Yes | 26 | Yes | No | ? |
| currentTime | Yes | 33 | Yes | 26 | Yes | No | ? |
| decodeAudioData | Yes | 33 | Yes | 26 | Yes | No | ? |
| destination | Yes | 33 | Yes | 26 | Yes | No | ? |
| listener | Yes | 33 | Yes | 26 | Yes | No | ? |
| onstatechange | ? | ? | ? | ? | ? | ? | ? |
| resume | Yes | 41 | ? | Yes | ? | ? | ? |
| sampleRate | Yes | 33 | Yes | 26 | Yes | No | ? |
| state | ? | ? | ? | ? | ? | ? | ? |
© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext