The AudioNode interface is a generic interface for representing an audio processing module. Examples include:

- an audio source (e.g. an HTML <audio> or <video> element, an OscillatorNode, etc.),
- an intermediate processing module (e.g. a filter like BiquadFilterNode, or a ConvolverNode), or
- a volume control (such as a GainNode).

Note: An AudioNode can be the target of events, therefore it implements the EventTarget interface.
Each AudioNode
has inputs and outputs, and multiple audio nodes are connected to build a processing graph. This graph is contained in an AudioContext
, and each audio node can only belong to one audio context.
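The one-context rule above can be sketched as follows; this is an illustrative snippet, and the exact error thrown on a cross-context connection may vary by browser:

```javascript
// Each node belongs to exactly one audio context; nodes from
// different contexts cannot be connected together.
const ctxA = new AudioContext();
const ctxB = new AudioContext();

const osc = new OscillatorNode(ctxA);
osc.connect(ctxA.destination); // fine: both belong to ctxA

// osc.connect(ctxB.destination); // throws (the contexts differ)
```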
A source node has zero inputs but one or multiple outputs, and can be used to generate sound. On the other hand, a destination node has no outputs; instead, all its inputs are directly played back on the speakers (or whatever audio output device the audio context uses). In addition, there are processing nodes which have inputs and outputs. The exact processing done varies from one AudioNode
to another but, in general, a node reads its inputs, does some audio-related processing, and generates new values for its outputs, or simply lets the audio pass through (for example in the AnalyserNode
, where the result of the processing is accessed separately).
The more nodes in a graph, the higher the latency will be. For example, if your graph has a latency of 500ms, when the source node plays a sound, it will take half a second until that sound can be heard on your speakers (or even longer because of latency in the underlying audio device). Therefore, if you need to have interactive audio, keep the graph as small as possible, and put user-controlled audio nodes at the end of a graph. For example, a volume control (GainNode
) should be the last node so that volume changes take immediate effect.
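A minimal sketch of that recommendation, placing the user-controlled GainNode directly before the destination (node names are illustrative):

```javascript
const audioCtx = new AudioContext();

const source = new OscillatorNode(audioCtx);
const filter = new BiquadFilterNode(audioCtx);
const volume = new GainNode(audioCtx); // user-controlled volume, last in the chain

source.connect(filter).connect(volume).connect(audioCtx.destination);

// Because the volume control sits at the end of the graph, this change
// is heard immediately, without first traveling through other nodes.
volume.gain.value = 0.5;
```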
Each input and output has a given amount of channels. For example, mono audio has one channel, while stereo audio has two channels. The Web Audio API will up-mix or down-mix the number of channels as required; check the Web Audio spec for details.
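The channel-related properties described here can be set per node; the following sketch forces a node to mix every input down to mono:

```javascript
const audioCtx = new AudioContext();
const gainNode = new GainNode(audioCtx);

// Mix every connection to this node's input to exactly one channel.
gainNode.channelCount = 1;
gainNode.channelCountMode = "explicit"; // use channelCount as-is
gainNode.channelInterpretation = "speakers"; // mix using standard speaker layouts
```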
For a list of all audio nodes, see the Web Audio API homepage.
Creating an AudioNode

There are two ways to create an AudioNode: via the constructor and via the factory method.
```js
// constructor
const analyserNode = new AnalyserNode(audioCtx, {
  fftSize: 2048,
  maxDecibels: -25,
  minDecibels: -60,
  smoothingTimeConstant: 0.5,
});

// factory method
const analyserNode = audioCtx.createAnalyser();
analyserNode.fftSize = 2048;
analyserNode.maxDecibels = -25;
analyserNode.minDecibels = -60;
analyserNode.smoothingTimeConstant = 0.5;
```
You are free to use either constructors or factory methods, or mix both; however, constructors have the advantage that all of a node's options can be set in a single call at creation time.
Keep in mind that Microsoft Edge does not yet appear to support the constructors; it throws a "Function expected" error when you use them.
Brief history: The first version of the Web Audio spec only defined the factory methods. After a design review in October 2013, it was decided to add constructors because they have numerous benefits over factory methods. The constructors were added to the spec from August to October 2016. Factory methods continue to be included in the spec and are not deprecated.
AudioNode.context Read only
Returns the associated BaseAudioContext, that is the object representing the processing graph the node is participating in.

AudioNode.numberOfInputs Read only
Returns the number of inputs feeding the node. Source nodes are defined as nodes having a numberOfInputs property with a value of 0.

AudioNode.numberOfOutputs Read only
Returns the number of outputs coming out of the node. Destination nodes, like AudioDestinationNode, have a value of 0 for this attribute.

AudioNode.channelCount
Represents an integer used to determine how many channels are used when up-mixing and down-mixing connections to any inputs to the node. Its usage and precise definition depend on the value of AudioNode.channelCountMode.

AudioNode.channelCountMode
Represents an enumerated value describing the way channels must be matched between the node's inputs and outputs.

AudioNode.channelInterpretation
Represents an enumerated value describing the meaning of the channels. This interpretation will define how audio up-mixing and down-mixing will happen. The possible values are "speakers" or "discrete".

Also implements methods from the interface EventTarget.

AudioNode.connect()
Allows us to connect the output of this node to be input into another node, either as audio data or as the value of an AudioParam.

AudioNode.disconnect()
Allows us to disconnect the current node from another one it is already connected to.
This simple snippet of code shows the creation of some audio nodes, and how the AudioNode properties and methods can be used. You can find examples of such usage in any of the examples linked from the Web Audio API landing page (for example, Violent Theremin).
```js
const audioCtx = new AudioContext();

const oscillator = new OscillatorNode(audioCtx);
const gainNode = new GainNode(audioCtx);

oscillator.connect(gainNode).connect(audioCtx.destination);

oscillator.context;         // the owning AudioContext
oscillator.numberOfInputs;  // 0, since OscillatorNode is a source node
oscillator.numberOfOutputs; // 1
oscillator.channelCount;    // 2 by default
```
| Specification | Status | Comment |
| --- | --- | --- |
| Web Audio API: The definition of 'AudioNode' in that specification. | Working Draft | |
Desktop

| | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari |
| --- | --- | --- | --- | --- | --- | --- |
| Basic support | 14 | Yes | 25 | No | 15 | 6 |
| channelCount | 14 | 12 | 25 | No | 15 | 6 |
| channelCountMode | 14 | 12 | 25 | No | 15 | 6 |
| channelInterpretation | 14 | 12 | 25 | No | 15 | 6 |
| context | 14 | 12 | 25 | No | 15 | 6 |
| numberOfInputs | 14 | 12 | 25 | No | 15 | 6 |
| numberOfOutputs | 14 | 12 | 25 | No | 15 | 6 |
| connect | 14 | 12 | 25 | No | 15 | 6 |
| disconnect | 14 | 12 | 25 | No | 15 | 6 |
Mobile

| | Android webview | Chrome for Android | Edge Mobile | Firefox for Android | Opera for Android | iOS Safari | Samsung Internet |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Basic support | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| channelCount | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| channelCountMode | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| channelInterpretation | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| context | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| numberOfInputs | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| numberOfOutputs | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| connect | Yes | 18 | Yes | 26 | 15 | ? | Yes |
| disconnect | Yes | 18 | Yes | 26 | 15 | ? | Yes |
© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/AudioNode