The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing, or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context.
Inheritance: EventTarget → AudioContext
Constructor
AudioContext()
- Creates and returns a new AudioContext object.
Properties
Also inherits properties from its parent interface, BaseAudioContext.
AudioContext.baseLatency Read only
- Returns the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.

AudioContext.outputLatency Read only
- Returns an estimation of the output latency of the current audio context.
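As a sketch of how these two properties might be read in practice (the helper name is our own, and both reads are guarded because neither property is universally implemented, as the compatibility tables below show):

```javascript
// Hypothetical helper (not part of the API): read a context's latency
// figures, guarding each property because support varies across browsers.
function describeLatency(audioCtx) {
  var base =
    typeof audioCtx.baseLatency === "number" ? audioCtx.baseLatency : null;
  var output =
    typeof audioCtx.outputLatency === "number" ? audioCtx.outputLatency : null;
  return { baseLatency: base, outputLatency: output };
}

// In a browser: describeLatency(new AudioContext())
// would return small fractional-second values where supported, null otherwise.
```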
Methods
Also inherits methods from its parent interface, BaseAudioContext.
AudioContext.close()
- Closes the audio context, releasing any system audio resources that it uses.

AudioContext.createMediaElementSource()
- Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.

AudioContext.createMediaStreamSource()
- Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.

AudioContext.createMediaStreamDestination()
- Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.

AudioContext.createMediaStreamTrackSource()
- Creates a MediaStreamTrackAudioSourceNode associated with a MediaStreamTrack representing an audio track of a media stream.

AudioContext.getOutputTimestamp()
- Returns a new AudioTimestamp object containing two correlated audio stream position values: contextTime and performanceTime.

AudioContext.resume()
- Resumes the progression of time in an audio context that has previously been suspended.

AudioContext.suspend()
- Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
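suspend(), resume(), and close() all return Promises that resolve once the context's state has changed. A minimal sketch of a play/pause toggle built on the first two (the function name is ours, not part of the API):

```javascript
// Hypothetical helper: toggle an AudioContext between "running" and
// "suspended". suspend() and resume() both return Promises, and the
// context's state property reflects the change once they resolve.
function togglePlayback(audioCtx) {
  if (audioCtx.state === "running") {
    return audioCtx.suspend().then(function () {
      return audioCtx.state; // "suspended"
    });
  }
  return audioCtx.resume().then(function () {
    return audioCtx.state; // "running"
  });
}
```

Note that close() is terminal: a closed context cannot be resumed, so a toggle like this should use suspend() rather than close() whenever the audio may be needed again.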
Examples
Basic audio context declaration:
```js
var audioCtx = new AudioContext();
```
Cross-browser variant:

```js
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
```
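A common wiring pattern for createMediaElementSource() is to route the element through a GainNode before the destination, so the element's volume can be controlled from script. A sketch, where the helper name and its volume parameter are our own additions:

```javascript
// Hypothetical helper: wire a media element through a GainNode so its
// volume can be set from script. Returns the GainNode for later control.
function wireElementThroughGain(audioCtx, mediaElement, volume) {
  var source = audioCtx.createMediaElementSource(mediaElement);
  var gainNode = audioCtx.createGain();

  source.connect(gainNode);               // element -> gain
  gainNode.connect(audioCtx.destination); // gain -> speakers

  gainNode.gain.value = volume;
  return gainNode;
}

// In a browser:
// var gain = wireElementThroughGain(new AudioContext(),
//                                   document.querySelector("audio"), 0.5);
```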
Specifications
Specification | Status | Comment |
---|---|---|
Web Audio API: The definition of 'AudioContext' in that specification. | Working Draft | |
Browser compatibility
Feature | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari |
---|---|---|---|---|---|---|
Basic support | 35 14 — 57 webkit | Yes | 25 | No | 22 15 — 44 webkit | 6 webkit |
AudioContext() constructor | 55 | Yes | 25 | No | 42 | Yes webkit |
baseLatency | 60 | ? | No | No | 47 | No |
outputLatency | Yes | ? | No | No | Yes | No |
close | 43 | ? | 40 | No | Yes | ? |
createMediaElementSource | 14 | Yes | 25 | No | 15 | 6 |
createMediaStreamSource | 14 | Yes | 25 | No | 15 | 6 |
createMediaStreamDestination | 14 | Yes | 25 | No | 15 | 6 |
createMediaStreamTrackSource | ? | ? | No | No | ? | No |
getOutputTimestamp | 57 | ? | No | No | 44 | No |
suspend | 43 | ? | 40 | No | Yes | ? |
Feature | Android webview | Chrome for Android | Edge mobile | Firefox for Android | IE mobile | Opera Android | iOS Safari |
---|---|---|---|---|---|---|---|
Basic support | Yes | 35 14 — 57 webkit | Yes | 26 | No | 22 15 — 44 webkit | ? |
AudioContext() constructor | 55 | 55 | ? | 25 | No | 42 | ? |
baseLatency | 60 | 60 | ? | No | No | 47 | No |
outputLatency | Yes | Yes | ? | No | No | Yes | ? |
close | 43 | 43 | ? | 40 | No | Yes | ? |
createMediaElementSource | Yes | 14 | Yes | 26 | No | 15 | ? |
createMediaStreamSource | Yes | 14 | Yes | 26 | No | 15 | ? |
createMediaStreamDestination | Yes | 14 | Yes | 26 | No | 15 | ? |
createMediaStreamTrackSource | ? | ? | ? | No | No | ? | No |
getOutputTimestamp | 57 | 57 | ? | No | No | 44 | No |
suspend | 43 | 43 | ? | 40 | No | Yes | ? |