
Revision 912843 of AudioContext

  • Revision slug: Web/API/AudioContext
  • Revision title: AudioContext
  • Revision id: 912843
  • Created:
  • Creator: chrisdavidmills
  • Is current revision? No
  • Comment:

Revision Content

{{APIRef("Web Audio API")}}

The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an {{domxref("AudioNode")}}. An audio context controls both the creation of the nodes it contains and the execution of the audio processing or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context.

An AudioContext can be a target of events; therefore, it implements the {{domxref("EventTarget")}} interface.

Properties

{{domxref("AudioContext.currentTime")}} {{readonlyInline}}
Returns a double representing an ever-increasing hardware time in seconds used for scheduling. It starts at 0.
{{domxref("AudioContext.destination")}} {{readonlyInline}}
Returns an {{domxref("AudioDestinationNode")}} representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.
{{domxref("AudioContext.listener")}} {{readonlyInline}}
Returns the {{domxref("AudioListener")}} object, used for 3D spatialization.
{{domxref("AudioContext.sampleRate")}} {{readonlyInline}}
Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample rate of an {{domxref("AudioContext")}} cannot be changed.
{{domxref("AudioContext.state")}} {{readonlyInline}}
Returns the current state of the AudioContext. (For a quick look at this and the other properties, see the example after this list.)
{{domxref("AudioContext.mozAudioChannelType")}} {{ non-standard_inline() }} {{readonlyInline}}
On a Firefox OS device, returns the audio channel in which sound playing in the {{domxref("AudioContext")}} will play.
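
For example, a minimal sketch of reading these properties; the logged values are illustrative and will vary by device:

var audioCtx = new AudioContext();

console.log(audioCtx.sampleRate);  // e.g. 44100; fixed for the lifetime of the context
console.log(audioCtx.currentTime); // seconds elapsed since the context was created, starting at 0
console.log(audioCtx.state);       // e.g. "running"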

Event handlers

{{domxref("AudioContext.onstatechange")}}
An event handler that runs when an event of type {{event("statechange")}} has fired. This occurs when the AudioContext's state changes as a result of calling one of the state-change methods ({{domxref("AudioContext.suspend")}}, {{domxref("AudioContext.resume")}}, or {{domxref("AudioContext.close")}}).
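
As a sketch, a handler could log each transition; suspend() and resume() both return a Promise that resolves once the transition has completed:

var audioCtx = new AudioContext();

audioCtx.onstatechange = function() {
  // state will be "suspended", "running", or "closed"
  console.log('State changed to: ' + audioCtx.state);
};

// Suspend the context, then resume it once the suspension has taken effect.
audioCtx.suspend().then(function() {
  return audioCtx.resume();
});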

Methods

Also implements methods from the interface {{domxref("EventTarget")}}.

{{domxref("AudioContext.close()")}}
Closes the audio context, releasing any system audio resources that it uses.
{{domxref("AudioContext.createBuffer()")}}
Creates a new, empty {{ domxref("AudioBuffer") }} object, which can then be populated by data and played via an {{ domxref("AudioBufferSourceNode") }}.
{{domxref("AudioContext.createBufferSource()")}}
Creates an {{domxref("AudioBufferSourceNode")}}, which can be used to play and manipulate audio data contained within an {{ domxref("AudioBuffer") }} object. {{ domxref("AudioBuffer") }}s are created using {{domxref("AudioContext.createBuffer")}} or returned by {{domxref("AudioContext.decodeAudioData")}} when it successfully decodes an audio track.
{{domxref("AudioContext.createMediaElementSource()")}}
Creates a {{domxref("MediaElementAudioSourceNode")}} associated with an {{domxref("HTMLMediaElement")}}. This can be used to play and manipulate audio from {{HTMLElement("video")}} or {{HTMLElement("audio")}} elements.
{{domxref("AudioContext.createMediaStreamSource()")}}
Creates a {{domxref("MediaStreamAudioSourceNode")}} associated with a {{domxref("MediaStream")}} representing an audio stream which may come from the local computer microphone or other sources.
{{domxref("AudioContext.createMediaStreamDestination()")}}
Creates a {{domxref("MediaStreamAudioDestinationNode")}} associated with a {{domxref("MediaStream")}} representing an audio stream which may be stored in a local file or sent to another computer.
{{domxref("AudioContext.createScriptProcessor()")}}
Creates a {{domxref("ScriptProcessorNode")}}, which can be used for direct audio processing via JavaScript.
{{domxref("AudioContext.createStereoPanner()")}}
Creates a {{domxref("StereoPannerNode")}}, which can be used to apply stereo panning to an audio source.
{{domxref("AudioContext.createAnalyser()")}}
Creates an {{domxref("AnalyserNode")}}, which can be used to expose audio time and frequency data and for example to create data visualisations.
{{domxref("AudioContext.createBiquadFilter()")}}
Creates a {{domxref("BiquadFilterNode")}}, which represents a second order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.
{{domxref("AudioContext.createChannelMerger()")}}
Creates a {{domxref("ChannelMergerNode")}}, which is used to combine channels from multiple audio streams into a single audio stream.
{{domxref("AudioContext.createChannelSplitter()")}}
Creates a {{domxref("ChannelSplitterNode")}}, which is used to access the individual channels of an audio stream and process them separately.
{{domxref("AudioContext.createConvolver()")}}
Creates a {{domxref("ConvolverNode")}}, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.
{{domxref("AudioContext.createDelay()")}}
Creates a {{domxref("DelayNode")}}, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.
{{domxref("AudioContext.createDynamicsCompressor()")}}
Creates a {{domxref("DynamicsCompressorNode")}}, which can be used to apply acoustic compression to an audio signal.
{{domxref("AudioContext.createGain()")}}
Creates a {{domxref("GainNode")}}, which can be used to control the overall volume of the audio graph.
{{domxref("AudioContext.createOscillator()")}}
Creates an {{domxref("OscillatorNode")}}, a source representing a periodic waveform. It basically generates a tone.
{{domxref("AudioContext.createPanner()")}}
Creates a {{domxref("PannerNode")}}, which is used to spatialise an incoming audio stream in 3D space.
{{domxref("AudioContext.createPeriodicWave()")}}
Creates a {{domxref("PeriodicWave")}}, used to define a periodic waveform that can be used to determine the output of an {{ domxref("OscillatorNode") }}.
{{domxref("AudioContext.createWaveShaper()")}}
Creates a {{domxref("WaveShaperNode")}}, which is used to implement non-linear distortion effects.
{{domxref("AudioContext.createAudioWorker()")}}
Creates an {{domxref("AudioWorkerNode")}}, which can interact with a web worker thread to generate, process, or analyse audio directly. This was added to the spec on August 29 2014, and is not implemented in any browser yet.
{{domxref("AudioContext.decodeAudioData()")}}
Asynchronously decodes audio file data contained in an {{domxref("ArrayBuffer")}}. In this case, the ArrayBuffer is usually loaded from an {{domxref("XMLHttpRequest")}}'s response attribute after setting the responseType to arraybuffer. This method only works on complete files, not fragments of audio files. (See the example after this list.)
{{domxref("AudioContext.resume()")}}
Resumes the progression of time in an audio context that has previously been suspended.
{{domxref("AudioContext.suspend()")}}
Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
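
To illustrate decodeAudioData(), here is a sketch that fetches a file, decodes it, and plays it through a buffer source; the URL 'viper.ogg' is a placeholder:

var audioCtx = new AudioContext();
var request = new XMLHttpRequest();

request.open('GET', 'viper.ogg', true); // placeholder URL
request.responseType = 'arraybuffer';

request.onload = function() {
  audioCtx.decodeAudioData(request.response, function(decodedBuffer) {
    // Play the decoded audio through the context's destination.
    var source = audioCtx.createBufferSource();
    source.buffer = decodedBuffer;
    source.connect(audioCtx.destination);
    source.start(0);
  });
};

request.send();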

Obsolete methods

{{domxref("AudioContext.createJavaScriptNode()")}}
Creates a {{domxref("JavaScriptNode")}}, used for direct audio processing via JavaScript. This method is obsolete, and has been replaced by {{domxref("AudioContext.createScriptProcessor()")}}.
{{domxref("AudioContext.createWaveTable()")}}
Creates a {{domxref("WaveTableNode")}}, used to define a periodic waveform. This method is obsolete, and has been replaced by {{domxref("AudioContext.createPeriodicWave()")}}.

Examples

Basic audio context declaration:

var audioCtx = new AudioContext();

Cross-browser variant:

// Prefer the standard constructor, falling back to the webkit-prefixed
// version used by older WebKit-based browsers.
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
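
Continuing the sketch above, the nodes could then be connected into a graph and started; the gain value and two-second duration are illustrative:

oscillatorNode.connect(gainNode); // oscillator -> gain
gainNode.connect(finish);         // gain -> speakers

gainNode.gain.value = 0.5;                     // illustrative volume
oscillatorNode.start();
oscillatorNode.stop(audioCtx.currentTime + 2); // stop after roughly two seconds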

Specifications

Specification | Status | Comment
{{SpecName('Web Audio API', '#the-audiocontext-interface', 'AudioContext')}} | {{Spec2('Web Audio API')}} |

Browser compatibility

{{CompatibilityTable}}
Feature | Chrome | Firefox (Gecko) | Internet Explorer | Opera | Safari (WebKit)
Basic support | {{CompatChrome(10.0)}}{{property_prefix("webkit")}} / 35 | {{CompatGeckoDesktop(25.0)}} | {{CompatNo}} | 15.0{{property_prefix("webkit")}} / 22 | 6.0{{property_prefix("webkit")}}
createStereoPanner() | {{CompatChrome(42.0)}} | {{CompatGeckoDesktop(37.0)}} | {{CompatNo}} | {{CompatNo}} | {{CompatNo}}
onstatechange, state, suspend(), resume() | {{CompatVersionUnknown}} | {{CompatGeckoDesktop(40.0)}} | {{CompatNo}} | {{CompatNo}} | {{CompatNo}}

Feature | Android | Firefox Mobile (Gecko) | Firefox OS | IE Mobile | Opera Mobile | Safari Mobile | Chrome for Android
Basic support | {{CompatNo}} | {{CompatGeckoMobile(37.0)}} | 2.2 | {{CompatNo}} | {{CompatNo}} | {{CompatNo}} | {{CompatVersionUnknown}}
createStereoPanner() | {{CompatNo}} | {{CompatVersionUnknown}} | {{CompatVersionUnknown}} | {{CompatNo}} | {{CompatNo}} | {{CompatNo}} | {{CompatVersionUnknown}}
onstatechange, state, suspend(), resume() | {{CompatNo}} | {{CompatVersionUnknown}} | {{CompatVersionUnknown}} | {{CompatNo}} | {{CompatNo}} | {{CompatNo}} | {{CompatVersionUnknown}}

See also

  • Using the Web Audio API (/en-US/docs/Web_Audio_API/Using_Web_Audio_API)
  • {{domxref("OfflineAudioContext")}}