
MediaStreamAudioSourceNode


The MediaStreamAudioSourceNode interface represents an audio source consisting of a WebRTC MediaStream (such as a webcam or microphone). It is an AudioNode that acts as an audio source.

A MediaStreamAudioSourceNode has no inputs and exactly one output, and is created using the AudioContext.createMediaStreamSource method. The number of channels in the output equals the number of channels in the AudioMediaStreamTrack. If there is no valid media stream, the output will be a single silent channel.

Number of inputs: 0
Number of outputs: 1
Channel count: defined by the AudioMediaStreamTrack passed to the AudioContext.createMediaStreamSource method that created it
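
For instance, a source node can be created from a microphone stream and wired straight to the speakers. This is a minimal sketch, assuming a browser that exposes an unprefixed AudioContext and navigator.getUserMedia:

var audioCtx = new AudioContext();

navigator.getUserMedia(
   // request an audio-only stream
   { audio: true },
   function(stream) {
      // wrap the stream's audio track(s) in a MediaStreamAudioSourceNode
      var source = audioCtx.createMediaStreamSource(stream);
      // connect the source directly to the default output
      source.connect(audioCtx.destination);
   },
   function(err) {
      console.log('Could not get an audio stream: ' + err);
   }
);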

Properties

Inherits properties from its parent, AudioNode.

Methods

Inherits methods from its parent, AudioNode.

Example

In this example, we grab a media (audio + video) stream from navigator.getUserMedia, feed the media into a <video> element to play it, and mute the element's audio. We then also feed the stream's audio into a MediaStreamAudioSourceNode. Next, we feed this source audio into a lowshelf BiquadFilterNode (which effectively serves as a bass booster), and then into the AudioDestinationNode.

The range slider below the <video> element controls the amount of gain given to the lowshelf filter; increase the value of the slider to make the audio sound more bass heavy!

Note: You can see this example running live, or view the source.

// fork getUserMedia for multiple browser versions, for those
// that need prefixes

navigator.getUserMedia = (navigator.getUserMedia ||
                          navigator.webkitGetUserMedia ||
                          navigator.mozGetUserMedia ||
                          navigator.msGetUserMedia);

// define other variables

var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var myAudio = document.querySelector('audio');
var pre = document.querySelector('pre');
var video = document.querySelector('video');
var myScript = document.querySelector('script');
var range = document.querySelector('input');


// getUserMedia block - grab stream
// put it into a MediaStreamAudioSourceNode
// also output the visuals into a video element

if (navigator.getUserMedia) {
   console.log('getUserMedia supported.');
   navigator.getUserMedia (
      // constraints: audio and video for this app
      {
         audio: true,
         video: true
      },

      // Success callback
      function(stream) {
         video.src = (window.URL && window.URL.createObjectURL(stream)) || stream;
         video.onloadedmetadata = function(e) {
            video.play();
            video.muted = true;
         };

         // Create a MediaStreamAudioSourceNode
         // Feed the MediaStream from getUserMedia into it
         var source = audioCtx.createMediaStreamSource(stream);

          // Create a biquadfilter
          var biquadFilter = audioCtx.createBiquadFilter();
          biquadFilter.type = "lowshelf";
          biquadFilter.frequency.value = 1000;
          biquadFilter.gain.value = range.value;

          // connect the MediaStreamAudioSourceNode to the lowshelf filter
          // and the filter to the destination, so we can hear the stream
          // and adjust the bass boost using the range slider
          source.connect(biquadFilter);
          biquadFilter.connect(audioCtx.destination);

          // Update the filter's gain whenever the slider value changes

          range.oninput = function() {
              biquadFilter.gain.value = range.value;
          };

      },

      // Error callback
      function(err) {
         console.log('The following gUM error occurred: ' + err);
      }
   );
} else {
   console.log('getUserMedia not supported on your browser!');
}

// dump script to pre element

pre.innerHTML = myScript.innerHTML;
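
The script expects a few elements to already be present in the page: the <video>, the range <input>, the <pre>, and an inline <script>. A markup sketch along these lines would work (the attribute values here are assumptions, not the original page's markup):

<video controls></video>
<input type="range" min="0" max="25" value="0" step="1">
<pre></pre>
<script>
  // ... the JavaScript shown above ...
</script>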

Note: As a consequence of calling createMediaStreamSource(), audio playback from the media stream will be re-routed into the processing graph of the AudioContext. So playing/pausing the stream can still be done through the media element API and the player controls.

 

Specification

Specification: Web Audio API (the definition of 'MediaStreamAudioSourceNode' in that specification)
Status: Working Draft

Browser compatibility

Desktop:
  Chrome: 14 (webkit prefixed)
  Firefox (Gecko): 25 (25)
  Internet Explorer: Not supported
  Opera: 15 (webkit prefixed), 22 (unprefixed)
  Safari (WebKit): 6 (webkit prefixed)

Mobile:
  Android: Not supported
  Chrome: 28 (webkit prefixed)
  Firefox Mobile (Gecko): 25.0 (25)
  Firefox OS: 1.2
  IE Phone: Not supported
  Opera Mobile: Not supported
  Safari Mobile: Supported (webkit prefixed)

