In this demonstration, we build upon the previous example by replacing our static textures with the frames of a playing Ogg video file. This is actually pretty easy to do, but is fun to look at, so let's get started. Similar code can be used with any sort of data (such as a <canvas>) as the source for your textures.
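As a sketch of that idea (the helper name uploadCanvasFrame, and the assumption that a WebGL context and texture already exist, are ours, not part of the example code):

```javascript
// Sketch: uploading the current contents of a <canvas> as a texture.
// Assumes a WebGL context `gl` and a texture object `tex` already exist;
// `uploadCanvasFrame` is a hypothetical helper name.
function uploadCanvasFrame(gl, tex, canvasEl) {
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // texImage2D accepts a <canvas> (like an Image or <video>) directly
  // as its pixel source.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, canvasEl);
}
```

You would call this each frame, just as the video version below calls updateTexture().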
Getting access to the video
The first step is to add the HTML to create the <video> element that we'll use to retrieve the video frames:
<video id="video">
  Your browser doesn't appear to support the HTML5 <video> element.
</video>
This simply creates the element that will play the video file "Firefox.ogv" (its src attribute is set later, from JavaScript). We use CSS to keep this video element from being drawn:
video { display: none; }
Then we turn our attention to the JavaScript code, beginning by adding a line to the start() function to fetch a reference to the video element:
videoElement = document.getElementById("video");
And we replace the code that set up the interval-driven calls to drawScene() with this:

videoElement.addEventListener("canplaythrough", startVideo, true);
videoElement.addEventListener("ended", videoDone, true);
And finally we set the src attribute to start loading the video. Note that we also have to set preload to "auto"; without it, Firefox never fires the canplaythrough event (Chrome loads the video regardless of preload):

videoElement.preload = "auto";
videoElement.src = "Firefox.ogv";
The idea here is that we don't want to start the animation until enough of the video has been buffered that it can be played without interruption. So we add an event listener that waits for the video element to report that it has buffered enough data to play the entire video without pausing.
The startVideo() function looks like this:

function startVideo() {
  videoElement.play();
  intervalID = setInterval(drawScene, 15);
}
This simply starts playing the video, then establishes the interval-driven calls to drawScene() to handle rendering the cube.
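As an aside, modern code would typically drive the animation with requestAnimationFrame rather than a 15 ms interval. A minimal sketch of that variation (our alternative, not what this example does; the parameter names are assumptions):

```javascript
// Sketch: a requestAnimationFrame-driven render loop as an alternative to setInterval.
// `raf` is the scheduling function (window.requestAnimationFrame in a browser),
// `drawScene` renders one frame, and `isDone` reports whether playback has ended.
function makeRenderLoop(raf, drawScene, isDone) {
  function tick() {
    if (isDone()) {
      return; // stop scheduling once the video has ended
    }
    drawScene();
    raf(tick); // schedule the next frame
  }
  return tick;
}
```

In a browser, startVideo() could call makeRenderLoop(cb => requestAnimationFrame(cb), drawScene, () => videoElement.ended)() instead of setInterval(), which also removes the need for the "ended" listener below.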
We also add a second event listener on the video's "ended" event so that when the video is finished playing, we can stop the animation, since otherwise it's just chewing up processor time for no good reason.
function videoDone() {
  clearInterval(intervalID);
}
The videoDone() function simply calls window.clearInterval() to end the periodic calls that update the animation.
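If the page keeps running after playback, a fuller cleanup would also detach the event listeners. A sketch of that idea (our extension, not part of the example; the clearInterval function and handlers are passed in so the logic is easy to exercise outside a browser):

```javascript
// Sketch: cleanup that clears the interval and removes the listeners.
// `videoEl` is the <video> element, `intervalID` the id returned by setInterval,
// and `handlers` holds the same functions passed to addEventListener earlier.
function stopAnimation(videoEl, intervalID, clearIntervalFn, handlers) {
  clearIntervalFn(intervalID);
  // The capture flag (true) must match the one used in addEventListener.
  videoEl.removeEventListener("canplaythrough", handlers.startVideo, true);
  videoEl.removeEventListener("ended", handlers.videoDone, true);
}
```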
Using the video frames as a texture
The next change is to initTextures(), which becomes much simpler, since it no longer needs to load an image file. Instead, all it does is create an empty texture object and set its filtering for later use:

function initTextures() {
  cubeTexture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, cubeTexture);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
}
This is what the updateTexture() function looks like; this is where the real work is done:

function updateTexture() {
  gl.bindTexture(gl.TEXTURE_2D, cubeTexture);
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, videoElement);
}
You've seen this code before. It's nearly identical to the handleTextureLoaded() routine in the previous example, except that when we call texImage2D(), instead of passing an Image object, we pass in the <video> element. WebGL knows how to pull the current frame out and use it as a texture.
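One practical caveat (our addition, not covered by the example): very early in loading, the video may not yet have a decodable frame, so robust code often checks readyState before uploading. A sketch with a hypothetical helper name:

```javascript
// Sketch: only upload a frame once the video actually has current data.
// readyState >= 2 (HTMLMediaElement.HAVE_CURRENT_DATA) means the current
// frame is available for the current playback position.
function updateTextureIfReady(gl, tex, videoEl) {
  if (videoEl.readyState < 2 /* HAVE_CURRENT_DATA */) {
    return false; // nothing decodable yet; keep the previous texture contents
  }
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, videoEl);
  return true;
}
```

In this demo the canplaythrough gate makes the check redundant, but it matters if you start drawing before the event fires.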
updateTexture() is called each time we're ready to redraw our scene, by the drawScene() function; the only change to drawScene() is adding a call to updateTexture() before doing anything else.
That's all there is to it!
View the complete code | Open this demo on a new page