
WebVR — Virtual Reality for the Web

The concept of virtual reality in itself isn't new, but now we have the technology to make it work as it should, and a JavaScript API to make use of it in web applications. This article introduces WebVR from the perspective of its use in games.

VR devices

With the popularity of the Oculus Rift and many other devices in production coming to market soon, the future looks bright: we already have sufficient technology to make the VR experience "good enough" for playing games. There are many devices to choose from: desktop ones like the Oculus Rift or HTC Vive, consoles with PlayStation VR, and mobile experiences like Gear VR or Google Cardboard.

Note: For more information, read our WebVR Concepts article.

The WebVR API

The WebVR API is the central API for capturing information on VR devices connected to a computer, including headset position, orientation, velocity, and acceleration, and for converting that into useful data you can use in your games and other applications.

Note: There are of course other APIs useful for creating games, for example the Gamepad API for control inputs, and the Device Orientation API for handling display orientation on mobile.
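For instance, a rough sketch of reading both of those inputs might look like this (handleFireButton() and updateViewFromOrientation() stand in for whatever your game actually does with the data):

// Poll the connected gamepads once per frame; entries can be null
// for controller slots that are currently empty.
function pollGamepads() {
  var gamepads = navigator.getGamepads();
  for (var i = 0; i < gamepads.length; i++) {
    var gp = gamepads[i];
    if (gp && gp.buttons[0] && gp.buttons[0].pressed) {
      handleFireButton(); // hypothetical game action
    }
  }
}

// On mobile, the Device Orientation API reports how the device is being held.
window.addEventListener('deviceorientation', function(event) {
  // alpha, beta and gamma are rotation angles in degrees
  updateViewFromOrientation(event.alpha, event.beta, event.gamma); // hypothetical
});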

Browser support and spec status

Currently browser support for the WebVR API is still experimental — it works in nightly builds of Firefox and experimental builds of Chrome (Mozilla and Google teamed up to work on the implementation together), but sooner rather than later we'll see it in regular builds.

The WebVR spec is in Editor's Draft status which means it is still subject to change. The development is led by Vladimir Vukicevic from Mozilla and Brandon Jones from Google. For more info be sure to visit the MozVR.com and WebVR.info websites.

Using the WebVR API

The WebVR API is based on two concepts: sending stereoscopic images to both lenses in your headset, and receiving positional data for your head from the sensor. These two tasks are handled by HMDVRDevice (head-mounted display virtual reality device) and PositionSensorVRDevice respectively.

Get the devices

To get information about devices connected to your computer, you can use the Navigator.getVRDevices method:

var gHMD, gPositionSensor;

navigator.getVRDevices().then(function(devices) {
  // First, find the head-mounted display among the connected devices
  for (var i = 0; i < devices.length; ++i) {
    if (devices[i] instanceof HMDVRDevice) {
      gHMD = devices[i];
      break;
    }
  }

  // Then find the position sensor belonging to the same hardware unit
  if (gHMD) {
    for (var i = 0; i < devices.length; ++i) {
      if (devices[i] instanceof PositionSensorVRDevice
          && devices[i].hardwareUnitId === gHMD.hardwareUnitId) {
        gPositionSensor = devices[i];
        break;
      }
    }
  }
});

This code loops through the available devices and assigns the proper sensor to the headset: the devices array contains the connected devices, and a check is done to find the HMDVRDevice and assign it to the gHMD variable. Using this you can set up the scene, get the eye parameters, set the field of view, and so on. For example:

function setCustomFOV(up, right, down, left) {
  var testFOV = new VRFieldOfView(up, right, down, left);

  // Apply the same field of view to both eyes, with near and far clip planes
  gHMD.setFieldOfView(testFOV, testFOV, 0.01, 10000.0);
}
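You can also read the recommended rendering parameters for each eye via gHMD.getEyeParameters(). A minimal sketch, assuming gHMD has been set as shown above:

function logEyeParams() {
  // getEyeParameters() returns a VREyeParameters object for the requested eye
  var leftEye = gHMD.getEyeParameters('left');
  var rightEye = gHMD.getEyeParameters('right');

  // recommendedFieldOfView and renderRect describe how each eye's view
  // should be rendered
  console.log(leftEye.recommendedFieldOfView, leftEye.renderRect);
  console.log(rightEye.recommendedFieldOfView, rightEye.renderRect);
}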

The gPositionSensor variable holds the PositionSensorVRDevice. Using this you can get the current position or orientation state (for example to update the scene view on every frame), or reset the sensor. For example, the code below outputs position information on the screen:

function setView() {
  var posState = gPositionSensor.getState();

  // Only use the data if the sensor is actually reporting a position
  if (posState.hasPosition) {
    posPara.textContent = 'Position: x' + roundToTwo(posState.position.x) + " y"
                                + roundToTwo(posState.position.y) + " z"
                                + roundToTwo(posState.position.z);

    // Convert the reported position into scene coordinates;
    // clamp z so the view never goes behind the near plane
    xPos = -posState.position.x * WIDTH * 2;
    yPos = posState.position.y * HEIGHT * 2;
    if (-posState.position.z > 0.01) {
      zPos = -posState.position.z;
    } else {
      zPos = 0.01;
    }
  }

  ...

}
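The same state object also reports the headset's orientation, and the sensor can be reset so that its current pose becomes the new origin. A minimal sketch along the same lines (orientPara is a hypothetical output element, like posPara above):

function setOrientation() {
  var posState = gPositionSensor.getState();

  // Orientation is reported as a quaternion (x, y, z, w)
  if (posState.hasOrientation) {
    orientPara.textContent = 'Orientation: x' + roundToTwo(posState.orientation.x) + " y"
                                   + roundToTwo(posState.orientation.y) + " z"
                                   + roundToTwo(posState.orientation.z);
  }
}

function resetSensor() {
  // Treat the headset's current position and orientation as the new zero point
  gPositionSensor.resetSensor();
}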

For a full explanation and more details of the demo this is taken from, see Using the WebVR API.

Tools and techniques

The first WebVR experiments and demos used Three.js, as it's probably the most popular 3D engine for the web. See the VREffect and VRControls helpers available in the Three.js GitHub repository to help you implement and handle WebVR with Three.js.
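As a rough idea of how these pieces fit together (a sketch, assuming you already have a standard Three.js scene, camera, and renderer set up):

// VRControls keeps the camera in sync with the headset's position sensor
var controls = new THREE.VRControls(camera);

// VREffect renders the scene twice, once per eye, side by side
var effect = new THREE.VREffect(renderer);
effect.setSize(window.innerWidth, window.innerHeight);

function animate() {
  requestAnimationFrame(animate);
  controls.update();            // pull the latest headset pose into the camera
  effect.render(scene, camera); // stereo render instead of renderer.render()
}
animate();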

Boris Smus has written about the concept of Responsive WebVR, where a single web game can be played on various devices like laptops without VR hardware, PCs with Oculus Rift, or smartphones inside Google Cardboard and still deliver a unique and valuable experience across all of them. It's like responsive design, but applied to the VR world — write once and run in any VR headset ... or without it. You can check the WebVR Boilerplate sources — it's a good example to start learning WebVR from, and a starting point for any web-based VR experience.
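In practice this usually starts with simple feature detection, falling back to ordinary mono rendering when no VR hardware is available. A minimal sketch (initVRScene() and initRegularScene() are hypothetical setup functions):

if (navigator.getVRDevices) {
  navigator.getVRDevices().then(function(devices) {
    if (devices.length > 0) {
      initVRScene(devices);  // hypothetical: stereo rendering plus head tracking
    } else {
      initRegularScene();    // hypothetical: normal rendering, no headset found
    }
  });
} else {
  initRegularScene();        // the browser has no WebVR support at all
}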

There's also a markup framework called A-Frame that offers simple building blocks for WebVR, so you can rapidly build and experiment with VR websites and games: read the Building up a basic demo with A-Frame tutorial for more details.

Immersion is more important than gameplay or graphics: you have to feel that you're "inside" the experience. That isn't easy to achieve, but it doesn't require realistic visuals. Quite the contrary: even basic shapes flying around at a high framerate can be very effective. Remember that experimenting is key; see what works for your particular game.

The future of WebVR

It's happening — consumer devices are reaching the market right now, and we already have JavaScript APIs to support them on the Web. All we need now is a stable specification, good UX and UI, better hardware and more tools and libraries — and all of that is on the horizon. It's the perfect time to dive in and experiment with WebVR.

