
We as a society are on the road to mastering a powerful new medium. Even by just dipping your toes into virtual reality, you can easily find vast unexplored territory and a wealth of potential in this next stage of human communication.

As with movies around the turn of the 20th century, there is a mix of elements borrowed from older media, combined with wild experimentation. Thanks to accelerating progress, it will not take too long to home in on the unique essence of immersive virtual worlds. Yet we know there are still so many untapped opportunities around creating, capturing, and delivering these worlds and shaping the experiences inside them.

Chris Milk is a pioneer in both the content and technology aspects of virtual reality filmmaking. He produced a short film, “Evolution of Verse” (available, along with many others, through his company’s app Within), that tells the story of the emergence of the VR medium in abstract form. It bravely tests the limits of immersive art, serves as an astounding introductory illustration of the potential of visual effects in virtual reality, and contains an exhilarating homage to humanity’s first experience of the movies.

On the Web—the platform that is the most open, connected, and accessible for both creators and audiences—we now have the opportunity to carry our creations past the limitations imposed by the flat screen and the norms of the platform. We can bring along the things that work well on a flat screen, and we will come to rethink them as we experiment with the audience’s newfound ability to convincingly perceive our work in the first person.

What is Virtual Reality?

First, a quick survey of consumer head-mounted displays in rough order of increasing power and price:

  • Mobile: Google Cardboard, Samsung Gear VR, Google Daydream (coming soon)
  • Tethered: Sony PlayStation VR (coming soon), Oculus Rift, HTC Vive

It would be helpful to analyze the medium of virtual reality in terms of its various immersive facets:

  • Head-tracking: Most crucially, the angle of your head is mapped in real time to that of the virtual camera. A major hallmark of our visual experience of physical reality is turning our heads to look around or to face something. This capability is leveraged by the category of 360° videos (the name evokes looking around in a circle, but generally you can look up and down as well); see the sketch after this list. This is a radical departure in cinematography, as directing attention via a rectangular frame is no longer an option.
  • Stereoscopy: Seeing depth as a third spatial dimension is a major sensory advantage for perceiving complex objects and local surroundings. Though there is a trade-off between depth perception and perspective distortion, 3D undeniably contributes a sense of presence, and therefore a strong element of immersion. 3D can be compared with stereo sound, which was likewise regarded for decades as a novelty and a gimmick before achieving ubiquity in the 1960s (on that note, positional audio is another significant factor in delivering an immersive experience).
  • Isolation: Blocking out ambient light, noise, and distractions that would otherwise dilute the visual experience, akin to noise-isolating or noise-canceling headphones.
  • Motion tracking: Enables so-called “room-scale VR,” which allows you to move through virtual environments and around virtual objects. This can greatly heighten the fidelity of the experience, and it comes with some interesting challenges. The capability is currently available only on the HTC Vive, but we will soon see it on mobile, driven by Google’s Project Tango.
  • Button: Works like a mouse click, in combination with a cursor fixed at the center of your field of view (see the sketch after this list).
  • Motion-tracked hand controllers: Again, currently a feature of the HTC Vive only, but Oculus and Google’s Daydream will be coming out with controllers, as will PlayStation VR using PlayStation Move controllers. Even fairly basic applications of these controllers, like Tilt Brush, have immense appeal.
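
Two of these facets translate directly into web code. Here is a minimal sketch using A-Frame (a library introduced in the next section) of a 360° video you can look around in, with a gaze cursor that turns head aim into clicks; the video path is a placeholder:

```html
<script src="https://aframe.io/releases/0.3.0/aframe.min.js"></script>
<a-scene>
  <a-assets>
    <!-- Placeholder path for an equirectangular 360° video -->
    <video id="sky" src="video/pano.mp4" autoplay loop></video>
  </a-assets>
  <!-- Head-tracking: the camera follows your head inside the videosphere -->
  <a-videosphere src="#sky"></a-videosphere>
  <!-- Button: a cursor fixed at the center of view emits click events -->
  <a-camera>
    <a-cursor></a-cursor>
  </a-camera>
</a-scene>
```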

Immersive Graphics on the Web

One sequence of “Evolution of Verse” is reminiscent of one of my favorite THREE.js demos: flocking birds. In pursuit of advanced hardware acceleration, that demo uses shaders to support the real-time navigation and animation of thousands of birds (“boids”) at once.
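
The birds demo runs the full flocking simulation on the GPU, which is beyond a short excerpt; as a much smaller illustration of the underlying technique (moving per-object animation into a vertex shader, so thousands of objects cost almost nothing per frame), here is a sketch that animates ten thousand points with a custom shader:

```html
<script src="https://cdn.jsdelivr.net/npm/three@0.110.0/build/three.min.js"></script>
<script>
// A sketch of shader-driven animation: each point's motion is computed
// on the GPU in the vertex shader; the CPU only updates one uniform.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
camera.position.z = 50;
var renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Scatter 10,000 points in a cube
var COUNT = 10000;
var positions = new Float32Array(COUNT * 3);
for (var i = 0; i < positions.length; i++) {
  positions[i] = (Math.random() - 0.5) * 60;
}
var geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));

var material = new THREE.ShaderMaterial({
  uniforms: {time: {value: 0}},
  vertexShader: [
    'uniform float time;',
    'void main() {',
    '  vec3 p = position;',
    '  p.y += 5.0 * sin(time + position.x * 0.2);  // per-vertex wave motion',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);',
    '  gl_PointSize = 2.0;',
    '}'
  ].join('\n'),
  fragmentShader: 'void main() { gl_FragColor = vec4(1.0); }'
});
scene.add(new THREE.Points(geometry, material));

function animate(t) {
  material.uniforms.time.value = t * 0.001;
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);
</script>
```

The real demo replaces the sine wave with a boid simulation evaluated in shaders, but the division of labor is the same: the geometry stays on the GPU, and the CPU issues a single draw call per frame.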

A-Frame is a high-level wrapper around THREE.js that provides a simple, structured interface to immersive 3D graphics. An advanced feature of A-Frame materials lets you register shaders (low-level drawing subroutines) and attach the resulting materials to entities. Aside from the material, any number of other components can be added to the entity (lights, sounds, cameras, etc.), perhaps including a custom component (which is simple to write) that encapsulates the boid navigation logic.
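
A sketch of how the two registration APIs fit together; the shader and component names here are hypothetical, the shader just paints a flat color, and the component stands in for real steering logic:

```html
<script>
// Hypothetical shader: paints the entity a flat, configurable color.
AFRAME.registerShader('flat-tint', {
  schema: {
    color: {type: 'color', is: 'uniform', default: 'red'}
  },
  vertexShader: [
    'void main() {',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);',
    '}'
  ].join('\n'),
  fragmentShader: [
    'uniform vec3 color;',
    'void main() { gl_FragColor = vec4(color, 1.0); }'
  ].join('\n')
});

// Hypothetical component: tick runs every frame, where boid steering would go.
AFRAME.registerComponent('boid', {
  tick: function (time, delta) {
    this.el.object3D.rotation.y += delta * 0.001;  // placeholder motion
  }
});
</script>

<a-scene>
  <a-entity geometry="primitive: box"
            material="shader: flat-tint; color: #37a"
            boid></a-entity>
</a-scene>
```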

A-Frame has great support for importing 3D objects and scenes (downloaded from Sketchfab or clara.io, for instance) using the obj-model and collada-model components. An Asset Management System is also included, for caching and preprocessing assets such as models, textures, images, and videos.
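
Loading an OBJ model through the asset system looks like this (the file paths are placeholders):

```html
<a-scene>
  <a-assets>
    <!-- Placeholder paths; assets are fetched and cached up front -->
    <a-asset-item id="tree-obj" src="models/tree.obj"></a-asset-item>
    <a-asset-item id="tree-mtl" src="models/tree.mtl"></a-asset-item>
  </a-assets>
  <a-entity obj-model="obj: #tree-obj; mtl: #tree-mtl" position="0 0 -5"></a-entity>
</a-scene>
```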

In the future it will also support the up-and-coming glTF standard, a runtime format for objects and scenes—comparable to the PNG format, but for 3D content (and with support for animation). For now this lives as an external component, one of many external resources available in the large A-Frame ecosystem.
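
Once that component's script is included, usage should mirror the other model components; a sketch, assuming a gltf-model component name and a placeholder model path:

```html
<a-scene>
  <a-assets>
    <a-asset-item id="robot" src="models/robot.gltf"></a-asset-item>
  </a-assets>
  <!-- gltf-model is supplied by the external component for now -->
  <a-entity gltf-model="#robot"></a-entity>
</a-scene>
```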

From flocks of birds to the many other techniques explored and validated by the WebGL community, immersive cinematic storytelling on the web has a bright future ahead. During the filming of “The Birds,” Alfred Hitchcock found it necessary to insist on literally immersing his lead actress (and surrogate for the audience) in flocks of predatory birds. More harmlessly, yet with similar dramatic ambition, creators of web experiences will insist on staging their work to take full advantage of the new paradigm of simulated reality, and it is no longer too early to get started.

Image credits: left, cutestockfootage.com; right, NYT VR app.

About the author

Paul Dechov is a visualization engineer at TWO-N.
