
Last week, Reinder Nijhoff, a computer vision researcher, created a project that does real-time ray tracing in WebGL1. The demo was inspired by Metro's real-time global illumination.

The demo uses a hybrid rendering engine built on WebGL1. It renders all the polygons in a frame with traditional rasterization and then combines the result with ray-traced shadows, diffuse global illumination (GI), and reflections.

Credits: Reinder Nijhoff

What is ray tracing?

In computer graphics, ray tracing is a technique for rendering 3D scenes with complex light interactions. An algorithm traces the path of light rays and simulates how they interact with the virtual objects they encounter.

There are three ways light interacts with the virtual objects:

  • It can be reflected from one object to another, causing reflections.
  • It can be blocked by objects, causing shadows.
  • It can pass through transparent or semi-transparent objects, causing refraction.

All these interactions are then combined to determine the final color of a pixel.
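The core idea can be sketched in a few lines of JavaScript. This is a minimal illustration of the technique, not the demo's code: the scene values (a single sphere, a blocking sphere, and a light position) are made up, and only shadow rays are shown; reflection and refraction rays would be traced recursively in the same way.

```javascript
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const normalize = (v) => {
  const len = Math.sqrt(dot(v, v));
  return [v[0] / len, v[1] / len, v[2] / len];
};

// Ray-sphere intersection: distance t along the ray, or Infinity on a miss.
function intersectSphere(origin, dir, center, radius) {
  const oc = sub(origin, center);
  const b = dot(oc, dir);
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - c;
  if (disc < 0) return Infinity;
  const t = -b - Math.sqrt(disc);
  return t > 1e-4 ? t : Infinity;
}

// Shade one pixel: trace the primary ray, then cast a shadow ray toward
// the light; a blocked shadow ray leaves only a small ambient term.
function shadePixel(origin, dir, sphere, blocker, light) {
  const t = intersectSphere(origin, dir, sphere.center, sphere.radius);
  if (t === Infinity) return 0.0; // ray missed everything: background
  const hit = [origin[0] + dir[0] * t, origin[1] + dir[1] * t, origin[2] + dir[2] * t];
  const n = normalize(sub(hit, sphere.center));
  const toLight = normalize(sub(light, hit));
  // Shadow ray: is anything between the hit point and the light?
  const blocked = intersectSphere(hit, toLight, blocker.center, blocker.radius) !== Infinity;
  if (blocked) return 0.05; // in shadow: ambient term only
  return Math.max(0, dot(n, toLight)); // Lambertian direct lighting
}
```

A real renderer runs this per pixel, tests every object in the scene rather than one hard-coded blocker, and returns an RGB color instead of a single brightness value.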

Ray tracing has traditionally been used for offline rendering because of its ability to accurately model the physical behavior of light in the real world. Due to its computationally intensive nature, it was rarely the first choice for real-time rendering. This changed with the introduction of Nvidia's RTX graphics cards, which add custom acceleration hardware and make real-time ray tracing practical.

What was this demo about?

The project’s prototype was based on a forward renderer that first draws all the geometry in the scene. Next, the shader used to rasterize the geometry (convert it into pixels on screen) calculates the direct lighting. The same shader also casts random rays from the surface of the rendered geometry to gather the indirect light reflected by non-shiny surfaces, using a ray tracer.
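The indirect-light gathering step can be sketched as a Monte Carlo estimate. This is an assumed illustration in JavaScript, not the author's shader code: cast random cosine-weighted rays over the hemisphere around the surface normal and average the light they bring back.

```javascript
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const cross = (a, b) => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];
const normalize = (v) => {
  const len = Math.sqrt(dot(v, v));
  return [v[0] / len, v[1] / len, v[2] / len];
};

// Cosine-weighted random direction in the hemisphere around unit normal n.
function randomHemisphereDir(n, rand = Math.random) {
  const phi = 2 * Math.PI * rand();
  const r2 = rand();
  const r = Math.sqrt(r2);
  const z = Math.sqrt(1 - r2);
  // Build an orthonormal tangent frame (t, b, n) around the normal.
  const up = Math.abs(n[0]) > 0.9 ? [0, 1, 0] : [1, 0, 0];
  const t = normalize(cross(up, n));
  const b = cross(n, t);
  return [
    t[0] * r * Math.cos(phi) + b[0] * r * Math.sin(phi) + n[0] * z,
    t[1] * r * Math.cos(phi) + b[1] * r * Math.sin(phi) + n[1] * z,
    t[2] * r * Math.cos(phi) + b[2] * r * Math.sin(phi) + n[2] * z,
  ];
}

// One-bounce indirect estimate: average the radiance arriving along N
// random hemisphere rays (the cosine term is folded into the sampling).
function indirectLight(point, normal, sampleRadiance, samples = 16) {
  let sum = 0;
  for (let i = 0; i < samples; i++) {
    sum += sampleRadiance(point, randomHemisphereDir(normal));
  }
  return sum / samples;
}
```

Here `sampleRadiance` is a hypothetical scene query that traces the given ray and returns the light arriving from that direction; in the demo this is where the ray tracer comes in.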

The author started with a very simple scene for the prototype that included a single light and rendered only a few spheres and cubes. This kept the ray tracing code fairly straightforward. Once the prototype was complete, he wanted to take it to the next level by adding more geometry and many more lights to the scene.

Despite the complexity of the environment, Nijhoff wanted to ray trace the scene in real time. Generally, a bounding volume hierarchy (BVH) is used as an acceleration structure to speed up ray tracing. However, it is difficult to pre-calculate and traverse a BVH in WebGL1 shaders. This is why Nijhoff decided to use a Wolfenstein 3D level for this demo: its map is laid out on a regular 2D grid, which a shader can traverse directly without a pre-built acceleration structure.
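A regular grid keeps the shader-side traversal simple. The sketch below is an assumed JavaScript illustration of the idea, not the demo's GLSL: it steps a ray cell-by-cell through a 2D wall grid with a DDA walk, the style of traversal a Wolfenstein-like level allows.

```javascript
// March a ray through a 2D grid of cells (1 = wall, 0 = empty) using a
// DDA walk: repeatedly step to whichever cell boundary the ray crosses next.
function traceGrid(grid, ox, oy, dx, dy, maxSteps = 64) {
  let cx = Math.floor(ox);
  let cy = Math.floor(oy);
  const stepX = dx > 0 ? 1 : -1;
  const stepY = dy > 0 ? 1 : -1;
  // Ray distance to the first vertical / horizontal cell boundary...
  let tMaxX = dx !== 0 ? (dx > 0 ? cx + 1 - ox : ox - cx) / Math.abs(dx) : Infinity;
  let tMaxY = dy !== 0 ? (dy > 0 ? cy + 1 - oy : oy - cy) / Math.abs(dy) : Infinity;
  // ...and the distance between successive boundaries along each axis.
  const tDeltaX = dx !== 0 ? 1 / Math.abs(dx) : Infinity;
  const tDeltaY = dy !== 0 ? 1 / Math.abs(dy) : Infinity;
  for (let i = 0; i < maxSteps; i++) {
    if (grid[cy] && grid[cy][cx] === 1) return { hit: true, cx, cy }; // wall cell
    if (tMaxX < tMaxY) { cx += stepX; tMaxX += tDeltaX; }
    else { cy += stepY; tMaxY += tDeltaY; }
  }
  return { hit: false }; // left the grid or ran out of steps
}
```

Unlike a BVH, this needs no precomputation beyond storing the level map itself, which is easy to upload to a WebGL1 shader as a texture.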

To know more in detail, check out the original post shared by Reinder Nijhoff.

Read Next

Unity switches to WebAssembly as the output format for the Unity WebGL build target

NVIDIA shows off GeForce RTX, real-time raytracing GPUs, as the holy grail of computer graphics to gamers

Introducing SCRIPT-8, an 8-bit JavaScript-based fantasy computer to make retro-looking games
