
Ray tracing was first brought into the world of gaming by Nvidia's RTX 20-series graphics cards in 2018. Since then we have seen the release of numerous RTX (and even some GTX) cards that support the technique. Championed as one of the greatest advances in graphics technology history, ray tracing at its core is a method of accurately mimicking real-world lighting within a computer-generated environment.
It's been used to replicate the effects of light and shadow in movies and TV for years now. Yet, despite having been around for such a long time, it still takes a huge amount of computing power to carry out. Ray tracing is currently used in PC games too, but only in limited ways. We have yet to see a fully ray-traced game.
However, this is all about to change, as ray tracing is set to be at the forefront of the next-gen gaming experience. Recently the term has become something of a buzzword and there are a lot of questions flying around. Gamers want to know what this technology is and what it means for the future of gaming, and we're here to tell you…
So What is Ray Tracing?

Over the years we've seen many improvements in graphics and game lighting effects. We've come a long way from the 230 polygons that made up Lara Croft in the first Tomb Raider. We're now at a stage where a single character can use 150,000 polygons or more without much impact on the hardware.
Ray tracing is set to take the advancement of video game graphics much further. In a nutshell, it's an advanced technique used to render realistic light. Although it's less about rendering the light itself and more about mimicking the way light interacts with objects in the real world. The technique was first conceptualised in 1969 and is widely used in the development of CGI for movies and TV. It is much easier to apply the technique in those types of media than in gaming (more on that later), which is why it has taken so long to make its way to the gaming world.
At its core, ray tracing is an algorithm that traces the path of light and simulates its behaviour as it interacts with computer-generated objects in a virtual world. Using this technique, game designers can deliver a dramatic improvement in lifelike shadows and reflections, and also allow for realistic translucence and light-scattering effects.
How Does it Work?

In real life, a beam of light is made up of countless tiny energy particles called photons. They travel from their point of origin until they interact with an object, at which point their paths are determined by that object's properties: dark objects absorb the light, while lighter, shinier objects reflect it. These photons then reach your eye, allowing your brain to interpret the picture in front of you.
Ray tracing works in much the same way, except the process is carried out backwards. The traced beam begins at the viewing point and traces a path towards the light source, simulating how the objects along the way would affect the light beam and vice versa.
The reason the technique simulates vision backwards is that it's a far more efficient use of computational power than tracing the rays outward from the light source. It ensures that the computer isn't wasting its processing power on objects that are out of the camera's view. This makes a huge difference, as it would take much more power to render all of the light rays emitted from every light source in a given scene.
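To make the idea concrete, here's a minimal sketch of backward ray tracing in Python: one ray is fired from the camera through each pixel, tested against a single sphere, and shaded by how directly the surface faces an overhead light. The scene, the camera position, and the light direction are all made up for illustration; a real renderer would trace millions of rays against far more complex geometry.

```python
import math

def sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (assumes direction is unit length, so the quadratic's 'a' term is 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # the ray never touches the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(x, y, width, height):
    """Fire one ray from the camera through pixel (x, y); return brightness."""
    # Camera sits at the origin looking down -z; a unit sphere floats ahead.
    direction = [(x - width / 2) / width, (y - height / 2) / height, -1.0]
    norm = math.sqrt(sum(d * d for d in direction))
    direction = [d / norm for d in direction]
    t = sphere_hit([0, 0, 0], direction, [0, 0, -3], 1.0)
    if t is None:
        return 0.0  # ray escaped the scene: background stays dark
    # Shade by the angle between the surface normal and a light from above.
    hit = [t * d for d in direction]
    normal = [h - c for h, c in zip(hit, [0, 0, -3])]
    n_len = math.sqrt(sum(n * n for n in normal))
    normal = [n / n_len for n in normal]
    light = [0, 1, 0]  # directional light pointing up (an assumption)
    return max(0.0, sum(n * l for n, l in zip(normal, light)))

# "Render" a tiny 8x8 image as a grid of brightness values.
image = [[trace(x, y, 8, 8) for x in range(8)] for y in range(8)]
```

Notice the key efficiency point from above: only rays that start at the camera are ever computed, so no work is spent on light that would never reach the viewer.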
But it's still not that simple. We're talking about billions of photons entering the eye every second, which is more calculations than any computer can handle in real time. Measures therefore have to be taken to keep efficiency at an optimum level while still rendering a realistic image.
So, instead of mapping out every single ray of light, the solution the developers at Nvidia came up with was to trace only the most important rays. Then, through a process called "denoising", machine learning algorithms are used to fill in the gaps and smooth everything out.
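The gap-filling step above can be illustrated with a toy example. In this sketch, pixels that were never traced are marked `None`, and each one is filled with the average of its traced neighbours. This simple averaging filter is just a stand-in for illustration: Nvidia's actual denoiser is a trained machine learning model, not a neighbourhood average.

```python
def fill_gaps(image):
    """Fill untraced pixels (None) with the mean of their traced neighbours.

    A crude averaging filter standing in for a real learned denoiser.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if image[y][x] is not None:
                continue  # this pixel was actually traced: keep it
            neighbours = [image[ny][nx]
                          for ny in (y - 1, y, y + 1)
                          for nx in (x - 1, x, x + 1)
                          if 0 <= ny < h and 0 <= nx < w
                          and image[ny][nx] is not None]
            out[y][x] = sum(neighbours) / len(neighbours) if neighbours else 0.0
    return out

# A 3x3 patch where only four of the nine pixels were traced.
sparse = [[1.0, None, 0.5],
          [None, None, None],
          [0.0, None, 0.5]]
dense = fill_gaps(sparse)
```

The point is the trade: trace a fraction of the rays, then spend much cheaper computation reconstructing a plausible full image.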
So, What’s The Catch?

As we mentioned earlier, ray tracing has been around for years. You've seen it used in CGI for tons of movies and TV shows (check out the beautifully ray-traced image of Thanos above), but the reason it's taken so long to implement in video games is the fact that it takes so much computational power to run. Tony Tamasi, vice president of technical marketing at Nvidia, said that a video game needs to run at 60 or even 120 frames per second, meaning it would need to compute each frame in roughly 16 milliseconds, or just 8 at the higher rate.
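The frame budget is simple arithmetic, and it's worth seeing just how brutal it is next to film rendering:

```python
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000 / fps

budget_60 = frame_budget_ms(60)    # about 16.7 ms per frame
budget_120 = frame_budget_ms(120)  # about 8.3 ms per frame

# A film frame that takes 8 hours to render gets over a million times
# more computation time than a game frame at 60fps.
film_frame_ms = 8 * 3600 * 1000
ratio = film_frame_ms / budget_60
```

That ratio, on the order of 1.7 million, is the gulf real-time ray tracing has to close.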
Films are pre-rendered, so they can take anywhere from 8 to 24 hours to render a single frame without any effect on the finished product, because nothing has to happen in real time. Not only that, but more often than not studios also have whole server farms at their disposal. At the time of writing, we are limited by the technology available to us: even the most powerful PCs only have so much GPU power, and let's not even get started on modern games consoles! Games, however, do need to be rendered in real time, so the number of calculations needed to fully apply this technique to gaming is, for now, pretty much physically impossible to achieve.
Recent Attempts

Some games have attempted to take it further. Metro Exodus, for instance, ray traces all of the natural light sources in the game, essentially ray tracing whole scenes. This has effects such as making bedrooms appear darker, with light only breaking through at its source. The subtle changes it brings to the surrounding environment are nothing short of amazing. Many notable differences can be seen with ray tracing enabled: a clear distinction between wet and dry dirt, the way light bounces off the sand and reflects onto stones carrying the hue of the environment, or even how a character's skin looks more natural.
The effect that accurate lighting has on the aesthetics of gameplay is near boundless when it comes to replicating real-life scenes. For example, without ray tracing enabled, using a flashlight is pretty much pointless, but with it turned on, players will have a lot of trouble navigating dark areas without one.
All of this being said, the implementation of ray tracing in Metro Exodus is far from perfect and is really just an early glimpse of what this technology has to offer. It has serious negative effects on the game's performance, even with features like Deep Learning Super Sampling (DLSS) that are designed to counteract this, and this is in a game where only the natural light sources are ray traced.
Ray tracing needs the most powerful of modern graphics cards to run effectively. Most players don't have access to this expensive hardware, and even those who do take a serious performance hit.
All of this combined leaves game developers without much reason to push for its implementation since most of their users can’t benefit from it anyway. Of course, all of this is set to change very soon with the release of the next-gen consoles which have promised the eventual full implementation of ray tracing.
What Does This Mean For Gaming?

Historically, video games have used a technique known as rasterization. This technique renders computer graphics much more quickly than ray tracing, but it's not nearly as realistic. During this process, the computer takes a 3D scene, measures how far objects are from the light source and which objects sit in front of which, and then uses this information to decide what colour each pixel should be, essentially converting 3D graphics into the 2D pixels displayed on your screen.
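The core trick in rasterization can be sketched in a few lines: project the scene onto the pixel grid and, wherever two things land on the same pixel, keep whichever is nearer using a depth buffer. This toy version works on pre-projected coloured points rather than triangles, and the scene is invented for illustration; real GPUs rasterize triangles with heavy hardware acceleration.

```python
def rasterize(points, width, height):
    """Drop 3D points onto a 2D pixel grid, keeping the nearest per pixel.

    points: iterable of (x, y, z, colour), already projected to pixel
    coordinates, with smaller z meaning closer to the camera.
    """
    depth = [[float("inf")] * width for _ in range(height)]
    colour = [[None] * width for _ in range(height)]
    for x, y, z, c in points:
        if 0 <= x < width and 0 <= y < height and z < depth[y][x]:
            depth[y][x] = z   # nearer than anything drawn here so far
            colour[y][x] = c  # so this point wins the pixel
    return colour

# Two points land on the same pixel; the nearer (smaller z) one wins.
scene = [(1, 1, 5.0, "red"), (1, 1, 2.0, "blue"), (0, 0, 3.0, "green")]
frame = rasterize(scene, 2, 2)
```

Notice that no light paths are simulated at all, which is exactly why rasterization is fast, and why its shadows and reflections have to be faked.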
In an interview with Gamers Nexus, Nvidia's Tom Petersen said that it's changing CGI rendering from the artistic way of creating images currently used in gaming to a physics-based way of creating images.
Over time, the current technology has made some notable improvements using tricks like ambient occlusion to add extra shadows independent of light sources. But most games don't really simulate the way light behaves in the real world; they just fake it with artistic techniques. This makes it hard to create shadows and reflections that are truly lifelike, which explains why mirrors are so rarely included in video games.
Thankfully, with the release of Nvidia's ray-tracing-capable graphics cards, we are starting to see things change as PC games begin to take advantage of the hardware's special capabilities. Ray tracing is now slowly but surely supplementing rasterization. Though we are still going to see rasterization for the time being, eventually we will see fully ray-traced lighting engines in games.
The graphics chips going into the next generation of gaming PCs and games consoles will have the rendering power to produce real-time, fully ray-traced scenes, changing gaming forever. How soon we will actually see this happen is currently unknown, but we do know that it's definitely on the way.