Why can’t raytracing be pre-rendered in video game environments?

Raytracing is the process of calculating all the light that enters a camera at a given moment by tracing rays through the scene.

Everything in the scene contributes to the image.

Objects block light from other objects, and light bounces between them until it finally reaches the camera.

Even a completely static scene would be prohibitively expensive to render in advance: you would have to render the scene from every possible camera position and orientation.
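
To get a feel for the scale, here is a rough back-of-envelope calculation as a small C++ sketch; all the numbers in it are invented assumptions for illustration, not figures from any real game:

    #include <cstdio>

    int main() {
        // Illustrative assumptions: a small 100 m x 100 m level, camera
        // positions on a 1 m grid at a single eye height, and a coarse
        // set of 1000 view directions per position.
        const double positions     = 100.0 * 100.0;          // 10,000 camera spots
        const double directions    = 1000.0;                 // per position
        const double bytesPerImage = 1920.0 * 1080.0 * 3.0;  // ~6 MB uncompressed

        const double totalBytes = positions * directions * bytesPerImage;
        std::printf("total: %.0f terabytes\n", totalBytes / 1e12);
        // Prints ~62 terabytes -- before counting vertical movement,
        // finer grids, or anything in the scene changing at all.
        return 0;
    }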

But once anything moves, especially with special effects like fire and smoke, you simply cannot predict in advance the color of the light passing through a given point in a given direction, even for a very simple game.

However, certain parts of the lighting can be precomputed:


Light probes can be rendered in advance. These are essentially precomputed lookup tables (stored in a compact form such as spherical harmonics or a small cube map) that record how light arrives from all directions at certain points in the scene. For static geometry, they can capture shadows and variations in lighting. To get the lighting at an arbitrary point, you interpolate between neighboring probes.
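
As a minimal sketch of that interpolation step, assume probes store second-order spherical harmonics (9 coefficients per color channel, 27 floats in total) on a regular grid; the type names and layout here are invented for illustration:

    #include <array>
    #include <cstddef>

    // A light probe storing 2nd-order spherical harmonics: 9 coefficients
    // per color channel (27 floats). The layout is an assumption.
    struct Probe {
        std::array<float, 27> sh;
    };

    // Linear blend of two probes' SH coefficients. Because SH is a linear
    // basis, interpolating coefficients interpolates the stored lighting.
    Probe lerp(const Probe& a, const Probe& b, float t) {
        Probe out;
        for (std::size_t i = 0; i < out.sh.size(); ++i)
            out.sh[i] = a.sh[i] + t * (b.sh[i] - a.sh[i]);
        return out;
    }

    // Trilinear interpolation between the 8 probes at the corners of the
    // grid cell containing the query point. Corners are ordered by the
    // bit pattern (x, y, z); fx, fy, fz are fractional cell coordinates.
    Probe sampleProbeGrid(const Probe corners[8], float fx, float fy, float fz) {
        Probe x00 = lerp(corners[0], corners[1], fx);
        Probe x10 = lerp(corners[2], corners[3], fx);
        Probe x01 = lerp(corners[4], corners[5], fx);
        Probe x11 = lerp(corners[6], corners[7], fx);
        Probe y0  = lerp(x00, x10, fy);
        Probe y1  = lerp(x01, x11, fy);
        return lerp(y0, y1, fz);
    }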

The problem is that they can only be used for static scenes.

Probes cannot handle changes in geometry (such as buildings being destroyed or doors opening), moving objects that block light (such as a passing truck), or changes in lighting (such as the time of day, a nearby explosion, or a light being switched off). And because they only store blurry, diffuse lighting, they cannot be used to light reflective surfaces.

Of course, you can make them more elaborate: keep multiple probe sets for different times of day, or give individual probes alternative versions for specific lights, but it gets complicated. You can also update them during play, perhaps spending a small processing budget per frame to refresh a few probes inconspicuously, as in the sketch below. But this becomes difficult to manage, can cause visible glitches, and still cannot react quickly to sudden events such as explosions and moving objects.
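
That "small budget per frame" idea might look something like the following sketch; renderProbe() is a hypothetical stand-in for re-rendering one probe (e.g. a small cube map rendered from the probe's position and projected to spherical harmonics), and the scheduling policy is an assumption:

    #include <cstddef>
    #include <vector>

    struct Probe { /* position, SH coefficients, ... */ };

    // Hypothetical stand-in: re-render one probe's lighting.
    void renderProbe(Probe&) { /* render cube map, project to SH */ }

    // Round-robin scheduler: refresh a fixed number of probes per frame
    // so the per-frame cost stays bounded. Slow changes (time of day)
    // creep through, but a full sweep over N probes takes N / budget
    // frames -- which is why sudden events like explosions are missed.
    class ProbeUpdater {
    public:
        ProbeUpdater(std::vector<Probe>& probes, std::size_t budgetPerFrame)
            : probes_(probes), budget_(budgetPerFrame) {}

        void tick() {  // call once per frame
            for (std::size_t i = 0; i < budget_ && !probes_.empty(); ++i) {
                renderProbe(probes_[next_]);
                next_ = (next_ + 1) % probes_.size();
            }
        }

    private:
        std::vector<Probe>& probes_;
        std::size_t budget_;
        std::size_t next_ = 0;
    };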


Skyboxes have been used since the earliest 3D games. A skybox is a conceptually infinite box around the player that shows the parts of the world too far away to ever reach, such as the sky and distant mountains.

They are usually rendered in advance, but they can also be updated in real time for time-of-day and weather effects, as in the beautiful “Horizon: Zero Dawn”.
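
At render time, sampling a skybox uses only the view direction, never the camera position, which is what makes it appear infinitely far away. Here is a sketch of the direction-to-texel mapping for a cube map; the face ordering and sign conventions follow the common +X/-X/+Y/-Y/+Z/-Z layout, but engines differ:

    #include <cmath>

    // Map a view direction to a cube-map face index (0..5) and UV
    // coordinates in [0,1] on that face. Only the direction matters,
    // so the skybox never gets closer no matter how the camera moves.
    void directionToCubeFace(float x, float y, float z,
                             int& face, float& u, float& v) {
        float ax = std::fabs(x), ay = std::fabs(y), az = std::fabs(z);
        float ma, sc, tc;  // major-axis magnitude, raw s/t coordinates
        if (ax >= ay && ax >= az) {
            face = x > 0 ? 0 : 1; ma = ax; sc = x > 0 ? -z : z;  tc = -y;
        } else if (ay >= az) {
            face = y > 0 ? 2 : 3; ma = ay; sc = x;  tc = y > 0 ? z : -z;
        } else {
            face = z > 0 ? 4 : 5; ma = az; sc = z > 0 ? x : -x;  tc = -y;
        }
        u = 0.5f * (sc / ma + 1.0f);
        v = 0.5f * (tc / ma + 1.0f);
    }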


Animated objects are another aspect of games that can be pre-rendered.

Imagine you have a large crowd, or perhaps a forest blowing in the wind. You can pre-render them into small sprite strips from multiple angles and display them as animated billboards: rectangles that always face the player. They look quite good from a distance, and as the player gets closer you can switch to real geometry.
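
A minimal sketch of the billboard part, with invented type names: build a basis whose forward axis points from the quad toward the camera, so the pre-rendered sprite always faces the viewer. This version rotates only around the world up axis, which suits upright sprites like trees and crowd members:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y,
                                         a.z*b.x - a.x*b.z,
                                         a.x*b.y - a.y*b.x}; }
    Vec3 normalize(Vec3 v) {
        float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
        return {v.x/len, v.y/len, v.z/len};
    }

    // Basis for a cylindrical billboard: the quad spins around world Y
    // to face the camera (degenerate if the camera is directly above).
    void billboardAxes(Vec3 quadPos, Vec3 cameraPos,
                       Vec3& right, Vec3& up, Vec3& forward) {
        const Vec3 worldUp = {0.0f, 1.0f, 0.0f};
        Vec3 toCamera = sub(cameraPos, quadPos);
        toCamera.y = 0.0f;                 // constrain to rotation about Y
        forward = normalize(toCamera);
        right   = normalize(cross(worldUp, forward));
        up      = worldUp;
        // Quad vertices: quadPos +/- right * halfWidth + up * height.
        // The angle of 'forward' also selects which of the pre-rendered
        // view strips to display.
    }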


Light scattering is another effect that can be precomputed.

How light is affected by the atmosphere can be precomputed and stored in a texture for fast lookup at render time. This can be used for beautiful sunsets or for accurately rendering planets in space games.
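
A sketch of the precomputation idea, heavily simplified to a single wavelength and a flat atmosphere: for each combination of starting height and view angle, integrate the air density along the ray and store the resulting transmittance exp(-optical depth) in a 2D table that a shader can read instead of integrating per pixel. All constants here are illustrative assumptions:

    #include <cmath>
    #include <vector>

    // Precompute a tiny transmittance lookup table. Rows index the
    // starting height, columns the cosine of the view-zenith angle.
    std::vector<float> buildTransmittanceLUT(int width, int height) {
        const float atmosphereTop = 60000.0f;  // meters (assumption)
        const float scaleHeight   = 8000.0f;   // density ~ exp(-h / H)
        const int   steps         = 64;        // ray-march samples

        std::vector<float> lut(width * height);
        for (int j = 0; j < height; ++j) {
            float h0 = atmosphereTop * j / (height - 1);          // start height
            for (int i = 0; i < width; ++i) {
                float cosZenith = -1.0f + 2.0f * i / (width - 1); // view angle
                // Flat-atmosphere simplification: distance to the top of
                // the atmosphere along the ray (shallow/downward rays
                // are clamped to a fixed path length).
                float dist = (cosZenith > 0.01f)
                           ? (atmosphereTop - h0) / cosZenith
                           : atmosphereTop;
                float ds = dist / steps, opticalDepth = 0.0f;
                for (int s = 0; s < steps; ++s) {
                    float h = h0 + cosZenith * ds * (s + 0.5f);
                    if (h >= 0.0f)
                        opticalDepth += std::exp(-h / scaleHeight) * ds;
                }
                // Fold in an assumed extinction coefficient and store
                // the transmittance along the ray.
                lut[j * width + i] = std::exp(-opticalDepth * 1.0e-5f);
            }
        }
        return lut;
    }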


There are probably many more ways to precompute different aspects of light in games that use raytracing/raycasting.
