The End of Rasterization
We have been interested in the idea of raytracing replacing rasterization for real-time computer graphics for quite a while. At a recent event held by Intel, we were able to meet with a team at Intel dedicated to advancing the performance and quality of raytraced gaming.
Most regular PC Perspective readers will already know what ray tracing is and how it differs from the rasterization rendering methods used in all modern games. It is interesting, though, to see quick definitions of ray tracing and rasterization side by side.
Raytracing (or ray tracing depending on your preferences): Ray tracing is a general technique from geometrical optics of modeling the path taken by light by following rays of light as they interact with optical surfaces. It is used in the design of optical systems, such as camera lenses, microscopes, telescopes and binoculars. The term is also applied to mean a specific rendering algorithmic approach in 3D computer graphics, where mathematically-modelled visualisations of programmed scenes are produced using a technique which follows rays from the eyepoint outward, rather than originating at the light sources. It produces results similar to ray casting and scanline rendering, but facilitates more advanced optical effects, such as accurate simulations of reflection and refraction, and is still efficient enough to frequently be of practical use when such high quality output is sought.
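The core idea in that definition, following rays from the eyepoint outward and testing them against scene geometry, can be sketched in a few lines of code. The snippet below is a minimal, hypothetical illustration (not Daniel Pohl's engine or the OpenRT API): it fires one ray per pixel from the eye through an assumed image plane and intersects each ray with a single hard-coded sphere, printing "#" where the ray hits and "." where it misses.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c  # direction is normalized, so the quadratic's 'a' term is 1
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 0 else None

def trace(width, height):
    # Follow one ray per pixel from an eye at the origin, looking down -z
    # toward an image plane at z = -1. Sphere position/size are arbitrary.
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel center to the image plane in [-1, 1] x [-1, 1].
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            d = (x, y, -1.0)
            n = math.sqrt(sum(v * v for v in d))
            d = tuple(v / n for v in d)
            hit = ray_sphere_hit((0.0, 0.0, 0.0), d, sphere_center, sphere_radius)
            row += "#" if hit is not None else "."
        rows.append(row)
    return rows
```

A real ray tracer extends exactly this loop: on a hit it spawns secondary rays toward lights (shadows) and along reflected or refracted directions, which is where the "more advanced optical effects" in the definition come from.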
The term rasterization can in general be applied to any process by which vector information can be converted into a raster format.
In normal usage, the term refers to the popular rendering algorithm for displaying three-dimensional shapes on a computer. Rasterization is currently the most popular technique for producing real-time 3D computer graphics. Real-time applications need to respond immediately to user input, and generally need to produce frame rates of at least 20 frames per second.
Compared to other rendering techniques such as radiosity and raytracing approaches, rasterization is extremely fast. However, it is not based on physical light transport and is therefore incapable of correctly simulating many complex real-life lighting situations.
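The contrast with the ray tracing approach is clearest in code. Where a ray tracer loops over pixels and asks "what does this ray hit?", a rasterizer loops over primitives and asks "which pixels does this triangle cover?". The sketch below is a simplified illustration of the common edge-function coverage test (names and the ASCII output are my own, not any particular GPU's pipeline), with no depth buffering or shading.

```python
def edge(ax, ay, bx, by, px, py):
    # Signed-area test: positive when point p lies to the left of edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(width, height, v0, v1, v2):
    # Loop over the pixel grid and keep pixels whose center falls on the
    # same side of all three triangle edges (i.e., inside the triangle).
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            px, py = i + 0.5, j + 0.5
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            inside = (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
                     (w0 <= 0 and w1 <= 0 and w2 <= 0)
            row += "#" if inside else "."
        rows.append(row)
    return rows
```

Because each triangle is handled independently with a cheap per-pixel test, this maps extremely well to parallel hardware, which is why rasterization is so fast. But nothing in the loop knows about light paths, so shadows, reflections, and refractions must all be approximated with separate tricks rather than falling out of the algorithm as they do in ray tracing.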
The description of rasterization mentions the very failing for which ray tracing has the answer: it is incapable of correctly simulating light. Lighting is by far the most frequently addressed problem in current game engines, and it is the basis on which we create realistic images on the PC. Both NVIDIA and AMD/ATI have spent years developing ways to more closely simulate light on their GPUs and then helping game designers implement those techniques in shipping titles. The results have been impressive, but nothing will beat the ability of ray tracing to properly render a scene down to the most minute detail.
There is only one problem of course: ray tracing is heavily compute bound. A new Intel employee (and published writer on PC Perspective) is hoping that Intel’s new hardware and his team’s new software will soon bring ray tracing to the gamer for real-time, high resolution applications.
Daniel Pohl’s Raytraced Quake History
For those of you who would like a general-purpose overview of the ray tracing style of game rendering, including its drawbacks and advantages, I HIGHLY suggest you read Daniel Pohl's previous article on the subject right here at PC Perspective. You'll find all the details there; here I will quickly run through some key points and then move on to the newest developments from Daniel and his teammates since his move to Intel.
Quake 3 Ray Traced
The first raytraced game coded by Daniel was a port of the Quake 3 engine and used the textures and designs from the original game. He replaced the rasterization engine with a new raytracing engine built on the OpenRT ray tracing library.
Quake 4 Ray Traced
Daniel soon followed it up with a version of Quake 4 running on a ray tracing engine, and was then discovered by Intel, which saw the potential in this type of rendering engine and hired him to work on a ray tracing project based on future Intel hardware. He has since put a lot of work into the project and has been demonstrating it at shows and gaming expos around the world.
At Fall IDF 2007, I met with Daniel, and he showed me what had changed and provided some VERY interesting information about the project and ray tracing's future.