More questions and closing thoughts
PCPER: How well do you think current programmable GPU technology can handle ray tracing?  Do you think dedicated ray tracing logic is going to be required?

[Image: Crysis, 2007; image courtesy freakygaming.com]

Yerli: Current GPU hardware can handle ray tracing, but it’s not well designed for this problem. The trend is towards more generic programmable massively parallel solutions, so I don’t think we’ll require dedicated ray tracing hardware.
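To put his point in perspective: the core of ray tracing is simple, massively parallel math that programmable GPUs can already run, but traversing a full scene produces the kind of incoherent memory access that current GPUs aren't built around. Below is a minimal C++ sketch (our own illustration, not Crytek code) of the per-ray kernel at the heart of it all, a ray-sphere intersection test:

    #include <cmath>
    #include <optional>

    struct Vec3 { float x, y, z; };

    static float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }
    static Vec3 sub(const Vec3& a, const Vec3& b) {
        return { a.x - b.x, a.y - b.y, a.z - b.z };
    }

    // Solve |origin + t*dir - center|^2 = radius^2 for the nearest t.
    // Returns no value on a miss; assumes dir is normalized.
    std::optional<float> intersectSphere(const Vec3& origin, const Vec3& dir,
                                         const Vec3& center, float radius) {
        Vec3 oc = sub(origin, center);
        float b = dot(oc, dir);
        float c = dot(oc, oc) - radius * radius;
        float disc = b * b - c;
        if (disc < 0.0f) return std::nullopt;   // ray misses the sphere
        float t = -b - std::sqrt(disc);
        if (t < 0.0f) return std::nullopt;      // nearest hit is behind the ray origin
        return t;
    }

Millions of these tests run independently every frame, which is exactly the kind of workload a generic, programmable, massively parallel processor is built for.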

PCPER: Do you see the possibility of combining rasterization and ray tracing in future rendering engines?

Yerli: There are a variety of graphics problems which would suit a hybrid solution of rasterization and ray casting, and that is most likely the way to go. That said, I think there is at least one more generation of almost pure rasterization ahead, so very clearly any proposed graphics hardware architecture will need to perform great in pure rasterization.
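For readers wondering what such a hybrid engine might look like in broad strokes, here is a compilable C++ skeleton; every type and function below is a hypothetical stub of our own, not CryEngine code. The idea: rasterize primary visibility as engines do today, then spend rays only on the effects that suit them.

    #include <cstddef>
    #include <vector>

    // Hypothetical stand-in types; a real engine's G-buffer and shading are far richer.
    struct Color   { float r = 0, g = 0, b = 0; };
    struct Surface { bool mirrorLike = false; };
    struct GBuffer { std::vector<Surface> surfaces; };
    struct Scene   {};

    // Stubs standing in for the hardware raster path and the ray casts.
    GBuffer rasterizePrimaries(const Scene&)   { return { std::vector<Surface>(640 * 480) }; }
    Color   shadeDirect(const Surface&, const Scene&)        { return {}; }
    Color   traceReflectionRay(const Surface&, const Scene&) { return {}; }
    bool    lightVisible(const Surface&, const Scene&)       { return true; }

    // The shape of a hybrid frame: raster for primary visibility, rays for the rest.
    std::vector<Color> renderHybridFrame(const Scene& scene) {
        GBuffer gbuf = rasterizePrimaries(scene);      // fast, fixed-function-friendly
        std::vector<Color> frame(gbuf.surfaces.size());
        for (std::size_t i = 0; i < frame.size(); ++i) {
            const Surface& s = gbuf.surfaces[i];
            Color c = shadeDirect(s, scene);
            if (s.mirrorLike) {                        // cast rays only where they pay off
                Color refl = traceReflectionRay(s, scene);
                c.r += refl.r; c.g += refl.g; c.b += refl.b;
            }
            if (!lightVisible(s, scene)) {             // ray-traced shadow test
                c.r *= 0.3f; c.g *= 0.3f; c.b *= 0.3f;
            }
            frame[i] = c;
        }
        return frame;
    }

The appeal of this split is that the expensive, well-optimized raster path keeps doing what it does best, while rays are budgeted per pixel only where rasterization struggles.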

PCPER: If you have considered any of Intel’s upcoming Larrabee hardware, do you see a future for something that is currently described as a pseudo-x86, 80-core processor?

[Image: Crysis, 2007; image courtesy freakygaming.com]

Yerli: The hardware trend is toward increasing programmability at higher parallelism. There are performance advantages to keeping much of the current rendering pipeline in hardware, so it will be interesting to see whether a completely programmable solution can compete in the near future. If it can, there are some interesting possibilities.

PCPER: Do you expect changes to DX or OpenGL that might include ray tracing?  Or do you see an entirely new API required for ray tracing to catch on for mainstream graphics?

Yerli: I don’t think we’ll see an OpenGL / DX-like API for ray tracing. Instead we’ll see APIs for general-purpose computation on GPU-like hardware, which can be used for ray tracing.
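In other words, ray tracing would just be another workload for a general compute API, dispatched one thread per pixel. As a rough illustration, here is standard C++17 parallelism standing in for a GPU compute API, with a placeholder shader of our own invention:

    #include <algorithm>
    #include <cstdint>
    #include <execution>
    #include <numeric>
    #include <vector>

    // Placeholder for the real per-pixel ray tracing work.
    std::uint32_t tracePixel(int x, int y) {
        return std::uint32_t(x) ^ std::uint32_t(y);   // dummy shading, not a real tracer
    }

    int main() {
        const int width = 640, height = 480;
        std::vector<int> ids(width * height);
        std::iota(ids.begin(), ids.end(), 0);          // one work item per pixel
        std::vector<std::uint32_t> image(ids.size());

        // The same "one thread per pixel" dispatch a GPU compute API would perform.
        std::transform(std::execution::par, ids.begin(), ids.end(), image.begin(),
                       [width](int i) { return tracePixel(i % width, i / width); });
    }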

PCPER: Have you attempted any simple ray tracing routines on current console hardware (namely the Xbox 360 and PS3)?  Which would be the more attractive platform for ray tracing: the PS3 with its Cell processor and seven SPEs, or the triple-core PowerPC in the Xbox 360?

[Image: Crysis, 2007; image courtesy gamespot.com]

Yerli: Neither platform is well suited to ray tracing as a replacement for rasterization in a real-world case. For hybrid solutions, yes, one can afford this to some degree.

PCPER: What are you personally comfortable with?  Pixels that “look good enough” or are “truly accurate”?

Yerli: The basic principle of graphics rendering is to produce output which can be seen; pixels which look good are pixels that are good. So “look good enough” pixels are just fine, considering the intensity and interactivity of games.

PCPER: Intel buys Havok and Project Offset?  What are your thoughts on what Intel might be doing with these?

[Image: Crysis, 2007; image courtesy gamespot.com]

Yerli: I hope Intel will produce technology demos with both Havok and Offset that show the public how they want developers to push the boundaries of real-time graphics, through great practical examples.

Since we have our own scalable physics and graphics engines within CryEngine, these developments don’t directly affect us or our licensees.

Closing Thoughts

Cevat has given us some more food for thought with his answers to our ray tracing questions.  On the positive side for ray tracing development, he seems confident that IF ray tracing catches on as a general rendering method for games, it would develop algorithms and architectural refinements that improve performance and efficiency dramatically, just as rasterization has over the last 20 years.  Anti-aliasing is one such example: raster engines have developed significant efficiency improvements there, and ray tracing would likely do the same over time.
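One concrete example of the kind of efficiency trick he means is adaptive anti-aliasing, where a tracer spends extra rays only on pixels that look like edges.  A minimal sketch of our own (the traceRay() shader here is a placeholder we assume exists):

    #include <cmath>
    #include <vector>

    // Placeholder; a real tracer would shade the scene at this sub-pixel position.
    float traceRay(float x, float y) { return std::fmod(x + y, 1.0f); }

    inline bool differs(float a, float b) { return std::fabs(a - b) > 0.1f; }

    // Re-trace with 4 extra rays only at pixels whose neighbors disagree (likely
    // edges); interior pixels keep their single ray, saving the bulk of the cost.
    void refineEdges(std::vector<float>& img, int w, int h) {
        std::vector<float> out = img;
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                float c = img[y * w + x];
                if (differs(c, img[y * w + x - 1]) || differs(c, img[y * w + x + 1]) ||
                    differs(c, img[(y - 1) * w + x]) || differs(c, img[(y + 1) * w + x])) {
                    float s = c + traceRay(x - 0.25f, y - 0.25f)
                                + traceRay(x + 0.25f, y - 0.25f)
                                + traceRay(x - 0.25f, y + 0.25f)
                                + traceRay(x + 0.25f, y + 0.25f);
                    out[y * w + x] = s / 5.0f;
                }
            }
        img = out;
    }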

Cevat does see a future for ray tracing, but more in the form of a mixed rendering design that probably won’t be implemented for several years; five or more.  By his count, for the next three years or so rasterization will continue to be the dominant rendering method for games, and thus any potential graphics hardware for this market will need to be compatible with rasterization and perform well on it.  Cevat thinks that in a span of three to five years we might begin to see some implementation of ray tracing in games, but not in the pure, classical ray tracing fashion.  Instead we will likely see the hybrid rendering techniques that we have discussed several times in previous interviews: ray tracing for shadows, certain reflective objects, etc.
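The shadow case is the easiest to picture: the scene is still rasterized, but each shaded point fires a single ray toward the light, and any hit along the way means shadow.  Reusing the Vec3 helpers and intersectSphere() from our earlier sketch:

    #include <cmath>
    #include <vector>

    struct Sphere { Vec3 center; float radius; };

    // One shadow ray per shaded point: offset slightly off the surface to avoid
    // self-intersection, then test every occluder between the point and the light.
    bool inShadow(const Vec3& point, const Vec3& lightPos,
                  const std::vector<Sphere>& occluders) {
        Vec3 toLight = sub(lightPos, point);
        float dist = std::sqrt(dot(toLight, toLight));
        Vec3 dir = { toLight.x / dist, toLight.y / dist, toLight.z / dist };
        Vec3 origin = { point.x + dir.x * 1e-3f,
                        point.y + dir.y * 1e-3f,
                        point.z + dir.z * 1e-3f };
        for (const Sphere& s : occluders) {
            auto t = intersectSphere(origin, dir, s.center, s.radius);
            if (t && *t < dist)        // an occluder sits between point and light
                return true;
        }
        return false;
    }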

The good news for NVIDIA and AMD is that, with the increase in programmability in their GPU designs and further progression down that path, Cevat thinks GPU hardware will likely be able to handle the types of ray tracing that game designers would actually implement.  Yes, Intel’s Larrabee design is going to be COMPLETELY programmable, but the necessary flexibility might be matched by upcoming GPU designs from NVIDIA and AMD, limiting the advantages Larrabee might have.  It’s also possible, as Cevat notes, that the partially fixed-function way current-generation GPUs execute enables more efficient graphics rendering, and that the completely programmable design Intel is implementing might not compete in terms of performance.  But that same programmability could also open up a lot of new design ideas and new rendering techniques. 

And the fact remains that pretty much everyone we talk to outside of Intel is confident that rasterization is going to have at least one more dedicated generation in gaming.  That means that if Intel intends to compete in the graphics world in the next 2-3 years, it will have to take on NVIDIA and AMD/ATI on current-model GPU terms: rasterization, DirectX and OpenGL.  By Intel’s own admission they will be compatible with current rasterization models (they said exactly that at the recent IDF show in Shanghai), but how well they compete will really only be answered when the hardware is available to developers and reviewers. 

Many people seem to be calling for NVIDIA’s and AMD’s graphics departments to be beheaded once Intel enters the graphics arena, but the truth of the matter is that Intel has a lot to prove before developers and gamers will take its word on faith.  The debacle that has been Intel’s integrated graphics, though the work of a very different design team, has definitely soured many people’s view of Intel and graphics.  Just look at how the Microsoft Vista-capable issue is panning out, or how developers blame Intel for PC gaming’s current downturn.  To take Intel at its word, that Larrabee is something completely different and will work and perform as claimed, just seems naive.  Is Intel capable of turning out a great graphics technology?  Absolutely: they are Intel, the largest semiconductor design company in the world, with the engineers and finances to do just about anything.  But there have been plenty of other expensive and ambitious “failures” out there too, and some pretty hefty ones: i740 graphics, the Pentium 4, the original Itanium.

But Intel has an uphill battle; I am eager to follow it, document it and see what the future holds for ALL of these players in the PC landscape.

I want to personally thank Cevat Yerli, Doug Binks and Zyad Tikanouine at Crytek for taking the time to talk with me and answer these questions. 

Please join us in the forums to discuss this incredibly interesting information!
