The two remaining filmic effects that Valve discussed with us are not yet available in the Source engine since no hardware available today can effectively apply these effects with anything close to a playable frame rate. That’s right — not even the dual 7800 GTX 512s you have there…
First, a quick technical primer on motion blur in various media types will help us better understand Valve’s desire to correctly implement this feature in the Source engine. Film, TV and games all present a sequence of images to the user, which the user then perceives as motion. These images are nearly always delivered at a fixed frame rate: 24 Hz (or FPS) for film, 29.97 Hz for NTSC video and whatever your monitor’s refresh rate is set at on your PC (typically anywhere from 60 Hz to 100 Hz).
What differs between the film and TV media and the game media is how the images are ‘recorded.’ A film frame is ‘exposed’ to a continuously moving world for 1/24th of a second, so all the motion during that interval is captured as well. We all know this as the typical blur we see in both modern movies and in home videos. This adds what Valve calls ‘weight’ to the image and causes you to perceive more detail than a still image would show. You might even think of this as anti-aliasing in time, rather than in space (which is what current AA algorithms do in modern games).
Games are a different story though; obviously they are not recorded at all, but rendered in real time. Each image that sits on the screen for 1/60th of a second (or whatever your refresh rate dictates) is generated from a single instant in time. These infinitesimal instants carry none of the motion or blurring that film and video have, because the software and hardware render each image from one frozen snapshot of the world.
So how could a game capture that same temporal weight? The answer is somewhat obvious, though daunting from a hardware perspective: instead of rendering one image for every sync interval of the monitor, Valve would like to render a whole series of them over a finite duration of time.
Basically, imagine the Source engine rendering one frame after another as it normally would, but instead of outputting each one to the monitor, rendering them to memory and compounding them on top of one another in an accumulation buffer. Once the finite span of time has been covered (measured either in frames or in physical time), the result would be output to the monitor at its sync time as a single frame.
What you then get is a good simulation of a film camera: it effectively captures the artificial ‘motion’ that has been going on in the game (rendered only to memory), and so the image that reaches the screen carries an impressive motion blur effect; though it almost isn’t really an effect so much as it is a simulation of visual physics.
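The accumulation scheme described above can be sketched in a few lines of Python. Everything here is hypothetical: `render_scene` is a toy stand-in for a real renderer, but the accumulate-then-average loop is the heart of the technique.

```python
# Hypothetical sketch of accumulation-buffer motion blur.
# render_scene() is a toy stand-in for a real renderer: it returns a
# "frame" (a flat list of two pixel intensities) for one instant in time.

def render_scene(t):
    # Two toy "pixels" whose brightness changes as time advances.
    return [t * 10.0, 100.0 - t * 10.0]

def render_blurred_frame(t_start, exposure, subframes):
    """Render `subframes` instants spread across the exposure window,
    sum them into an accumulation buffer, then average the result."""
    accum = [0.0, 0.0]
    for i in range(subframes):
        t = t_start + exposure * i / subframes
        frame = render_scene(t)
        accum = [a + f for a, f in zip(accum, frame)]
    # The average of the sub-frames becomes the single displayed frame.
    return [a / subframes for a in accum]

# One displayed frame at 30 Hz, built from 20 sub-renders.
blurred = render_blurred_frame(t_start=0.0, exposure=1 / 30, subframes=20)
```

Note that the displayed pixel values sit between the brightest and darkest instants of the exposure window, which is exactly the time-averaging a real film exposure performs.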
Doing the math, you’ll realize that if Valve targets a 30 FPS mark for their game, and they decide that 20 accumulated frames are necessary to get the correct amount of motion blur, that would require a total of 600 frames being rendered for each second of game play! Now you no doubt can see why implementing this feature in the Source engine would simply be a joke in its current form with current hardware.
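That arithmetic is easy to verify; note the 30 FPS target and 20 sub-frame count are the article’s example figures, not a published Valve specification.

```python
# Back-of-the-envelope cost of brute-force temporal supersampling,
# using the example figures above (assumed, not Valve's exact numbers).
target_fps = 30           # displayed frames per second
subframes_per_frame = 20  # renders accumulated into each displayed frame

renders_per_second = target_fps * subframes_per_frame
print(renders_per_second)  # 600 full scene renders every second
```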
Video Link – No Effects (~ 45 MB)
Video Link – Split Screen – No Effects vs With Motion Blur (~ 35 MB)
The accumulation buffer structure at work here is really a ‘brute force’ method of implementing a feature like this. It is both the most ‘true’ method and the most hardware intensive. Valve readily admitted that when the motion blur effect does go live in the Source engine, it will most definitely not use this simple yet powerful method; instead it will use a slightly compromised approximation of it to lower the amount of horsepower the GPU needs to render it.
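Valve didn’t detail which approximation they have in mind, but one common cheaper substitute for brute-force accumulation is a velocity-buffer blur: instead of re-rendering the scene many times, each pixel is smeared along its screen-space motion vector. A hypothetical 1-D sketch (this is a generic technique, not necessarily Valve’s):

```python
# Hypothetical 1-D velocity-buffer blur: smear each pixel along its
# (integer) motion vector rather than re-rendering the scene per sub-frame.

def velocity_blur_1d(image, velocity):
    """Average each pixel with the pixels along its motion path."""
    n = len(image)
    out = []
    for x in range(n):
        taps = velocity[x] + 1  # sample the pixel's path, endpoints included
        total = 0.0
        for j in range(taps):
            sx = min(max(x - j, 0), n - 1)  # clamp samples to image bounds
            total += image[sx]
        out.append(total / taps)
    return out

sharp = [0.0, 0.0, 1.0, 0.0, 0.0]
motion = [2, 2, 2, 2, 2]  # every pixel moved two texels this frame
blurred = velocity_blur_1d(sharp, motion)
```

The bright pixel gets spread across the positions it moved through, at a cost of a few texture samples per pixel rather than a full re-render per sub-frame, which is why post-process approximations like this are so much cheaper than a true accumulation buffer.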