Testing Methodology and System Setup

Testing Methodology

Graphics card testing has recently become one of the most hotly debated issues in the hardware enthusiast community, and because of that, testing graphics cards has become a much more complicated process than it once was.  Where you might once have been able to rely on the output of a few synthetic, automated benchmarks to make your video card purchase, that is no longer the case.  Video cards now cost up to $500, and we want to give you as much information as we can to aid your purchasing decision.  We know we can’t run every game or find every bug and error, but we try to do what we can to help you, our reader, and the community as a whole.

With that in mind, all the benchmarks that you will see in this review are from games that we bought off the shelf just like you would.  Of these games, there are two different styles of benchmark that need to be described.

The first is the “timedemo-style” benchmark.  Many of you may be familiar with this style from games like Quake III: a “demo” is recorded in the game and a set number of frames is saved in a file for playback.  When playing back the demo, the game engine renders those frames as quickly as possible, which is why you will often see timedemo-style benchmarks playing back the game much faster than you would ever actually play it.  In our benchmarks, the FarCry tests were done in this manner: we recorded four custom demos and then played them back on each card at each resolution and quality setting.  Why does this matter?  Because in tests where timedemos are used, the line graphs that show the frame rate at each second may not end at the same time for every card, precisely because one card plays the demo back faster than another: less time passes, and thus the FRAPS application records slightly fewer data points to plot.  However, the peaks, valleys, and overall performance of each card are still maintained, and we can make a fair comparison of frame rates and performance.
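As a rough sketch of that effect (the frame count and frame rates below are made-up numbers for illustration, not figures from our testing), a timedemo’s playback length in seconds is simply its fixed frame count divided by the card’s average frame rate:

```python
# Why a faster card's per-second line ends sooner in a timedemo:
# the demo contains a fixed number of frames, so a higher average
# frame rate means fewer seconds of playback (and fewer data points
# for a per-second logger like FRAPS to plot).

DEMO_FRAMES = 6000  # hypothetical recorded demo length

def playback_seconds(avg_fps):
    """Seconds the demo takes to play back at a given average FPS."""
    return DEMO_FRAMES / avg_fps

fast = playback_seconds(100.0)  # faster card: 60 seconds of data
slow = playback_seconds(60.0)   # slower card: 100 seconds of data
print(fast, slow)
```

Both cards render the same 6000 frames, but the faster one produces a shorter per-second trace, which is exactly why the lines on the graphs stop at different points.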

The second type of benchmark you’ll see in this article is the manual run-through of a portion of a game.  This is where we sit at the game with a mouse in one hand and a keyboard under the other, and play the game to get a benchmark score.  This method makes the graphs and data easy to read, but adds another level of difficulty for the reviewer: making the manual run-throughs repeatable and accurate.  I think we’ve accomplished this by choosing a section of each game that provides us with a clear-cut path.  We take three readings for each card and setting, average the scores, and present those to you.  While this means the benchmarks are not exact down to the most minute detail, they are damn close, and practicing this method for many days has made it clear to me that while it is time consuming, it is definitely a viable option for games without timedemo support.

The second graph is a bar graph that shows the average, maximum, and minimum frame rates.  The minimum and average are the important numbers here, as we want the minimum to be high enough not to affect our gaming experience.  While it is up to each individual gamer to decide the lowest frame rate they will tolerate, comparing the Min FPS to the line graph to see how often that minimum occurs should give you a good idea of what your gaming experience will be like with that game and that video card at that resolution.
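To make those numbers concrete, here is a minimal sketch (with fabricated per-second frame rate data, not results from our testing) of how three run-through logs can be reduced to the min/avg/max values shown on the bar graphs:

```python
# Minimal sketch: derive min / avg / max FPS from per-second frame
# rate logs, then average three manual run-throughs of the same
# card and setting.  All numbers are made up for illustration.

def fps_stats(per_second_fps):
    """Return (min, avg, max) for one benchmark run."""
    return (min(per_second_fps),
            sum(per_second_fps) / len(per_second_fps),
            max(per_second_fps))

# Three manual run-throughs at one card/resolution (fabricated data)
runs = [
    [55, 61, 48, 72, 66],
    [53, 63, 50, 70, 64],
    [57, 60, 47, 74, 65],
]

stats = [fps_stats(run) for run in runs]
avg_min = sum(s[0] for s in stats) / len(stats)
avg_fps = sum(s[1] for s in stats) / len(stats)
avg_max = sum(s[2] for s in stats) / len(stats)
print(f"Min: {avg_min:.1f}  Avg: {avg_fps:.1f}  Max: {avg_max:.1f}")
```

Averaging the three readings this way smooths out the small run-to-run variation that manual play inevitably introduces.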

Our tests are based entirely on the second benchmark method mentioned above: the manual run-through.

System Setup and Comparisons

Our testing time with UT3 was long!  I decided to stick to single-card performance for this first run-through in order to cover a wider range of cards.  I tested NVIDIA’s 8800 GTX, the 8800 GTS in both 640MB and 320MB configurations, and their 8600 GTS card.  For AMD, I tested both the HD 2900 XT and the HD 2600 XT.

In our benchmark results, you’ll see two pages of graphs for each map.  The first set of graphs tests the top three cards, the 8800 GTX, 8800 GTS 640MB and the HD 2900 XT, at resolutions of 1600×1200, 2048×1536 and 2560×1600.  The second set of graphs will look at the three cheaper cards, the 8800 GTS 320MB, 8600 GTS and HD 2600 XT, at resolutions of 1600×1200 and 2048×1536.

Another testing note: anti-aliasing was not enabled for any of these tests.  Just like the previous Unreal Engine 3 games we have tested (Rainbow Six: Vegas and BioShock), the game does not have any options for AA, and forcing AA on in the driver control panel resulted in no image quality differences.  As for future AA support, I am told that by the second demo release and the retail release, Epic should have a patch out for Unreal Engine 3 that will enable AA support across the board.  Let’s hope so; missing such a standard feature in an engine like this is just crazy!

Test System Setup

Processor
Intel Core 2 Extreme X6800 – Review

Motherboard
EVGA nForce 680i Motherboard – Review
Intel 975XBX Motherboard (for CrossFire testing)

Memory
Corsair TWIN2X2048-8500C4

Hard Drive
Western Digital Raptor 150 GB – Review

Sound Card
Sound Blaster Audigy 2 Value

Video Card
AMD ATI Radeon HD 2900 XT – Review
AMD ATI Radeon HD 2600 XT
EVGA GeForce 8800 GTS 640MB
NVIDIA Reference GeForce 8800 GTX – Review
NVIDIA GeForce 8800 GTS 320MB
NVIDIA GeForce 8600 GTS

Video Drivers
AMD Catalyst 7.10 Beta
NVIDIA ForceWare 163.69

Power Supply
PC Power and Cooling 1000 watt

DirectX Version
DX10 / DX9c

Operating System
Windows Vista Ultimate 64-bit

Game
Unreal Tournament 3 Demo
UT3 Demo Test Settings

[Screenshots: UT3 Demo test settings screens]

There aren’t a whole lot of graphics options in the demo right now, but we did set the texture and world detail levels to their peak of 5.  V-Sync was disabled, though the game does have a 60 FPS frame rate cap, so scores on the higher-end cards are going to look closer together than they might otherwise be.

You might notice the “hardware physics” box in the first setup screen; we didn’t test that here today, but we’ll definitely be plugging in an AGEIA PhysX card soon to see what changes!