Testing Methodology and System Setup

GPU Performance Testing Methodology

Graphics card testing has recently become one of the most hotly debated issues in the hardware enthusiast community, and because of that, testing graphics cards has become a much more complicated process than it once was.  Where you might once have been able to rely on the output of a few synthetic, automated benchmarks to make your video card purchase, that is simply no longer the case.  Video cards now cost up to $500, and we want to make sure we give the reader as much information as we can to aid in your purchasing decision.  We know we can't run every game or find every bug and error, but we do what we can to aid you, our reader, and the community as a whole.

With that in mind, all the benchmarks you will see in this review are from games that we bought off the shelf just like you would.  Among these games, there are two different styles of benchmark that need to be described.

The first is the "timedemo-style" benchmark.  Many of you may be familiar with this style from games like Quake III: a "demo" is recorded in the game, and a set number of frames are saved to a file for playback.  When playing back the demo, the game engine renders those frames as quickly as possible, which is why you will often see "timedemo-style" benchmarks playing the game back much faster than you would ever actually play it.  In our benchmarks, the FarCry tests were done in this manner: we recorded four custom demos and then played them back on each card at each resolution and quality setting.  Why does this matter?  Because in the tests where timedemos are used, the line graphs show the frame rate at each second, and the plot for each card may not end at precisely the same time: a faster card plays the demo back in less wall-clock time, so the FRAPS application records slightly fewer frame rate samples to plot.  However, the peaks, valleys, and overall performance character of each card are still maintained, and we can make a fair comparison of frame rates and performance.
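To illustrate why a faster card's line ends earlier on those graphs, here is a small sketch of the arithmetic involved.  All numbers are hypothetical; a timedemo simply contains a fixed number of frames, so a per-second logger like FRAPS collects fewer samples for the card that finishes sooner.

```python
# Hypothetical sketch: a timedemo contains a fixed number of frames.
# A faster card renders them in less wall-clock time, so a per-second
# frame rate logger (such as FRAPS) collects fewer samples for it.

def seconds_of_samples(total_frames: int, avg_fps: float) -> float:
    """Wall-clock seconds needed to render a fixed-length demo."""
    return total_frames / avg_fps

DEMO_FRAMES = 6000  # assumed demo length in frames

fast_card = seconds_of_samples(DEMO_FRAMES, avg_fps=100.0)  # 60 seconds of data
slow_card = seconds_of_samples(DEMO_FRAMES, avg_fps=75.0)   # 80 seconds of data

# The faster card's line graph simply stops 20 seconds earlier,
# even though both cards rendered the exact same frames.
print(fast_card, slow_card)
```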

The second type of benchmark you'll see in this article is the manual run-through of a portion of a game.  This is where we sit at the game with a mouse in one hand and a keyboard under the other, and play the game to get a benchmark score.  This method makes the graphs and data easy to read, but adds another level of difficulty for the reviewer: making the manual run-throughs repeatable and accurate.  I think we've accomplished this by choosing a section of each game that provides us with a clear-cut path.  We take three readings for each card and setting, average the scores, and present those to you.  While this means the benchmarks are not exact to the most minute detail, they are damn close, and practicing this method for many days has made it clear to me that, while time consuming, it is definitely a viable option for games without timedemo support.
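The three-run averaging described above amounts to nothing more than the following sketch.  The card names and FPS readings are invented purely for illustration:

```python
# Hypothetical sketch of the three-run averaging used for manual run-throughs.
# Each card/setting combination is played three times; the scores are averaged.

runs = {
    "Card A @ 1600x1200": [61.2, 59.8, 60.4],  # made-up FPS readings
    "Card B @ 1600x1200": [48.9, 50.1, 49.6],
}

# Average the three readings for each card/setting combination.
averages = {name: sum(scores) / len(scores) for name, scores in runs.items()}

for name, avg in averages.items():
    print(f"{name}: {avg:.1f} FPS average over {len(runs[name])} runs")
```

Averaging three runs smooths out the small variations that come from playing the section by hand rather than replaying a recorded demo.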

The second graph is a bar graph that shows the average frame rate, the maximum frame rate, and the minimum frame rate.  The minimum and average are the important numbers here, as we want the minimum to be high enough not to affect our gaming experience.  While each individual gamer will decide what the lowest acceptable frame rate is, comparing the minimum FPS to the line graph and seeing how often that minimum occurs should give you a good idea of what your gaming experience will be like with that game, that video card, and that resolution.
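Given the per-second frame rate samples behind the line graph, the bar-graph figures reduce to a min/avg/max over that series.  A sketch with invented sample data, including the "how often does the minimum occur" check described above:

```python
# Hypothetical sketch: deriving the bar-graph figures (min / avg / max FPS)
# from a per-second frame rate log like the one behind the line graphs.

per_second_fps = [58, 62, 41, 33, 47, 66, 71, 38, 55, 60]  # invented samples

min_fps = min(per_second_fps)
max_fps = max(per_second_fps)
avg_fps = sum(per_second_fps) / len(per_second_fps)

# How often the frame rate dips near the minimum tells you more than the
# minimum alone: count the samples within 10% of the minimum.
dips = sum(1 for f in per_second_fps if f <= min_fps * 1.10)

print(f"Min {min_fps}, Max {max_fps}, Avg {avg_fps:.1f}, near-min samples: {dips}")
```

A single one-second dip to the minimum matters far less than a minimum the card returns to repeatedly, which is why the line graph and the bar graph are meant to be read together.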

Our tests here are based entirely on the second type of benchmark method mentioned above: the manual run-through.

System Setup and Comparisons

Testing the graphics performance of Skulltrail was a many-fold job: we tested single-card and dual-card graphics performance as well as performance scaling with SLI and CrossFire technologies.  Our comparison system is based on a dual-core, single-CPU platform and should indicate how much of a difference a move to an 8-core system makes for high-end gaming.
 

Graphics Performance Test System Setup

CPU

Intel Core 2 Extreme X6800

Motherboards

EVGA nForce 680i Motherboard
Intel 975XBX Motherboard (for CrossFire testing)

Memory 

Corsair TWIN2X2048-8500C4

Hard Drive

Western Digital Raptor 150 GB

Sound Card

Sound Blaster Audigy 2 Value

Video Card

AMD Radeon HD 3870 512MB
NVIDIA GeForce 8800 Ultra 768MB

Video Drivers

AMD Catalyst R680 Beta
NVIDIA Forceware 169.28

Power Supply

PC Power and Cooling 1000 watt

DirectX Version

DX10 / DX9c

Operating System

Windows Vista Ultimate 64-bit


Skulltrail Test System Setup

CPU

Intel Core 2 Extreme QX9775

Motherboards

Intel Skulltrail D5400XS motherboard

Memory 

2 x 2GB Crucial DDR2-800 FB-DIMM

Hard Drive

Western Digital Raptor 150 GB

Sound Card

Sound Blaster Audigy 2 Value

Video Card

AMD Radeon HD 3870 512MB
NVIDIA GeForce 8800 Ultra 768MB

Video Drivers

AMD Catalyst R680 Beta
NVIDIA Forceware 169.28

Power Supply

PC Power and Cooling 1200 watt

DirectX Version

DX10 / DX9c

Operating System

Windows Vista Ultimate 64-bit


The games used for our GPU performance testing:

  • Call of Duty 4
  • Lost Planet
  • World in Conflict
  • Crysis

CPU Performance Testing Methodology

For our CPU and platform testing, we used our standard CPU benchmarking setup to compare the Skulltrail platform to our compilation of dual-core and quad-core processors from both Intel and AMD.

[Images: standard CPU benchmarking test system configuration tables]

Obviously our Skulltrail test bed is different from the configurations in the tables seen here, but it is exactly the same as the Skulltrail configuration listed in our GPU performance test setup above.  (Though, to be fair, Skulltrail was configured with an NVIDIA 8800 GTX for CPU testing.)
