Pricing, Testing Setup and why we aren’t testing CrossFire

The details of the GTX TITAN graphics card and its pricing have been quite the topic of discussion, based on the comments on our first article and the YouTube video posted on Tuesday.  It seems the $999 price tag has set quite a few people off, knowing that in some cases the new GK110-based part will be slower than the GeForce GTX 690 and only modestly faster than a single GTX 680. 

We obviously have a lot of performance numbers to look at before we can make those distinctions, but there may be valid complaints there.  Here is the pricing stack for the cards we are testing today:

Testing Configuration

The specifications for our testing system haven't changed much.

Test System Setup

CPU               Intel Core i7-3960X Sandy Bridge-E
Motherboard       ASUS P9X79 Deluxe
Memory            Corsair Dominator DDR3-1600 16GB
Hard Drive        OCZ Agility 4 256GB SSD
Sound Card        On-board
Graphics Cards    NVIDIA GeForce GTX TITAN 6GB
                  NVIDIA GeForce GTX 690 4GB
                  AMD Radeon HD 7970 3GB
                  NVIDIA GeForce GTX 680 2GB
Graphics Drivers  AMD: 13.2 beta 5
                  NVIDIA: 313.97 beta (GTX 680, 690)
                  NVIDIA: 314.09 beta (GTX TITAN)
Power Supply      Corsair AX1200i
Operating System  Windows 8 Pro x64

What you should be watching for

  1. GTX TITAN vs GTX 680 vs HD 7970 – Does the new GeForce GTX TITAN become the fastest GPU on the market and by how much?
  2. GTX TITAN vs GTX 690 – From a single card perspective, how do the two flagship offerings from NVIDIA with $999 price tags compare?
  3. GTX TITAN vs GTX 680 SLI – A pair of GTX 680s in SLI will cost you about $930 so they are basically the head-to-head competition for a single TITAN.


Why are you not testing CrossFire?

One question I know will be asked about this review is why our benchmarks today do not include results from AMD CrossFire configurations in 2-Way or 3-Way combinations.  AMD is represented only by a single Radeon HD 7970 GHz Edition card, while on the NVIDIA side we are using the GTX 690 dual-GPU card, GTX 680s in SLI, and GTX TITANs in SLI. 

Why the bias?

If you haven't been following our sequence of stories investigating a completely new testing methodology we are calling "Frame Rating", then you are really missing out.  (Part 1 is here, part 2 is here.)  The basic premise of Frame Rating is that the performance metrics the industry gathers using FRAPS are inaccurate in many cases and do not properly reflect the real-world gaming experience the user has.

Because of that, we are working on another method that uses high-end dual-link DVI capture equipment to directly record the raw output from the graphics card, with an overlay technology that allows us to measure frame rates as they are presented on the screen, not as they are reported to the FRAPS software sub-system.  With these tools we can measure average frame rates, frame times and stutter, all in a way that reflects exactly what the viewer sees from the game.
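To make the distinction concrete, here is a minimal sketch (not PC Perspective's actual Frame Rating tooling, and the function name and the stutter measure are our own illustrative choices) of the kind of math involved: given the timestamps at which frames actually appear on screen, you can derive average frame rate, per-frame frame times, and a simple stutter figure.

```python
def frame_metrics(timestamps_ms):
    """Hypothetical example: timestamps_ms are the times (in ms) at which
    each frame appeared on screen, as recovered from captured video."""
    # Frame time = gap between consecutive on-screen frames.
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    # Average FPS over the whole capture window.
    avg_fps = 1000.0 * len(frame_times) / (timestamps_ms[-1] - timestamps_ms[0])
    # Crude stutter metric: the largest jump between consecutive frame times.
    stutter = max(abs(b - a) for a, b in zip(frame_times, frame_times[1:]))
    return avg_fps, frame_times, stutter

# A smooth run and an uneven run can report similar average FPS while
# feeling very different; the frame-time data exposes the difference.
smooth = [i * 16.7 for i in range(10)]                      # steady ~60 FPS pacing
uneven = [0, 5, 33.4, 38.4, 66.8, 71.8,
          100.2, 105.2, 133.6, 138.6]                       # alternating short/long frames

print(frame_metrics(smooth)[0], frame_metrics(smooth)[2])
print(frame_metrics(uneven)[0], frame_metrics(uneven)[2])
```

This is exactly why a FRAPS-style average can look healthy while the on-screen experience stutters: the average hides the alternating short/long frame pattern that the frame-time data makes obvious.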

We aren't ready to show our full sets of results yet (soon!), but the problem is that AMD's CrossFire technology shows severe performance degradation when viewed under the Frame Rating microscope that does not show up under FRAPS.  As such, I decided it would simply be irresponsible of me to present data to readers that I would then immediately refute on the final pages of this review; it would be a waste of time for the reader, and people who skip straight to the performance graphs wouldn't know our theory on why the results displayed were invalid.

Many other sites will use FRAPS, will use CrossFire, and there is nothing wrong with that at all.  They are simply presenting data that they believe to be true based on the tools at their disposal.  More data is always better. 

As I said, check the last two pages or so of this article for more information on this, and feel free to leave your feedback below!
