GPU Testbed – Sandy Bridge-E, X79, New Games
For the Radeon HD 7970 3GB review (and all those going forward) we decided it was high time we replaced the somewhat dated Nehalem-based infrastructure (even though, honestly, it was fast enough) with something a bit more current. Obviously that meant going with the new Intel Sandy Bridge-E processor and X79 motherboard. With support for 40 PCI Express lanes and room for three to four full-size graphics cards, the combination makes for the perfect GPU test base.
From this point on, our reviews will be based around the following system:
- Intel Core i7-3960X CPU
- ASUS P9X79 Pro motherboard
- Corsair DDR3-1600 4 x 4GB Vengeance memory
- 600GB Western Digital VelociRaptor HDD
- 1200 watt Corsair Professional Series power supply
- Windows 7 SP1 x64
The ASUS P9X79 Pro
The Intel Core i7-3960X gives us the fastest consumer-level CPU on the market to help eliminate the possibility of any processor-based bottlenecks in our testing (whenever possible). There are still going to be some games that could use more CPU speed (Skyrim comes to mind), but for our purposes this is as good as it gets without resorting to overclocking. The ASUS P9X79 Pro motherboard has enough space for three dual-slot graphics cards when the time comes for testing 3-Way SLI and CrossFire, and eight DIMM slots should we want to go up from our current setup of 16GB of Corsair Vengeance memory.
I chose to stick with the 600GB VelociRaptor hard drive rather than an SSD, as our total installation size with Windows 7 SP1 x64 and six or more games was already hitting the 115GB range. Finally, the 1200 watt power supply from Corsair offers up more than enough juice for three power-hungry graphics cards while running quietly enough not to drastically throw off our noise testing.
Speaking of noise, for this article we are re-introducing our sound level testing thanks to the Extech 407738 Sound Level Meter, capable of monitoring decibel levels as low as 20 dB. This allows me to accurately report the noise levels generated by the graphics cards we test at PC Perspective.
Along with the new hardware configuration comes a host of new games. For this review we will be using the following benchmarks and games for performance evaluation:
- Battlefield 3
- Elder Scrolls V: Skyrim
- DiRT 3
- Batman: Arkham City
- Metro 2033
- Deus Ex: Human Revolution
- 3DMark11
- Unigine Heaven v2.5
This collection of games is current and covers several different genres as well – first-person role-playing, third-person action, racing, first-person shooting, etc. 3DMark11 and Unigine Heaven give us a way to see how the cards stack up in a more synthetic environment, while the real-world gameplay testing provided by the six games completes the performance picture.
With a soaring price tag of $999, the GeForce GTX 690 4GB has few, if any, direct competitors. We are going to break the review into two segments: best single graphics card and best dual-GPU solution. The first is simple: what are the best single cards on the market to go up against the GTX 690? You have the new GTX 680, the Fermi-based dual-GPU GTX 590 and the dual-GPU Radeon HD 6990. Both the GTX 590 and the HD 6990 are impossible to find, but they are the reigning kings.
The second comparison will look at the new GTX 690, a pair of individual GTX 680 cards in SLI and a pair of Radeon HD 7970 3GB cards in CrossFire.
Single card comparison:
- NVIDIA GeForce GTX 690 4GB – $999
- NVIDIA GeForce GTX 680 2GB – $499
- NVIDIA GeForce GTX 590 3GB – $700 (EOL)
- AMD Radeon HD 6990 4GB – $799 (EOL)
Dual-GPU comparison:
- NVIDIA GeForce GTX 690 4GB – $999
- NVIDIA GeForce GTX 680 2GB SLI – $998 ($499/each)
- AMD Radeon HD 7970 3GB CrossFire – $958 ($479/each)
We used the latest driver versions on the HD 7900 cards (Catalyst 12.4) and for NVIDIA’s new GTX 690 we had a pre-release version of the 301.33 driver.
The comparisons you should be paying particular attention to:
- NVIDIA GTX 690 4GB vs GTX 590 3GB and HD 6990 4GB – The battle for the fastest single graphics card in the world starts here…
- NVIDIA GTX 690 4GB vs GTX 680 SLI – Is there a noticeable performance difference between the single card SLI solution and getting two separate GTX 680 cards and running SLI?
- NVIDIA GTX 690 4GB vs HD 7970 3GB CrossFire – While we still don’t have a Radeon HD 7990 in our hands, how close does a pair of HD 7970 cards actually get us? Can the upcoming AMD New Zealand card keep up?
Now, with that out of the way, let’s get on with the results and see how the new Kepler dual-GPU card performs!
Am I the only one out there that thinks this card is too expensive? This is what’s not making sense to me: the GTX 580 was $599 when it came out, the GTX 590 was $699-799 when it came out, the GTX 680 is $499, and the GTX 690 is $1000.
I am sorry, but that cost is way too high for a single video card, regardless of whether it has 2 chips. I hope NVIDIA falls flat on their face. But as they say, there are a lot of stupid people who will buy this card at that price.
Blame AMD for it. They are late with HD 7990.
The reason the price is so high is because IT IS what they say it is: the fastest card on the planet. Let me put it simply: the 680 = a $500K Ferrari; the 690 = a million-dollar F1 car. The 690 is for those enthusiasts with the wallet for it.
I think with the 580 and 590, the 590’s performance didn’t match that of two 580s in SLI while the 690 does match two 680s in SLI. I think from that perspective the price makes some sense. I’m not happy to see the return of the thousand dollar video card, but there are other options too.
Although the Mars 2 was considerably more card than two 580s, it wasn’t twice the price. Then there is the whole issue of it being impossible to actually buy a Mars card.
I don’t see what’s wrong with a dual-GPU card being $1,000 if it’s okay for the single-GPU card to be $500. Does that mean you think two single-GPU cards for $500 each is okay, but stuffing both onto a single $1,000 card is just “too much”?
Yeah, it’s overpriced, but exactly what price point would you plan on hitting with the different offerings they have right now, without unnecessarily and severely undercutting the competition?
Besides, chances are prices will be lower in a few months.
No, you’re the only one out there that is publicly complaining about the price tag of this card and how broke you are! Lol… calling people who can afford to burn their $$ on this card idiots? Really? I have this card and it smashes every game I load on it with all the settings set to ultra, and while my eyes can only pick up 30 fps, believe you me, the gameplay is excellent. In Crysis, all settings are set to maximum and I’m not getting any tearing or stutter: very smooth gameplay, and in my book well worth the buy. I’ll be looking at buying a second card next month. Why? Because I run x3 monitors and I will be playing 3D Surround, but the other reason is because I can, not because I’m an idiot.
Very awesome.
But every time messages like these pop up, my first thought is: ‘you wasted a bucketload of cash’.
Of course it’s entirely up to you, but the funny thing is that in the price segment of $500 and up, you are spending tons of money on percentiles of performance gain.
Don’t kid yourself.
After seeing this review I’d rather wait for GTX 680 prices to come down and SLI them. I have one already and am not having issues running any of my games at their highest settings at my current monitor’s resolution (1600x1200).
You should honestly get a new monitor; you aren’t taking advantage of the 680.
But keeping this resolution will extend the life of my card because it will have to push fewer pixels.
If you wait any longer, you will likely never get a 690 GTX. I highly doubt they will make many of these cards. You can’t get a 590 anymore, that’s for sure.
I got my GTX 590 two weeks ago :p
You all seem to be forgetting that the 690 also has twice the VRAM of a single 680. This will matter if you plan on playing with 3 screens. I have 3 GTX 570 2.5GB cards, and Battlefield 3 is using just under 2GB of VRAM with the game not even maxed out. FYI, two 680s with 2GB of VRAM do not add up to 4GB of VRAM in SLI; the system can only utilize the VRAM on one card. So there you have it: the 690, in my opinion, is actually worth more than two 680s, due to the 4GB of VRAM.
It’s the same deal on the 690. In the case of normal SLI, the textures need to be loaded into the memory of both cards, and thus the memory is not added together. The same is true for the 690: it has 2GB per core, and those same textures need to be loaded into both, so the memory per core is exactly the same. In the case of these dual-GPU cards, you always halve the memory it says on the box to get an accurate reading of the amount of memory that can actually be used.
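To illustrate the point above, here is a minimal sketch in Python of how mirrored memory works out in practice; the capacities and card references are placeholder figures for illustration, not measurements.

```python
# Effective (usable) VRAM when frame data is mirrored across GPUs,
# as in SLI/CrossFire and on dual-GPU boards. Numbers are illustrative.

def effective_vram(total_vram_gb: float, gpu_count: int, mirrored: bool = True) -> float:
    """Return the amount of VRAM a game can actually address."""
    if mirrored:
        # Textures and frame buffers are duplicated on every GPU,
        # so only one GPU's worth of memory is usable.
        return total_vram_gb / gpu_count
    return total_vram_gb  # hypothetical non-mirrored case

print(effective_vram(4.0, 2))  # dual-GPU card sold as "4GB" -> 2.0 GB usable
print(effective_vram(2.0, 1))  # single 2GB card             -> 2.0 GB usable
```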
Would there be any difference between SLI 680s and a single 690? I would think that although the 680s will take more space, each having a dedicated slot would improve bandwidth (or maybe not, because of the SLI adapter?).
I am also very curious about the actual pros and cons, performance-wise, when comparing this card to two 680s in SLI. I am not savvy on multi-card setups, but on the surface I see the benefits of the 690 only using a two-slot form factor and drawing less power, though I worry about a single point of failure with a very complex and hot-running component. I would love to know if there are any performance advantages gained by having it all on one PCB vs. two cards in SLI.
I thought someone might ask that – https://pcper.com/news/Graphics-Cards/Why-would-you-want-buy-GTX690
The GTX 690 does make better sense than a 680 SLI setup if you are buying right now. If I were in the market though, I’d simply buy a 680, wait a year for the prices to come down, and then buy another. Yeah, I know, you’re not getting the insane frame rates right now, but seriously, if you need something faster than a 680 at this time…let’s just say I’d love to see the monitor setup!
Unfortunately, I have seen benchmarking that suggests quad SLI (dual 690s) actually performs worse than triple SLI or possibly even dual SLI.
The benching was not entirely conclusive as to whether this was a driver limitation, application (game) limitation, or a combination of the two.
Tri and quad SLI are well-known to scale badly, so this is nothing new.
The best scaling is achieved with Dual SLI. Above that, consider upgrading your card a notch instead of adding a third or fourth.
That’s not to mention the issues concerning heat dissipation and power supply. For those last two the 690 offers a (partial) solution, but as for the primary argument, the 690 changes nothing about the scalability of quad SLI.
Bjorn3D did a comparison between two GTX 680s and the GTX 690.
Well…so did we…?
Ryan,
In your 3D Vision performance investigations with the GTX 690, how did you get those high averages when V-Sync is enabled with 3D Vision?
Always go with the single-GPU option, for anyone thinking of going SLI. I have SLI 570s right now and it’s beautiful when it works, but a LOT of games don’t properly support it and never will. Until the technology gets better, go for a better single GPU, IMO. Maybe not one that costs $1k, but still.
I personally have yet to discover a game that would actually benefit from SLI but doesn’t support SLI setups.
Sure, your retro game collection isn’t supported, but who needs an SLI setup for that?
Far Cry 3 was playable in SLI from day one, and I had zero issues with the 310.70/64 drivers: a steady 80-110 fps with all possible candy on the screen. And this is with dual GTX 660s, even.
In the case of my dual GTX 660 setup, I have a GTX 680 equivalent for 120 euros less than the cost of a 680. Done deal.
Resolution is the key to what graphics card you should purchase. If you do not play at 2560 or higher, then you are wasting your money on this card. As I play at 5760 x 1080 I am very interested, as my current 2 x 580 Classifieds actually draw enough power to trip my breaker if my wife turns on the bathroom lights. I DO wish they had more VRAM on the cards, but for now it works just fine.
Compared to two-way GTX 680 SLI, the two-way setup beats the 690 by about 2-4%. But the 690 runs quieter, cooler, and draws about 30-50 watts less power.
You will not notice that minor percentage loss in games. Get one from a reputable card partner and just RMA your 690 if one or both GPUs on the card fail.
I am going from a 580 Classified to the 690 for my 30-inch monitor (2560 x 1600 resolution).
You are a graphics monster sir!
I would also reckon that the 690 has less noticeable frame drops or micro-stutter than dual 680s, no?
Even though currently that is hardly an issue, it still doesn’t look nice in benchmarks 🙂
Wow. I’m surprised no one has mentioned this yet: Newegg is already sold out of the GTX 690 and they marked it up $200! I’ve posted a few times in response to people complaining about the $999.99 price tag on this GTX 690, saying that I think it’s a great price considering GTX 680s are currently selling for $550-650 retail. Of course, now Newegg lists the 690 at $1199.99. Such is supply & demand.
Yeah, the price hikes are a pain in the ass and kind of hit NVIDIA in the face, but there is really nothing they can do about it without getting into some other legal issues.
Why on earth are there 6990 results, but no single 7970 results? Why are there 7970 CF results, but not a single 7970?
Very strange, Ryan.
For our single card comparison graphs, there are only four spots. The GTX 680 is the faster of the two current generation single-GPU solutions, so it seemed more relevant to include IT rather than the HD 7970. If you are still curious about how the HD 7970 compares to the GTX 680, I recommend checking out this article: https://pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-680-2GB-Graphics-Card-Review-Kepler-Motion
Or even this: https://pcper.com/reviews/Graphics-Cards/Galaxy-GeForce-GTX-680-2GB-Graphics-Card-Review
Finally, thanks Ryan for posting 3D Vision performance.
The only problem is, how does it show the 3D FPS going over 60 fps? 3D Vision has an FPS cap of 60.
Great review PCPer!
One thing I’d like to see though: with NVIDIA’s Adaptive V-Sync and/or other Kepler/driver tricks, does the GTX 690 exhibit micro-stutter?
BTW, I am SO happy that NVIDIA is really taking on the V-Sync problems (yay Kepler!), ’cause that’s one of the most annoying artifacts, that and micro-stutter on SLI setups. ^.^
Whether I get two GTX 680s or a GTX 690 will solely depend on the price. If the price is the same I will most likely go for the 690 as it is beautiful and quiet.
Don’t forget about your mobo. I’ve read you need to be running PCI-E 3.0 x16 to get the full benefit of the 690.
Thanks for the awesome benchmarks, PCPer! Been looking for these everywhere. I just bought x3 ASUS 27-inch 3D monitors. I have one 680 and another waiting on order……
Thinking about upgrading to 670 tri-SLI, but that would mean a mobo upgrade :(
Well, the fact that it is overpriced is still a fact. But if you can handle money and save properly, you would actually be able to afford the card. Don’t spend money on crap, and learn to save!
If you use one card and have a three-monitor 3D display, would it have very low fps?
Look, nobody at all, NOBODY, needs to buy the top-of-the-line cards. You know why? I’ve got an AMD Radeon 6790, and I will have that card for 4 years. After those 4 years, yes, it might drop in performance a little bit. But that’s the exact reason CrossFire was invented: then you just get ANOTHER 6790, which in 4 years will cost half of what you purchased it for.
I know from my own research that ALL top-of-the-line cards over 900 frigging dollars are 150% pointless to buy. There are NO games out there TODAY, in the 21st century, that NEED that type of card. Therefore, YES, it (MIGHT) outperform other cards IF there were games that required that card. BUT the only highly intense game at the moment that requires maximum rendering for its physics and a gorgeous picture for the best performance and gameplay experience… is Battlefield 3.
Now many can whine and moan: “Oh, it’s Crysis 2.” It’s not. I played Crysis on high settings with my six-year-old Radeon 5450, and I still have it to this day. Don’t tell me it’s the most stressful game; it hasn’t been proven. Since Battlefield 3 came out it has picked up countless awards, many given for its intense, realistic gameplay: Destruction 2.0, as many like to call it, or “Physics 2.0” more truthfully. I’ll tell you now, try playing on the office maps in Close Quarters, shoot a .50 cal down the cubicles, and tell me those physics don’t require rendering up the ass.
Spending 900 dollars on a graphics card that will be useless in 10 years, and only needed in 50 when Battlefield 4000 or whatever comes out, is absolutely stupid. Until gamers get the right knowledge into their heads, they’ll never learn, and big companies will continue to make cheap products for big bucks that people can brag about and be smug over.
Sorry to all the people buying 6990s and GTX 690 cards, but my AMD Radeon 6790 can outperform any of those cards.
Remember… it matters what MODEL, it matters what YEAR, and it matters whether you give a shit to clean it. Take care of something and you won’t need to buy new crap for your computer every 3 years because “it got screwed up with dust residue,” or the fan motor died, or the circuit board fried, or you didn’t wear an anti-static strap while fixing your computer, or your computer keeps shutting down from overheating. No, your computer shuts down because you don’t have enough power distribution cycling through it. Multiple rails are bad, people… Do the calculations of how many watts your computer needs, as well as amps; you’ll find it’s not too hard, and you really don’t need to buy a super-powerhouse 1000-watt power supply.
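As a rough illustration of the wattage-and-amps math mentioned above, here is a short Python sketch; every component draw below is a hypothetical placeholder, so substitute the figures from your own parts’ spec sheets.

```python
# Rough PSU sizing estimate. All wattages are hypothetical placeholders.
components_watts = {
    "cpu": 130,             # e.g. a high-end CPU under load
    "gpu": 300,             # a single high-end graphics card
    "motherboard": 50,
    "drives_and_fans": 40,
}

total_watts = sum(components_watts.values())
headroom = 1.3                          # ~30% margin for load spikes
recommended_psu_watts = total_watts * headroom

# Most of the load sits on the +12V rail, so estimate the amperage there too.
amps_12v = total_watts / 12.0

print(f"Estimated draw: {total_watts} W")
print(f"Recommended PSU: ~{recommended_psu_watts:.0f} W")
print(f"Approximate +12V load: {amps_12v:.1f} A")
```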
I get a black screen with this card in Deus Ex: HR and it forces me to reboot the PC! Same with Saints Row 3.
I bought a GeForce GTX 650 and it was the worst video card I’ve ever owned. It crashed left and right with frame errors and NOT DETECTED boots. My last card was a Radeon with zero issues, and I always had to fall back to it when my GTX wanted to take a break.
My next purchase will be another Radeon upgrade. I’m done with NVIDIA.
Fast but BUGGY.
Another thing is games: not all the games out there will let you play with two graphics cards. The 690 is one graphics card with two GPUs, thus faking those games out and allowing you to play them, which is the main reason I bought one, with three monitors set to 1920 x 1080.
There is no 50% performance hit here. There is hardly any performance hit. Frames per second are reported per eye while in 3D mode.
You need to double the reported framerate when in 3D mode, and only then compare it to the 2D framerate. 3D Vision is not much of a framerate killer, if it is one at all.
For example, if your game ran at 120 fps in 2D mode but then reports 60 fps in 3D mode, there is no performance hit at all, because 60 fps per eye equals 120 fps. The game is still actually running at 120 fps, and that is why the fluidity of 60 fps in 3D mode is identical to the fluidity of 120 fps in 2D mode.
Years later, so many people, including “professional review sites,” continue not to understand this, and they therefore keep giving NVIDIA 3D Vision a bad name by falsely claiming that it kills, halves, or seriously drops framerates, which is simply not true and completely shows their misunderstanding of the technology.
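A tiny sketch of the arithmetic described above, assuming the reported figure really is frames per eye (that premise is the commenter’s claim, not something verified here):

```python
# Convert a per-eye framerate reported in 3D mode into an effective total
# framerate, following the commenter's "double it" rule.
def effective_fps(reported_fps: float, per_eye: bool = True) -> float:
    return reported_fps * 2 if per_eye else reported_fps

fps_2d = 120.0                     # example 2D framerate
fps_3d_reported = 60.0             # example figure reported in 3D mode
fps_3d_effective = effective_fps(fps_3d_reported)

# Under this interpretation there is no performance hit: 60 fps per eye
# works out to the same 120 frames rendered per second as in 2D mode.
print(fps_3d_effective == fps_2d)  # True
```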