Legacy GPU Comparison – 9800 GT, GTX 460, GTX 560 Ti
Here is where we expect to see some really interesting information for users who have a 9800 GT, a GTX 460 or even the very popular GTX 560 Ti. How much of a performance advantage does the new GK106-based GeForce GTX 660 truly offer?
Obviously, we had to lower some of our settings so the GeForce 9800 GT could get through our benchmarks at a reasonable pace. We also ran these games in DX9 mode where possible, or DX10 otherwise.
Here is how we'll look at these results: performance delta from the GTX 660 to the 9800 GT, GTX 660 to the GTX 460 and then GTX 660 to the GTX 560 Ti.
9800 GT: 242%
GTX 460 1GB: 69%
GTX 560 Ti 1GB: 28%
If you upgrade from a 9800 GT you will see a 242% increase in performance, which would obviously be a game changer for anyone who dedicates time to gaming on the PC. If you move up from a GTX 460 you'll see a 69% increase in gaming performance, and users of the GTX 560 Ti will see a 28% increase.
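As a quick aside on how to read these numbers (a little sketch of my own, with made-up frame rates rather than our actual results): the deltas are percent increases, so a 242% jump means roughly 3.4x the frame rate of the old card, not 2.42x.

```python
# Illustrative only - the frame rates below are hypothetical, not our test data.
def performance_delta(new_fps: float, old_fps: float) -> float:
    """Percent increase when moving from old_fps to new_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

print(performance_delta(58.0, 17.0))  # ~241% increase, i.e. ~3.4x the frame rate
```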
9800 GT: 218%
GTX 460 1GB: 49%
GTX 560 Ti 1GB: 15%
9800 GT: 228%
GTX 460 1GB: 67%
GTX 560 Ti 1GB: 20%
9800 GT: 187%
GTX 460 1GB: 35%
GTX 560 Ti 1GB: 11%
9800 GT: 199%
GTX 460 1GB: 63%
GTX 560 Ti 1GB: 23%
9800 GT: 225%
GTX 460 1GB: 76%
GTX 560 Ti 1GB: 30%
9800 GT: 224%
GTX 460 1GB: 44%
GTX 560 Ti 1GB: 9%
Okay, so what do we make of all of this? Users who are still gaming on the 9800 GT (and there are a lot of you) will see some very impressive gains by jumping up to the GeForce GTX 660 2GB. The GeForce GTX 460 1GB upgrade path shows an average increase of 60% or so with the move to a GK106-based graphics card. The GTX 560 Ti is more capable, though, so that performance gap is more like 20-25% depending on your specific game.
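If you want to sanity-check those averages, here is a quick script (my own, using only the per-game deltas listed above) that computes the mean gain for each upgrade path.

```python
# Average the per-game performance deltas quoted earlier on this page
# (percent increases when moving up to the GTX 660 2GB).
deltas = {
    "9800 GT":        [242, 218, 228, 187, 199, 225, 224],
    "GTX 460 1GB":    [69, 49, 67, 35, 63, 76, 44],
    "GTX 560 Ti 1GB": [28, 15, 20, 11, 23, 30, 9],
}

for card, values in deltas.items():
    print(f"{card}: average +{sum(values) / len(values):.0f}%")
# 9800 GT: +218%, GTX 460 1GB: +58%, GTX 560 Ti 1GB: +19%
```

Those averages land right around the figures quoted above.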
The power efficiency of Kepler is apparent here in our legacy card power testing – the GTX 660 2GB card uses the same power as the GTX 460 but about 35 watts LESS than the GTX 560 Ti.
Though not really important for the upgrade argument, I included this graph purely out of my own curiosity. Notice how the load temperature of the GPU has increased over the years, from 51C on the reference 9800 GT to 76C for today's GTX 660 release. As process technology has improved and GPU designs have progressed, it seems that both NVIDIA and AMD have allowed GPU temperatures to rise. (Also, the noise on the 9800 GT was INSANE!)
This is the true sweet spot gamers have been looking for; even NVIDIA in its own documents compares it to other greats (9800 GT, etc.).
I also really like that the review includes legacy cards; it gives a really nice comparison, rather than just comparing against what is currently available on the market.
Yup, I noticed that comparison as well. I don't remember exactly how the 9800 GT compares to the 8800 GT (were they exactly the same, just a rebrand?) but the latter was a card that sold like hotcakes :).
Yeah, the 9800 GT was a rebranded 8800 GT; most of the time all that changed was a slight clock increase, and later versions included a die shrink (65nm -> 55nm).
The 8800 GT and its bigger brother, the 8800 GTS 512 MB (which was rebranded to be the 9800 GTX), really should never have been. I think NVIDIA launched them just in time for the holiday season that year, and they were only on the market for about 3 months before they were rebranded. >.< But nevertheless, the 8800 GTs/9800 GTs were insanely popular gaming cards, and rightfully so I think.
I loved my 8800GTS 512! Nvidia did some really confusing stuff back then! Remember this?
8800 GTS 640MB, 8800 GTS 320MB, 8800 GTS 512MB (G92).
The 512MB was way faster than the other two.
Then the 8800 GTS 512MB was rebranded as the 9800 GTX (as mentioned above).
Then rebranded as the 9800 GTX+ (was it 45nm then? Can't remember).
Then rebranded as the GTS 250.
What a trip!
Haha, yes I do remember the fun with GeForce 8 branding/marketing back then.
Yup, there is a lot I could go into about those cards (I've been a computer hardware geek since the GeForce 3 days, and video cards are my fav parts, lol).
Yes, the 9800 GTX was rebranded to be the 9800 GTX+, which was the die shrink (65nm -> 55nm).
45nm was skipped on video cards/GPUs mainly due to area density/manufacturing timing issues.
So that same G92 core went from 8800 GTS 512 MB -> 9800 GTX -> 9800 GTX+ -> GTS 250! How time flies ^.^
Thanks for the trip down memory lane with the 9800GT. Identical to the earlier 8800GT except for a BIOS flash for that fancy 9 Series number.
Best GTX 660 review, period. Thank you for your hard work.
Thanks for your feedback everyone. Let me know if you have any questions or suggestions!
At the risk of asking you guys to do more work ^.^, in these video card reviews, could you include a few more video cards for comparison's sake, rather than only the respective card's immediate competition or siblings?
Say, in this case, adding a GTX 670, GTX 680, HD 7970, HD 7950, GTX 580, GTX 570, etc. Seeing how cards compete across and inside generations is really nice/helpful. And, unless the testbed has changed, you could almost input the benchmarks from previous reviews (though a "quick" re-run of the cards with updated drivers would be great).
I'd be happy to help ;), lol.
No way in hell would Skyrim bring a GPU to 27 fps min at 1680.
fix your fucking test rigs damnit.
It happens during a scene transition – going from a town to the overworld.
Then your "Minimum" result is fundamentally WRONG.
They aren't wrong; they represent the system as a whole anyway.
Are you guys using an older version of Skyrim for the tests?
Skyrim until patch 1.4 (I think) was much worse in terms of CPU performance; since all cards are hitting 27 fps minimum, this can only be caused by something else.
Latest Steam updates…
The transition scenes appear to be capped at 30fps and aren't part of gameplay. Including them in a benchmark for a video card doesn't seem very illuminating.
It will be refreshing when people learn how FRAPS works, and why there are dips in the charting as a result of cut scenes. It will do it on any system. And while, no, it isn't particularly illuminating as benchmark data, it is a necessity that all cards are shown on the same playthrough, with the same cut scenes, and the same corresponding dips. Please get off your high horse.
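For anyone curious what that actually looks like in the data, here is a rough sketch of my own (assuming a FRAPS-style log with one cumulative timestamp in milliseconds per rendered frame, and using hypothetical numbers) of how a 30 fps-capped transition drags the reported minimum down to ~28 fps even when normal gameplay runs far higher.

```python
# Rough sketch, not the site's actual tooling: pull average and minimum FPS
# out of a FRAPS-style frametimes log (cumulative timestamps in ms per frame).
def fps_stats(timestamps_ms):
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    total_seconds = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    average_fps = len(frame_times) / total_seconds
    minimum_fps = 1000.0 / max(frame_times)  # the slowest frame sets the floor
    return average_fps, minimum_fps

# Hypothetical run: ~13 ms frames (~75 fps) of normal gameplay, plus a few
# ~36 ms frames while a 30 fps-capped town-to-overworld transition plays.
frame_times = [13.3] * 200 + [36.0] * 10
timestamps, t = [0.0], 0.0
for ft in frame_times:
    t += ft
    timestamps.append(t)

avg, minimum = fps_stats(timestamps)
print(f"avg: {avg:.0f} fps, min: {minimum:.0f} fps")  # min lands near 28 fps
```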
One thing that I will take issue with, however, is the anti-aliasing settings. I am discouraged that nVidia reviews consistently limit the AA to a paltry 4x, and usually it's MSAA, not FSAA. It is pretty common knowledge that high levels of AA on nVidia cards, particularly at higher resolutions, are a weak point of their design, and they typically show a steeper performance drop-off as the AA level and resolution go higher than ATi cards would under similar conditions.
For the sake of fair and balanced testing, the true maximum settings should be run in the game, as a typical user would select from the start, not cherry-picked AA levels that favor one vendor's hardware and greatly skew expected real-world performance.
If you are going to bother with the new WQHD resolutions, the least you could do is run 8x FSAA.