Power Consumption, Temperatures and Noise
With all the power-efficiency improvement talk that NVIDIA has been pushing since we first learned about the new Kepler GPU, we were eager to see how the actual power consumption turned out.
The GTX 680 is able to outperform the HD 7970 in most of our gaming titles while using less power – 29 watts less, to be exact. Even more interesting, this power consumption "peak" should be about the same in every game you play, if the GPU Boost technology is working like it's supposed to. The Galaxy card's power consumption looks basically identical to the reference design as well.
In terms of temperature the Radeon cards still take the lead, but keep in mind that the GeForce GTX 680 is always balancing performance, power and fan speeds in near real-time, so the 81-83°C level is likely the highest you'll see.
The good news keeps coming for NVIDIA, as the GTX 680 actually runs quieter than the Radeon HD 7970 while also running more efficiently. The new GeForce card is just a great overall product!
Time and time again, reviewers fail to test every feature of Nvidia GPUs. When will they ever benchmark 3D Vision performance, one of Nvidia's high-end features? Sigh…
I was hoping to see Battlefield 3 benchmarks for 3D Vision in single and SLI mode.
We have another piece looking at that, including Surround, coming up next!
awesome, thanks 🙂
Sweet. As a 3D Vision owner I have to say, there are not many sites that measure features like this, and it is always great to see sites covering 3D performance.
In every game in this review there is a picture of the in-game options showing a 1920×1200 resolution, but in the test graphs it's 1920×1080.
What's up with that?
Game testing screenshots were taken before we moved to 1920×1080 testing. Nothing is being hidden.
😀
Still using GPU-Z 0.5.9? lol
http://www.techpowerup.com/downloads/2120/TechPowerUp_GPU-Z_v0.6.0.html
Yeah, testing was done a while back. 🙂
Nice review Ryan. Wish I could afford a better card. I'll stick with my 560 Ti for another year at least :).
I'm not impressed with the performance advantage the GTX 680 has over the GTX 580.
The minimum-FPS performance doesn't warrant buying a GTX 680!
Power consumption is the only reason I'd buy it.
Right, which is what they are going for. Why does somebody need more than a 580? No game maxes that card out yet, so why put something on the market that is an even more unnecessary leap in performance? Why spend a lot more money just to see a number on the screen hop up a few pegs?
What about when the GTX 680 hits $399.99, offers 3+ monitor support out of the box, consumes less power, runs cooler, and outperforms your GTX 580? Also, why did you think buying the GTX 580 was a good deal in the first place? I know I didn't. I bought an HD 6950 2GB and unlocked it to an HD 6970 for $300 or less, and it gets close to GTX 580 performance.
The main thing to remember (or learn, if you didn't know) is that the GTX 680 is based on the GK104 GPU, not the GK110 GPU. The GK110 is a bigger, badder, faster GPU but is not available right this second.
Here's an assumption: if you were Nvidia and you had something maybe 25-40%+ faster than the GTX 680, but the GTX 680 is already about 10-20%+ faster than the competition at a lower price, would you release the big one? If you want to make money, which is what businesses do, you would want to hold back the badass one and make huge margins on the GK104-based card for as long as you can.
Check out the comparison of GK104 Vs GK110.
http://www.legitreviews.com/news/12803/
The 580 does not max every game out, plain and simple. There are a decent number of gamers doing 3D, three monitors, or both at the same time. I'm a big fan of Eyefinity/Nvidia Surround myself, and it was a bummer that the GTX 580 did not support it without two cards. The GTX 680 fixes that and adds more memory – perfect.
Where iz thee SLI performance resultz?!?
http://i.imgur.com/XZkiq.jpg
And I was thinking I was kicking ass with my water-cooled GTX 580. Some things never change.
I went over this website and I believe you have a lot of good info; saved to bookmarks (:.