Power, Temperature, and Overclocking
Power and Temperature
The HD 7970 introduced a new feature called “ZeroCore Power,” which shuts down the majority of the GPU when the screen goes into sleep mode. This allows idle power for the entire video card to sit around 3 watts. TSMC’s 28 nm process also helps: even though this is a larger chip, the finer process with more advanced materials technology lets it pull less power at both idle and load.
With the screen still on, the R7970 has the lowest power draw of the products listed here, even slipping below the HD 6800s at idle. Once loaded we do see some significant changes, but among the Lightning cards the R7970 is still the lowest-power option. With the screen off, the R7970 saves an extra 13 watts at idle.
In terms of heat, the Lightning series has produced some of the best results among high-end cooling solutions. The upgraded Twin Frozr IV cooler certainly helps keep temperatures low. How low? Let’s see.
The other three units are Twin Frozr III products, and the IV seems to give the R7970 a slight edge in the Lightning race. The lower-clocked, partially disabled HD 6950 is the only one that can match the peak temperature of the R7970.
Overclocking
My overclocking experience was initially disappointing. I was only able to get the card to 1090 MHz stable on the GPU. The memory was pretty recalcitrant as well, and it would show issues anywhere above 1400 MHz. Once the new BIOS was in place, the GPU Reactor re-installed, and the card wiggled vigorously in its slot, things improved. I was able to hit 1170 MHz core and 1500 MHz (6000 effective) memory with the GPU voltage up at 1.218 volts. Without voltage increases the GPU was able to go to 1120 MHz stable.
The overclocking was done with the GPU Reactor installed (results were about 15 MHz slower when uninstalled). MSI’s latest Afterburner Beta 14 was used, as it fully supports the new R7970 Lightning.
I was somewhat disappointed in the particular sample that I received. The initial issues could indicate that it is a temperamental and possibly flaky card. Not every card comes out perfect, and though it probably passed all the tests at the factory, usage in a real world environment exposed its flaws. Again, I have seen other examples of this card clock much higher than mine, and not have the same initial problems that I experienced. There is no guarantee when overclocking, and I initially came out on the wrong end of that one. Through some patience and work, I was able to get this card running about where it should be. I think I missed out on some of the potential, but at least it turned into a stable and fast card.
Wow, didn’t even test it against another 7970? Really guys, time to get on the ball.
If you’re looking at this card and the premium it carries, it’s likely you’re buying it for the overclocking headroom, not the base specs.
What I would have thought was more important is this vs. a 680. I can understand not putting it in though. It’s unlikely there was one available for review at the time this was being put together.
Gotta work with what I have. The performance of a standard HD 7970 is not exactly a secret, so I decided to test it against the two previous Lightning cards to really detail what a user gets when upgrading to this particular overclocked monster. In hindsight, it probably would have behooved me to lower the clocks on this card to standard settings and go from there. I will certainly keep that in mind next time I test an overclocked product like this. Also, Ryan is in Kentucky with the standard HD 7970s, and I live in Wyoming. Swapping parts between the two areas is a bit troublesome.
So do you think it really is worth the markup in price?
For the $50 increase in price over a stock card? Yes, absolutely. But you must remember that this is a brand new product, and the GTX 680 is still not out in force. Once that happens, I am sure the pricing dynamics of these cards will change drastically. I am judging this card by what is available today. So yes, at $599 it is a good card. Two months from now, when there are many different examples of not just HD 7970 cards but also GTX 680s… it might not look like such a nice product at that price. I am pretty sure, though, that prices will drop dramatically during that time to keep it competitive with other offerings.
Were you able to take the heatsink off and find out if it is on a reference PCB?
It most certainly is not a reference PCB. A reference board has a 5+1 power phase setup (IIRC), while this one has 17 total phases. If you look at the pictures of the boards from behind, you can see that the PCB is shorter at the front of the card (display outputs), then gets taller after the CrossFire connectors. It is also longer than the reference design. This is a much larger PCB to accommodate the additional power phases, as well as give the necessary room to optimize trace pathways to the different components.
Who cares I just bought a GTX680 Son!
A good buy! I just hope we get to see more available GTX 680 products soon!!!
BUT although the 680 outperforms in a single-monitor application, it would take two 680s to do an Eyefinity-style setup.
Advantage AMD
Nope, the new GTX 680 can output to 4 monitors in total with one card. It will only do 3 monitors in NVIDIA Surround with the 4th being an “accessory monitor” when using 3D applications. So, users no longer require 2 NVIDIA video cards in SLI for more than 2 monitors.
Great article, thanks! I am positive that there will be a 680 version; they would be foolish not to make one. Looking forward to reading about it too.
I also have to agree, the $50 premium for better quality parts on the card is well worth it.
I bought a R6970 Lightning just for the quality parts.
Obviously I haven’t been given a timeline for the eventual GTX 680 Lightning card, but I would expect it to be around 3 months away due to the shortage of chips and the design time for the product as a whole.
You don’t think they got some chips already?
Or are you expecting it will be on the market by then?
GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again. NVIDIA got a couple of complete chip shipments from them, but I think that until manufacturing starts up again, supply is going to be super tight. So tight that guys like Asus, MSI, and others will not have the amount of product on hand to create a second line of non-reference cards.
“GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again.”
I doubt the GTX 680 could have been released worldwide, albeit in short supply, if it was.
Well, the long and short of it is… NVIDIA set a date for release assuming that TSMC would be able to continue to process wafers at a certain rate until that date. TSMC dropped all production after NVIDIA had set the release date. NVIDIA had enough product out to release the card and have some decent numbers in retail, but after that it would be touch and go. I have heard that the beginning of April will have more cards available than at launch, but the big question is availability after that. I guess time will tell, but from what I am hearing availability might be scarce for a while.
Thanks, I wanted someone to face off BF: Bad Company 2 and Battlefield 3 so I could compare them side by side.
lololololo my gtx 570 scores
lololololo my gtx 570 scores a 7135 in 3dmark11 ATI is heading downnnnnnn hilllllllllllllllllllll
Are you mentally impaired? His test bed consisted of an AMD Phenom CPU.
Of course the overall score is going to take a major hit on all cards tested.
Just got my 7970 Lightning in; it BAAARELY fits in my Antec 1200 with three HDDs (had to move those).
It’s a thing of beauty, this one. Kind of sucks Josh had such a horrible experience.
Is there any way to tell if the retail ones that I and others will receive have the updated bios?
I’ll have to check and see. But if your card is working without issue, there is no real reason to flash the BIOS. A video BIOS is typically much simpler than a motherboard BIOS.
Last question,
You said you had some OC issues, what was the ASIC quality of your lightning card?
To find it, you go to GPU-Z, click on the upper left corner, and near the bottom it will have ASIC quality.
I know MSI has said in the past they don’t bin their cards, but it would appear they may for the Lightning. Mine was 82.5% or right around there. The lower the number, the better the OC possibility.
Thanks again man.
So far, benchmarks are:
Alan Wake went from 12.9 to 81 avg FPS with the upgrade.
RUSE went from 30 up to 160.
Insanity. I’ll try to OC it tonight and see how I do. It’s only the second thing I’ve owned that I’ve tried to OC.
62.7%.
Bios is 015.013.000.011.000000 (113-AD40900-X01)
82.4%
Bios: 015.013.000.011.000000 (113-AD40900-X01)
So yeah, same bios. Here is a portion of ASIC and OC results.
http://www.overclock.net/t/1196856/official-amd-radeon-hd-7950-7970-owners-thread
EDIT: After reading through the thread a bit again, there appears to be some sort of drop off between OC and ASIC Quality.
A lower number means higher voltage and higher OC potential, but past a certain point (say below 75% or above 95%) the relationship between OC and voltage drops off heavily.
It will take some more looking into, but that is what I have seen for a bit now.
Remember those special Phenom IIs that were aimed at the LN2 crowd and only like 1000 of them were made? This seems to be along the same line of thought. Leakier, hotter running chips that take super cooling really well. On air cooling, not an impressive overclock… on LN2, the sky is the limit. So yeah, I would imagine my sample might do well under LN2.
It’s weird. I was looking at it this morning. Stuff at around 65%, with 1175 and 1250 voltages. Got up to around 1250–1300, some on air, some on water cooling, but everything was drastically varied.
It just seems odd that these things with the same “quality” don’t have the same characteristics. But then again, when you have 40+% leakage, it is a lot of heat in such a small area. I know for me I have to re-wire my case and move the HDD up to the top to free up the bottom two 1200 intakes for the GPU. Working on that this weekend, but yeah.
Thanks for the insight, man. I’ll throw up something and try to OC for sure.
I was thinking about the BIOS thing. I can compare the BIOS hex to mine and get the version based on that. Can you post or add a GPU-Z screenshot of the MSI 7970 Lightning?
Appreciate it.
Right now I’m at 1180 Core Clock and 1440 on the memory.
If I could add voltage to the memory then I would be able to up that as well, but right now it’s locked. I read a post that said to set PowerTune to 20%, but the memory voltage didn’t change. Perhaps I need to switch to the LN2 BIOS selection?
EDIT: Had to back things off a bit, way too much voltage I’m guessing, but ended with 1175/1435.
http://67.205.124.70/c/3/10dcc710-3441-468b-8f64-749fedcc0d3f.png
There is a CFG setting you have to manually put in with Afterburner to get the memory and PCI-E bus to over volt.
http://forums.overclockersclub.com/index.php?showtopic=182403
Surprised the memory doesn’t go any higher than that for you. Also, set PowerTune to +20%. That essentially increases the amount of available power to the GPU. PowerTune was put in place so that in corner cases like Furmark, the board/chip would not exceed the rated TDP (and shut down).
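As a rough illustration of what that PowerTune slider does: it scales the board power cap that the chip throttles against. The 250 W figure below is a hypothetical stand-in for the card’s limit, not a spec quoted here.

```python
def powertune_limit(base_tdp_watts: float, slider_percent: float) -> float:
    """Effective board power cap after applying the PowerTune slider.

    PowerTune scales the allowed board power by the slider percentage;
    the GPU pulls clocks back when its power estimate would exceed this cap.
    """
    return base_tdp_watts * (1 + slider_percent / 100.0)

# Hypothetical 250 W base limit with the slider at +20%:
print(powertune_limit(250, 20))  # 300.0
```

So a +20% setting just gives the chip 20% more power headroom before it starts throttling, which is why it helps in power-limited overclocking runs.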
Nevermind. I can OC the memory now. In the MSI utility, to the right of the core voltage there is a small arrow; hit that and you can get to the other two voltage settings.
I got it up to 1220/1520. The memory is about done; even 5 MHz more and I get corruption. I don’t know whether to add a ton more voltage (I have slider space for 75 mV). I tried 1240 on the core, but got some severe image corruption.
I think I will try some stability testing, just let it run for an hour or so instead of 5–10 loops on the Metro benches, and see if anything ends up happening in terms of corruption. As far as voltage goes, I’m not sure if there is a “less is more” type approach, or if you simply add more when you reach corruption.
From my Computer Engineering background I know a bit about how ripple and such affects everything, and I’m not sure just how much ripple is being introduced, but, needless to say…
925 -> 1220 = 31.89% OC
1375 -> 1520 = 10.55% OC
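Those percentages check out; the gain is just (new − stock) ÷ stock. A minimal sketch:

```python
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Overclock gain over the stock clock, as a percentage."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100.0

print(round(oc_percent(925, 1220), 2))   # 31.89 (core:   925 -> 1220 MHz)
print(round(oc_percent(1375, 1520), 2))  # 10.55 (memory: 1375 -> 1520 MHz)
```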
Wow, not a bad overclock at all. I had overclocked the one I have to 1100 MHz… but I was using Oblivion to play, so I don’t know if the vid card was causing problems or that rather unstable game was…
I’m betting the game.