Rise of the Tomb Raider and The Witcher 3 at 1080p
Power Testing Results – Rise of the Tomb Raider at 1080p
If Metro: Last Light at 4K was our worst case scenario, I wanted to look at a couple of “normal” cases, the same ones we measured in our initial power story. Starting with Rise of the Tomb Raider at 1080p and the High image quality preset, how do the new 16.7.1 driver changes affect it?
Rise of the Tomb Raider (1080p) power draw, RX 480, 16.6.2 driver
Rise of the Tomb Raider (1080p) power draw, RX 480, 16.7.1 driver
Without enabling the Compatibility mode switch in the driver, power consumption changes look very similar to what we saw in Metro at 4K. The total power consumption is actually a few watts higher with the new driver, hitting 157 watts on average compared to 154 watts with 16.6.2, but the big change is in the source of that power. The PEG slot drops average power draw down to 66-67 watts while the 6-pin picks up the slack and jumps to nearly 90 watts. That new separation between the white and blue lines of power draw is what you should be paying attention to – that demonstrates the new weighting of power phases to the GPU.
Rise of the Tomb Raider (1080p) current draw, RX 480, 16.7.1 driver
Looking at the current readings for each power source with the new 16.7.1 driver, we clearly see that the PEG slot current drops, pulling the same 5.75A in Rise of the Tomb Raider as we saw in Metro: Last Light at 4K. This is significantly closer to the rated 5.5A mark.
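To keep the power and current figures here straight, it helps to remember they are tied together by the rail voltage: at a nominal 12 V, the slot's 5.5 A +12V allowance works out to about 66 W. A minimal sketch of that conversion, assuming a flat 12.0 V rail (the real rail sags slightly under load):

```python
# Minimal sketch: convert between the +12V current and power figures above,
# assuming a nominal 12.0 V rail (the real rail typically sags a little under load).
RAIL_V = 12.0           # assumed nominal +12V rail voltage
PEG_12V_LIMIT_A = 5.5   # +12V allowance on the slot for a 75 W card

def amps_to_watts(amps, volts=RAIL_V):
    return amps * volts

def watts_to_amps(watts, volts=RAIL_V):
    return watts / volts

print(amps_to_watts(PEG_12V_LIMIT_A))  # 66.0 -> the +12V share of the 75 W slot budget
print(watts_to_amps(72.0))             # 6.0  -> the 16.6.2 slot power sits clearly above it
print(watts_to_amps(66.0))             # 5.5  -> the 16.7.1 slot power lands right at the limit
```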
What happens if we enable that Compatibility Mode switch?
Rise of the Tomb Raider (1080p) power draw, RX 480, 16.7.1 driver (Compatibility ON)
Flipping the compatibility mode switch to ON, we see an overall drop in power consumption; we are now under 150 watts on average! This also marks the first time we see the power draw from the PCI Express slot under 65 watts. 6-pin power consumption is still higher than 75 watts (hitting 84-85 watts), but that is still a much more reasonable level considering the build-out of the hardware.
Rise of the Tomb Raider (1080p) current draw, RX 480, 16.7.1 driver (Compatibility ON)
With this mode enabled, current draw on the PEG slot +12V line is now under 5.5A (!!) for the first time. The 6-pin connection is pulling over 7A, though with each of the two +12V connections rated at 8A (a physical connection rating, not a PCI Express rating), I feel more than comfortable with that swap.
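As a back-of-envelope illustration of that headroom (using only the figures quoted above: two +12V connections at roughly 8 A each versus about 7 A measured):

```python
# Back-of-envelope headroom check for the 6-pin feed, using the figures quoted
# in the text above: two +12V connections rated ~8 A each vs. ~7 A measured.
PINS = 2                # +12V connections in the 6-pin, per the text
PIN_RATING_A = 8.0      # physical rating per connection, per the text
measured_a = 7.0        # ~7 A measured with Compatibility mode on

capacity_a = PINS * PIN_RATING_A           # 16 A of physical capacity
utilization = measured_a / capacity_a      # ~0.44
print(f"{measured_a:.0f} A of {capacity_a:.0f} A -> {utilization:.0%} of the physical rating")
```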
| Rise of the Tomb Raider (1080p) | 16.6.2 | 16.7.1 | 16.7.1 (Compat ON) |
|---|---|---|---|
| MB Slot (PEG) Power | 72 W | 66 W | 63 W |
| MB Slot (PEG) Current | 6.2 A | 5.7 A | 5.4 A |
| 6-pin Power | 76 W | 89 W | 84 W |
| 6-pin Current | 6.4 A | 7.6 A | 7.0 A |
Power Testing Results – The Witcher 3 at 1080p
My final data set for power consumption today comes from The Witcher 3, still at the more standard resolution of 1080p. Image quality settings are set to Ultra (with HairWorks disabled throughout).
The Witcher 3 (1080p) power draw, RX 480, 16.6.2 driver
The Witcher 3 (1080p) power draw, RX 480, 16.7.1 driver
As with the Rise of the Tomb Raider results above, I see just a few watts more TOTAL power consumption with the new driver, up to 160 watts from 155 watts. But the split of power draw between the two sources continues to improve, with the PEG slot pulling less than 70 watts on the +12V line. The 6-pin connection increases its power draw to 90+ watts.
The Witcher 3 (1080p) current draw, RX 480, 16.7.1 driver
Current draw in The Witcher 3 looks very similar to Rise of the Tomb Raider as well: the PCI Express slot is now drawing just under 6A.
What happens if we enable that Compatibility Mode switch?
The Witcher 3 (1080p) power draw, RX 480, 16.7.1 driver (Compatibility ON)
With the compatibility mode enabled we see total power draw in The Witcher 3 fall to 152 watts total and 64 watts from the PCI Express slot itself. This marks another case where the power draw is now under the recommended limit from the PCI-SIG.
The Witcher 3 (1080p) current draw, RX 480, 16.7.1 driver (Compatibility ON)
Finally, looking at the added benefit of the compatibility mode on current draw, the PCI Express slot is pulling right at 5.5A, with the 6-pin connection hitting 7.2A or so.
| The Witcher 3 (1080p) | 16.6.2 | 16.7.1 | 16.7.1 (Compat ON) |
|---|---|---|---|
| MB Slot (PEG) Power | 74 W | 68 W | 64 W |
| MB Slot (PEG) Current | 6.5 A | 5.9 A | 5.6 A |
| 6-pin Power | 76 W | 90 W | 85 W |
| 6-pin Current | 6.4 A | 7.7 A | 7.2 A |
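As an illustrative sanity check (a minimal sketch, not the measurement tooling used for these results), the average slot (+12V) current from the two tables can be compared against the 5.5 A allowance like this:

```python
# Illustrative sanity check only: compare each configuration's average slot
# (+12V) current from the two tables above against the 5.5 A allowance.
PEG_12V_LIMIT_A = 5.5

slot_current_a = {
    "RotTR 16.6.2": 6.2, "RotTR 16.7.1": 5.7, "RotTR 16.7.1 Compat": 5.4,
    "TW3 16.6.2": 6.5,   "TW3 16.7.1": 5.9,   "TW3 16.7.1 Compat": 5.6,
}

for config, amps in slot_current_a.items():
    status = "within" if amps <= PEG_12V_LIMIT_A else "above"
    print(f"{config}: {amps:.1f} A ({status} the 5.5 A allowance)")
```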
great investigation by pcper
great response by AMD
this is how it should be
Exactly. Just imagine how much easier it would be to do our jobs if more than half of the comments were not accusations by fanboys. We reported on a thing. They fixed a thing. Clearly it was a thing. End of story.
When you stop the bias toward Nvidia, maybe the accusations will stop also.
Still the accusations of Nvidia bias. You guys are one of the least biased sites in my opinion. Keep up the good work. I guess since I prefer Nvidia my opinion is biased. I don’t hate AMD like AMD fanboys hate Nvidia. Oh well.
Keep your chin up Allyn. Remember that it isn't the sword that kills the trolls, it's knowledge that does it. Unshakable truth makes the trolls wither. To be honest you'll probably have to do PEG analysis of every card from now on to keep fanboys from viewing you as biased. When the 1060 comes out, they're going to demand it anyway.
An Nvidia user saying you're not biased, so it must be true.
We were doing it for every card anyway, but didn't focus on it specifically in the 480 review (but remember, we are supposedly just so set on making them look bad - oh noes!!). We'll keep doing what we were doing.
You are correct Allyn. Being a new user I didn't know this methodology was already your staple. I went back and saw that you used it for the 1080 review. The 1080 being 2x more efficient in fps/watt than a Fury X. Wow.
Good to know that you are looking out for everyone’s hardware and helped head off a potentially costly result of using Rx 480.
Did you guys get many thanks for averting “disaster”? Nope. You got called Nvidia biased, fanboys and trolls. Some people can’t see the big picture. Thanks guys at PC Per for saving AMD users’ hardware.
No one should have to replace their motherboard unless they want to. AMD fanboys would justify it away by saying their stuff was old and needed upgrading anyway.
You guys must be mind readers to have put this type of testing into use well before Rx 480. Just because you knew AMD was going to violate the spec. LOL
Keep up the good work. I know you guys will regardless.
Tldr:kissing pcper’s and nvidia’s @$$.
Too long didn’t read but yet you knew the gist of it. OK.
You don’t want to know the truth because it hurts.
Short enough for you to comprehend.
Wow, you know what tldr means (you had to spell it out). And here I was thinking most Nvidia users are morons. /clap
I won't comment on your intelligence. If you're an AMD fanboy, that says plenty. Intelligent people buy the best they can afford.
AMD fanboys blindly support a company that releases the same old wattage-guzzling, heat-blasting 4-year-old tech (asynchronous compute and GCN) video cards and calls them new. LOL Doesn't sound smart to me.
The motherboard PEG slot isn't maintaining a stable 12V. I see the problem in the motherboard; if the motherboard held 12V stable, the current would drop to 5.5A.
Just look at the difference between the PSU and motherboard voltages under load. Can anyone test with another 6-pin connector adapted to the graphics card's PEG slot power?
Sorry, I know my English sucks. =/
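The arithmetic behind that point: for a fixed wattage, the current rises as the rail voltage sags. A small sketch with illustrative, not measured, rail voltages:

```python
# For a fixed wattage, current rises as the rail voltage sags.
# Voltages here are illustrative, not measured values.
slot_power_w = 66.0  # roughly the post-fix slot draw reported in the article

for rail_v in (12.0, 11.8, 11.6, 11.4):
    print(f"{slot_power_w:.0f} W at {rail_v:.1f} V -> {slot_power_w / rail_v:.2f} A")
# A true 12.0 V puts 66 W at exactly 5.50 A; an 11.6 V rail pushes it to ~5.69 A.
```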
One more driver and the RX 480 will be sweet in power draw and performance!
I just built a brand new system. And I am excited about what I have received and built so far.
Intel i5 6600K 4.8GHz “Silicon lottery”
Mini ITX Z-170 board
RAID 0 SSDs
DDR4 3000 CAS 14
I built my system to be powerful! Normally I buy Nvidia cards, and there are reference cards that overclock very well! I used to own an OEM GTX 660; its default clock was 864 MHz core, and it had 1152 CUDA cores. I overclocked the thing to 1,641 MHz core and 8800 MHz memory! It was faster than a GTX 770 at the time. I still have screenshots of it.
Now I have been building computers since the Nvidia 6800 Ultra was a beast, and the ATI 9800 XT All-In-Wonder too. Hearing those models brings back memories.
I get my AMD RX 480 8GB tomorrow! I am super excited. I'm not concerned with power draw; as far as I know the 6-pin is wired exactly like an 8-pin, with the same number of 12V conductors in the connector. So I do not think power is a limiting factor, as there is no communication between the 6-pin and the PCI-E power draw, and there is also not a functioning ground on the RX 480 that even tells it a 6-pin is plugged in! The card just has a hard time getting over 1,400 MHz.
I do not think there are a lot of AMD cards that reach over 1,400 MHz anyway.
Either way, I run an Intel CPU that I love, and these RX 480 cards are just phenomenal! In DX12 they just smash a GTX 980! And they have the 8GB of VRAM that we require in these games.
It is a great card for $200-$240. BF1 is about to come out and it utilizes DX12, so this was a big factor for me in buying this video card. Enjoy maxing it out with an RX 480 8GB.
The GTX 1070 and GTX 1080 are very fast! Ridiculous POWER. And they overclock even further to make them faster.
The GTX 1070 has proven to maintain about 42+ fps at 4K.
The GTX 1080 is the first video card in our history to give you smooth 60+ fps 4K gaming frame rates. And once overclocked it's even better. The thing is a beast! But, you pay for it.
If I had the $400 I would have bought a GTX 1070 instead of an RX 480, but I only had about $250 left in my build budget. And seeing the DX12 performance with async compute makes me feel even better that I did.
But I'm not gonna talk junk about the 1070 or 1080. Hell, I wish I had a GTX 1080.
But Nvidia is all about money. They want you to buy their newest product, and once the next-gen cards are out, they do not care about you anymore.
If you can afford to buy top end cards every 2 years then do it.
But I'm only playing at 1080p right now and an RX 480 is too much for that.
Nvidia is selfish though, I must say; the GTX 1060 is only 3GB for $250? 3GB is not enough!! And $300 for the 8GB 1060? Better off buying a GTX 1070. They were supposed to be $379.99, but they have sold for $450 for so long now that $400+ has become the normal MSRP.
Off-topic but… 1,641 MHz core and 8800 MHz memory on an OEM GTX 660 is very hard to believe. People rarely manage to squeeze 1300 MHz core out of these cards. You say you still have a screenshot, so I say pics or it didn't happen!
Good write-up. The RX 480 should serve you well for 1080p. 8 gigs is overkill though.
http://www.kitguru.net/components/ryan-martin/pny-gtx-950-2gb-and-gtx-960-4gb-xlr8-oc-gaming-review/6/
With weaker video cards 2GB is more than enough. 3GB à la the 1060 should be sufficient, especially if it is GDDR5X like rumored. 4GB is usable in a select few games like AC Unity at 1080p. There is very little difference in benches for the 950 2GB vs the 960 4GB. The 960 is the better overall card with double the RAM but doesn't outperform the 950 by much.
In my opinion 8GB on the RX 480 is a waste of chips for 1080p. It's only a selling point for future-proofing. 4GB is plenty.
There’s no 3GB GTX 1060. That was a false rumour and most likely it will be a GTX 1050.
So Ryan, while the fix of giving more time to 3 of the power phases does alleviate the problem, it also has the disadvantage of unbalancing the use of those 6 power phases.
So you end up with 3 power phases being used more, with more power going through them, and that can't be good.
Designing from scratch you would know you are drawing more from the 6-pin and would have 4 power phases wired into that connector, leaving the other two for the lower-use PCIe slot.
In conclusion, if I were in the market for this card, I would not want to end up with one of these first ones. Even if the "lower level" fix the partner cards did was to reassign the power phases, I'd still be concerned about the power draw.
On top of that, you have a manufacturer totally misleading about the TDP of this card, a TDP number that WILL be used in the future to compare cards, and used in graphs from review sites etc., further misleading people about what they are actually getting.
That doesn't sit well with me. Sorry AMD, not this time… see you again next year.
The higher draw from the three 6-pin phases is still well within what those components can safely handle.
You can sue AMD for posting a misleading TDP. By the way, Nvidia did that too with their GTX 970.
Seems that you built your system without any power allowance?!
I was leaning initially towards the Nvidia 1060 but the AMD RX 480 seems quite a good deal.
Does the Radeon RX 480 support ZeroCore?
If so, will ZeroCore shut down the RX 480 when I switch my primary graphics to the Intel IGP in the BIOS?
The GNC architecture in general supports Zerocore.
It's GCN not GNC "Supplements" ehehehe.. yeap.. the GCN architecture supports Zero Core technology.
Best quote I’ve read:
Much ado about nothing.
Honestly, fix? To remain within spec the card would need to be clock-locked. The FIX would be an 8-pin connector, a 4 & 6-pin connector, two 6-pin connectors, a 6-pin and 8-pin connector, or two 8-pin connectors….
The solution to flawed hardware like this AMD card is to avoid it like the plague, and if you must buy a video card now get a 1060, 1070 or 1080 and let AMD chips go in set-top boxes. Mind you, I have AMDs in this PC now and I almost picked up 2 of these ill-fated cards. Thanks to PCPER I indeed see the light.
So did a few poor friends of mine. They're waiting on the full reviews of the GTX 1060, but they overclock everything so they will definitely go for it, as you can overclock to 2.1GHz according to some leaked info and the overclock.net forums.
No need. The proper fix is to move the current 50-50 power distribution (3 phases for the PCIe slot and 3 phases for the 6-pin) more toward the PCIe power connector.
There is a hardware design "flaw" with the distribution: it doesn't matter what connector you slap on it, even 3x 8-pin connectors would still be split 50-50. AMD's fix is routing power distribution towards the 6-pin connector; luckily for them the VRM is very good and solid, and the IR controller used can be programmed to shift the power distribution more to the 3 phases that the 6-pin connector feeds (the max is a 30-70 distribution), and they will easily withstand it.
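Put into numbers, the redistribution described above would look roughly like this; a minimal sketch assuming a nominal 12 V rail and an illustrative 150 W of total +12V board power (the 50/50 and 30/70 splits come from the comment, the middle value is just an intermediate point):

```python
# Sketch of the phase-weighting idea described above: the same total +12V power
# split between the PEG slot and the 6-pin under different distributions.
# The 50/50 and 30/70 splits come from the comment; the total is illustrative.
RAIL_V = 12.0
total_w = 150.0  # illustrative total +12V board power

for slot_share in (0.50, 0.43, 0.30):
    slot_w = total_w * slot_share
    sixpin_w = total_w - slot_w
    print(f"slot {slot_share:.0%}: {slot_w:.0f} W ({slot_w / RAIL_V:.1f} A) | "
          f"6-pin {1 - slot_share:.0%}: {sixpin_w:.0f} W ({sixpin_w / RAIL_V:.1f} A)")
```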
Man, I have to say the comments here from Nvidia and AMD fans make me sad. The conspiracy theories and the hate for either Nvidia or AMD are such childish nonsense.
First I'll start off by saying the RX 480 is an amazing card for the money.
Second, since I'm a tech nerd I like to know which company has the better architecture and why, and in some ways they can be better at some things and worse at others.
The RX 480 performs like a 970 out of the box but has lower overclocking ability and uses the same amount of power despite being on 14nm.
This makes me come to the conclusion that Polaris is probably at the level of Maxwell in terms of performance per watt. Therefore Nvidia is still one step ahead.
Again, however, AMD has the price/performance thing hands down, but having that alone has never gotten them that far.
Must have design wins and that will be better and at that they got consoles but that’s about it.
This site lacks an edit feature
Must have design wins and things will be better and at that they got consoles but that’s about it.
If you’re registered and verified, you can edit.
I thought it would be about a 5% loss in real performance, but 3% max means pretty much nothing, and newer drivers will probably take that lost performance back. AMD did a great job fixing this, if it even was a problem at all. You should enable compatibility mode just to lower the TDP for lower noise with almost 0 loss in performance.
Now since this has been fixed it means no more whining about RX 480 power issues.
Now this is a small step for AMD, but a great leap for the gaming community, since AMD actually fixed a hardware problem unlike NVIDIA. I'm talking to you, GTX 970.
Now you can hate me for what I said, but the facts are now out.
Yeah, no, it's still not fixed. My only option is to buy a power supply that has more, but I honestly don't think the driver did anything for my PC. And if it didn't, it's spiking somewhere near 250W. Not acceptable.
http://pcpartpicker.com/list/bnq8KZ
Why do you have a link in the article that links back to the article, so it's linking to itself?
lol
Seems like Ryan is still not impressed?! lol!
Hurry up custom 480s, I want one.
Has anyone actually got this working on Win 8.1? Loads can't, and it seems AMD forgot 8.1 support on their website and in the driver?
Don’t need a new videocard, need a new AMD motherboard chipset that supports all the new fastest ports and stuff. I want to build an ITX system with their desktop CPU rather than an APU. Nvidia failed hard last time they tried to release a motherboard chipset, they’re a one-trick pony.
Google search for an AM3 motherboard with USB 3.1 and look for one in your price range.
So I got a Sapphire RX 480 4GB ($199 on Amazon) and I'm playing with undervolting.
I think AMD botched the design in terms of the voltage levels required.
Also, the AMD fan profile is way, way too slow to adapt to temperature.
When I start FurMark, it starts at 1205MHz.. then starts to drop after 30 seconds… as low as 875MHz… then slowly ramps back up.
It takes 1 full minute for the fan to reach its target speed to lower the temp and for FurMark to be rock steady again at 1205MHz.
The stock cooler is not so bad. It does reach 3200 rpm in FurMark to keep 1205MHz… but in games so far it's not even going to 2200 rpm.
What I don't get is that I can lower the voltage much lower than 940mV WHILE FurMark is running (and I see the fan RPM drop), but starting the computer with that voltage crashes the driver.
It seems to me Polaris 10 could run at 1.2GHz and 910mV if the whole chip was balanced right, because all the units seem to be happy at 1.2GHz and 910mV when stressed out.
So the FurMark 1.17 score is 4486 for the P1080 benchmark.
I haven't checked, but gaming at 940mV vs 1150mV might be a HUGE power saving. I'm glad the card doesn't get hot at all or loud with the stock cooler.
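A rough way to gauge that saving: at a fixed clock, dynamic power scales roughly with the square of the voltage, so 940 mV versus 1150 mV implies on the order of a third less dynamic GPU power. This is a back-of-envelope approximation that ignores leakage and any clock or boost changes:

```python
# Back-of-envelope only: at a fixed clock, dynamic power scales roughly with
# voltage squared, ignoring leakage and any boost/clock changes.
v_stock, v_undervolt = 1.150, 0.940  # volts, from the comment above

scale = (v_undervolt / v_stock) ** 2
print(f"Dynamic power scale factor: {scale:.2f} (~{1 - scale:.0%} lower at the same clock)")
```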
I still think of the possibility that the Radeon RX 480 was built for lower clock frequencies, but performance in the end fell short of what was expected from it at a 150W power draw. Whether this was down to the new 14nm process or the 4th-gen GCN, I don't know. But I think problems popped up at a point where it was too late to change the board layout to 8-pin. So the card was pushed up to 1266 MHz to overcome GTX 970 performance while the PCIe specs were left aside. With the new driver, AMD seems to have found a workaround for the initial performance shortfall/higher power draw. Late, but good. I think the card finally is a lot nearer to what was expected from the launch product. Nice!
Bryan at TechYesCity suggested that because this is a driver fix, it won't take effect until after the card tries to boot, and that some older motherboards simply won't boot with the RX 480.