Detailed Power Consumption Testing
Power consumption on mid-range cards became a much more important part of our testing suite after the issues surrounding the AMD Radeon RX 480. Let's see how the ASUS-engineered GTX 1060 Turbo handles the pressure.
At 1080p with Rise of the Tomb Raider, the GTX 1060 Turbo from ASUS uses essentially the same total power as the Founders Edition card, which makes sense. Both are running at the same speeds, and even though the PCB designs aren't identical, they are likely very close in design and integration. The blue line averages around 115 watts, with just a quick spike over 120 watts.
Under The Witcher 3, though, things look different: the ASUS GTX 1060 Turbo is pulling more power than the Founders Edition, though it still stays under the 120 watt total TDP.
Next up we'll take a look at Metro: Last Light running at 4K – a worst case testing scenario for the GTX 1060.
To look at power and current draw from each source (the 6-pin external connection and the PCI Express bus on the motherboard), we look at the direct output from our inline power measurement system. What you'll see in the first graph is that power draw from the 6-pin connection and the motherboard slot are very similar, split more evenly than I would have liked to see from ASUS. The motherboard slot is only pulling 60 watts at peak, with the 6-pin connection hitting 64 watts or so. In terms of current capacity, the motherboard slot stays JUST under the 5.5A specification limit.
When overclocked, the ASUS GTX 1060 Turbo does go over spec on the motherboard slot. We now see power draw hitting as high as 72 watts while current peaks at ~6A. That breaks spec, yes, but since we have pushed the power target on the card to its maximum and increased clocks by a +240 MHz offset, we'll give both ASUS and NVIDIA the benefit of the doubt. We extended the same leniency to the RX 480 when it broke spec at overclocked settings after all software fixes were applied.
It's interesting to see how evenly the power draw is split between the PCIe motherboard slot and the 6-pin connection on this GTX 1060. This isn't how NVIDIA and its partners have usually designed cards. Still, likely thanks to the 120 watt TDP target, nothing goes out of spec when running at stock settings.
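As a rough illustration of the spec check described above, here is a minimal sketch that flags measurement samples exceeding the PCIe limits. The limit values (5.5A on the slot's 12V rail, 75W on the 6-pin connector) come from the PCIe CEM specification; the sample data is hypothetical and does not reflect our actual measurement logs.

```python
# Sketch: flag logged power/current samples that exceed PCIe limits.
# Limits are from the PCIe CEM spec; the sample values below are
# hypothetical, not our actual measurement data.

PCIE_SLOT_CURRENT_LIMIT_A = 5.5   # 12 V rail through the motherboard slot
SIX_PIN_POWER_LIMIT_W = 75.0      # 6-pin auxiliary connector rating

def check_samples(slot_amps, six_pin_watts):
    """Return the samples from each source that exceed its limit."""
    slot_violations = [a for a in slot_amps if a > PCIE_SLOT_CURRENT_LIMIT_A]
    aux_violations = [w for w in six_pin_watts if w > SIX_PIN_POWER_LIMIT_W]
    return slot_violations, aux_violations

# Stock settings: slot peaks near 5.0 A (~60 W), 6-pin around 64 W.
print(check_samples([4.6, 4.9, 5.0], [60.0, 64.0]))  # ([], [])

# Overclocked: slot current peaks near 6 A (~72 W), over the 5.5 A limit.
print(check_samples([5.4, 5.8, 6.0], [68.0, 71.0]))  # ([5.8, 6.0], [])
```

Note that the 6-pin connector never goes over its 75W rating in either case; only the slot's current limit is exceeded when overclocked, which matches what the graphs show.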
“ASUS has gone with a single DL DVI connection, two DisplayPort and two full-size HDMI. The idea here is that users can keep an HDMI monitor connected while also connecting an HDMI VR headset like the Rift.”
Ugh. A DP++ port is also natively a HDMI port using a passive (i.e. it does nothing more than rearrange the pins) adapter. By replacing a DP++ port with a HDMI port, you gain NOTHING but lose a DP port. It’s moronic.
I’m more annoyed about the lack of DVI-I, because for higher resolutions, there simply are no active adapters for VGA.
And as an added bonus, you block a large portion of the exhaust vent!
Someone once said that DP-to-DVI adapters only give you a single link DVI port, not a dual link. I’m not sure if that’s still a restriction. That’s the only reasonable reason I could see to keep a dedicated port around.
IIRC, DP to DL-DVI adapters do exist (I think apple used them), but they were very expensive (around $100)
Great article Ryan!
Any chance you can take the shroud off of that thing so we can see the heatsink? I’m wondering how this card got up to 80c in the temperature test while the FE only hit 73c after extended testing. Did Asus cheap out on the heatsink?
I guess not? Do they not let you take the card apart?
I am surprised not to see Project Cars in the list of games the card was tested with.
I’d like to see PCPER test Project Cars too.
Why in particular?
because you guys are nVidia shills.
you don’t even make the point that this card will fail hard in all upcoming DX12 games
shame on you
DX:MD
Oh, come on. You didn’t start benchmarking graphics cards yesterday.
I think it’s the best looking racing game on PC and runs across a wide variety of hardware. It’s available for VR, too.
Will the fan be idle under 0 load or is it still spinning?
It still spins, but very slowly and pretty much silently.
When my machine idles, the ONLY thing I hear is the gtx 1060 turbo. I wouldn’t call this a silent card. If you have other fans running at more than 1000rpm, you won’t hear it anymore of course.
Could you do all graphs with a white background? The ones with the black background were illegible. ;( Please?
Hi Ryan (or any other staff member)! Is this a good card for 1080p gaming for the next 3 years? Or should i save for the 1070?
you should buy the RX 480 – nVidia Pascal will fail hard in all new DX12 games
Why?
And you mean all 10 of them that will be released in the next 5 years? Lol i just built a new pc with a 1070 and windows 8.1 because i dont want the spyware in 10. Fuck dx12.
I agree about dx 12 unless they offer it for win 8.1. It might wither away like dx 10 did. That’s why AMD is buying up every new game and adding dx 12 support coded solely their way with async. Over a year and what no designed from the ground up games yet. There are only badly coded ones added with patches.
Dx 12 is a failure so far. They need to start work on dx 13 that shouldn’t be biased like the last 3 were for AMD. DX 10,11, and 12. Solely because of Xbox console.
Did I see the same graphs? This card is clearly better than a 480 even on TR dx12
“nVidia Pascal will fail hard” – NONSENSE. They’re the most high end cards on the market. My 1080Ti does not “fail hard”. LMAO.
Nvidia cards of this price range are NEVER made to last 3 years. If you don’t mind lowering settings sooner rather than later, this card will be OK. Don’t even think about the 3GB version just to save some money.
Now, because you are talking about 3 years, I would suggest the GTX 1070. It is overkill today for 1080p, but you will never have to go to settings and lower any of them for that period of time. And if in 2-3 years you start thinking of a higher resolution monitor, the GTX 1070 will not be a reason to stay at 1080p.
Never, you say?
I’m using a 4-year-old EVGA GTX660 (2GB) card to drive a monitor with native 1080p resolution, and I paid around $240 for it new.
It runs a STEP:Core installation of Skyrim Legendary at an average 58fps framerate. You could make the argument that it’s an old title, but you’d be dodging the point: it’s a DX11 game (I run Win7x64) and with the mods, it pushes video HW as hard as anything.
Interesting choice of games. I wonder how the Sapphire Nitro+ 480, the non-reference card you have had for review for some time now, would compare? It doesn’t throttle and is cool and relatively quiet.
No hitman, no Doom, no Gears of War. Another biased article. How much was the bung from nvidia this time?
Page four, paragraph seven. Read the 1060 Founders Edition for those game results.
“So, as a result, our testing suite has been upgraded with a brand new collection of games and tests. Included in this review are the following:
3DMark Fire Strike Extreme and Ultra
Unigine Heaven 4.0
Dirt Rally (DX11)
Fallout 4 (DX11)
Gears of War Ultimate Edition (DX12/UWP)
Grand Theft Auto V (DX11)
Hitman (DX12)
Rise of the Tomb Raider (DX12)
The Witcher 3 (DX11)”
What, where, when.
Where is Doom, Hitman, Gears of War, fallout 4 and BF4 in this review?
Yes, they skipped the bad ones.
Are we still waiting for official Nvidia Vulkan support for Doom? I feel like the test suite really needs a fast-paced FPS given:
1) How popular that genre is
2) Aren’t fast FPS games the genre where smooth visuals matter the most?
I feel like Hitman could be dropped in favor of a FPS and no one would care.
The problem with that is that fast-paced FPS’s hardly exist anymore. Compared to the original Doom, playing Doom 4 feels damn empty, even on the highest difficulty.
Typical AMD fanboy responses. They won’t be happy unless all the benchmarks used are from AMD Gaming Evilved games.
It is a review of an Nvidia video card. One has to know how it will perform on GameWorks games or neutral games. Mostly not AMD games, which they probably have little interest in buying because of poor coding for Nvidia cards.
‘you get reference level performance at the lowest available price and you still get the promises of quality’
It’s all well and good as long as you don’t mind having a friggin’ leaf blower in ur rig…lol