Clock Speed Consistency and Overclocking
With very little changing technically between the ASUS GTX 1060 Turbo and the Founders Edition, other aspects of the card need to be checked. First up: does the cooler on the ASUS card get the job done? Can it keep the GPU at reasonable temperatures and allow it to run at the expected clock speeds?
ASUS includes a utility specific to its GTX 10-series graphics cards that lets the user select between three modes in which the card can run: Gaming, OC and Silent. Moving between them changes the default base and boost clock speeds, but very little else.
Gaming mode (Default)
OC Mode
Silent Mode
A quick glance at GPU-Z while alternating between these modes shows only one change being made: the clock speed shifts by 20 MHz. That's barely more than a 1% change, and honestly it won't make a noticeable impact on either your gaming performance or the noise levels the card produces. Only worry about these modes if you want the easiest possible way to get a smidge of additional performance out of your new card.
If you really want to dive into the world of overclocking, and you probably will, grab the latest version of GPU Tweak (I was using v1.3.2.2) and head to the custom section, where you can access controls for the GPU clock offset, memory clock offset, fan speeds, power target and more.
Running at stock settings, under the Gaming profile to be precise, the GTX 1060 Turbo from ASUS was able to stabilize at a clock speed of 1815 MHz after 10 minutes of Unigine Heaven looping. That's a great deal higher than the 1709 MHz "typical" rated Boost clock on the GTX 1060. The GPU itself sits comfortably at 79C, well within the boundary to prevent any kind of clock degradation or throttling.
After doing some overclocking, which is simple with the GPU Boost options built into the NVIDIA GeForce infrastructure, I was able to push the ASUS GTX 1060 Turbo up to a +241 MHz offset. That gives us a base clock of 1747 MHz and a Boost clock of 1950 MHz. That's a 16% increase in GPU clock with very little effort on my part!
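To see how those figures fit together: the offset set in GPU Tweak is simply added on top of the card's reference clocks (the base clock works out to 1506 MHz given the 1747 MHz result, and the article's 1709 MHz typical Boost gives 1950 MHz), and the quoted 16% gain is measured against the base clock. A quick sketch of the arithmetic:

```python
# Arithmetic behind the overclock figures quoted above.
# Reference clocks are back-derived from the article's numbers:
# 1747 - 241 = 1506 MHz base, and 1709 MHz is the rated typical Boost.

BASE_CLOCK_MHZ = 1506
BOOST_CLOCK_MHZ = 1709
OFFSET_MHZ = 241  # the stable offset found in GPU Tweak

oc_base = BASE_CLOCK_MHZ + OFFSET_MHZ     # overclocked base clock
oc_boost = BOOST_CLOCK_MHZ + OFFSET_MHZ   # overclocked Boost clock
gain_pct = 100 * OFFSET_MHZ / BASE_CLOCK_MHZ  # gain vs. base clock

print(f"Overclocked base:  {oc_base} MHz")    # 1747 MHz
print(f"Overclocked boost: {oc_boost} MHz")   # 1950 MHz
print(f"Clock increase:    {gain_pct:.0f}%")  # 16%
```

Note that GPU Boost 3.0 will opportunistically run well above the rated Boost clock when thermal and power headroom allow, which is why the observed average clock (over 2080 MHz) exceeds even the offset Boost figure.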
The net result is an average clock rate that settles in over 2080 MHz, while the GPU temperature rises to 80-81C under a full load. Obviously the fan spins a bit faster here, which increases the card's noise output, but based on my sound level meter it only went up by 0.2 dBA.
No matter how you look at it, these are impressive stock and overclocked results. The ASUS GeForce GTX 1060 Turbo 6GB card is able to provide a high level of gaming performance and overclockability despite its lower cost and what many consider a cut-rate cooler. It looks like it's going to be harder and harder for custom-cooled add-in cards to really stand out.
“ASUS has gone with a single DL DVI connection, two DisplayPort and two full-size HDMI. The idea here is that users can keep an HDMI monitor connected while also connecting an HDMI VR headset like the Rift.”
Ugh. A DP++ port is also natively an HDMI port using a passive adapter (i.e. one that does nothing more than rearrange the pins). By replacing a DP++ port with an HDMI port, you gain NOTHING but lose a DP port. It’s moronic.
I’m more annoyed about the lack of DVI-I, because for higher resolutions, there simply are no active adapters for VGA.
And as an added bonus, you block a large portion of the exhaust vent!
Someone once said that DP-to-DVI adapters only give you a single link DVI port, not a dual link. I’m not sure if that’s still a restriction. That’s the only reasonable reason I could see to keep a dedicated port around.
IIRC, DP to DL-DVI adapters do exist (I think Apple used them), but they were very expensive (around $100).
Great article Ryan!
Any chance you can take the shroud off of that thing so we can see the heatsink? I’m wondering how this card got up to 80C in the temperature test while the FE only hit 73C after extended testing. Did ASUS cheap out on the heatsink?
I guess not? Do they not let you take the card apart?
I am surprised not to see Project Cars in the list of games the card was tested with.
I’d like to see PCPER test Project Cars too.
Why in particular?
because you guys are nVidia shills.
you don’t even make the point that this card will fail hard in all upcoming DX12 games
shame on you
DX:MD
Oh, come on. You didn’t start benchmarking graphics cards yesterday.
I think it’s the best looking racing game on PC and runs across a wide variety of hardware. It’s available for VR, too.
Will the fan be idle under 0 load or is it still spinning?
It still spins, but very slowly and pretty much silently.
When my machine idles, the ONLY thing I hear is the GTX 1060 Turbo. I wouldn’t call this a silent card. If you have other fans running at more than 1000 RPM, you won’t hear it anymore, of course.
Could you do all graphs with a white background? The ones with the black background were illegible. ;( Please?
Hi Ryan (or any other staff member)! Is this a good card for 1080p gaming for the next 3 years? Or should I save for the 1070?
you should buy the RX 480 – nVidia Pascal will fail hard in all new DX12 games
Why?
And you mean all 10 of them that will be released in the next 5 years? Lol, I just built a new PC with a 1070 and Windows 8.1 because I don’t want the spyware in 10. Fuck DX12.
I agree about DX12, unless they offer it for Win 8.1. It might wither away like DX10 did. That’s why AMD is buying up every new game and adding DX12 support coded solely their way with async. Over a year in, and still no games designed for it from the ground up; there are only badly coded ones added with patches.
DX12 is a failure so far. They need to start work on a DX13 that isn’t biased toward AMD like the last three (DX10, 11 and 12) were, solely because of the Xbox console.
Did I see the same graphs? This card is clearly better than a 480, even in the Rise of the Tomb Raider DX12 results.
“nVidia Pascal will fail hard” – NONSENSE. They’re the most high end cards on the market. My 1080Ti does not “fail hard”. LMAO.
Nvidia cards of this price range are NEVER made to last 3 years. If you don’t mind lowering settings sooner rather than later, this card will be OK. And DON’T EVEN THINK about saving some money by getting a 3GB version.
Now, because you are talking about 3 years, I would suggest the GTX 1070. It is overkill today for 1080p, but you will never have to go to settings and lower any of them for that period of time. And if in 2-3 years you start thinking of a higher resolution monitor, the GTX 1070 will not be a reason to stay at 1080p.
Never, you say?
I’m using a 4-year-old EVGA GTX660 (2GB) card to drive a monitor with native 1080p resolution, and I paid around $240 for it new.
It runs a STEP:Core installation of Skyrim Legendary at an average 58fps framerate. You could make the argument that it’s an old title, but you’d be dodging the point: it’s a DX11 game (I run Win7x64) and with the mods, it pushes video HW as hard as anything.
Interesting choice of games. I wonder how the Sapphire Nitro+ 480, the non-reference card you’ve had in for review for some time now, would compare? It doesn’t throttle and is cool and relatively quiet.
No Hitman, no Doom, no Gears of War. Another biased article. How much was the bung from Nvidia this time?
Page four, paragraph seven. Read the 1060 Founders Edition for those game results.
“So, as a result, our testing suite has been upgraded with a brand new collection of games and tests. Included in this review are the following:
3DMark Fire Strike Extreme and Ultra
Unigine Heaven 4.0
Dirt Rally (DX11)
Fallout 4 (DX11)
Gears of War Ultimate Edition (DX12/UWP)
Grand Theft Auto V (DX11)
Hitman (DX12)
Rise of the Tomb Raider (DX12)
The Witcher 3 (DX11)”
What, where, when.
Where are Doom, Hitman, Gears of War, Fallout 4 and BF4 in this review?
Yes, they skipped the bad ones.
Are we still waiting for official Nvidia Vulkan support for Doom? I feel like the test suite really needs a fast-paced FPS given:
1) How popular that genre is
2) Aren’t fast FPS games the genre where smooth visuals matter the most?
I feel like Hitman could be dropped in favor of an FPS and no one would care.
The problem with that is that fast-paced FPSs hardly exist anymore. Compared to the original Doom, playing Doom 4 feels damn empty, even on the highest difficulty.
Typical AMD fanboy responses. They won’t be happy unless all the benchmarks used are from AMD Gaming Evilved games.
It is a review of an Nvidia video card. One has to know how it will perform in GameWorks games or neutral games, mostly not AMD games that buyers probably have little interest in because of poor coding for Nvidia cards.
‘you get reference level performance at the lowest available price and you still get the promises of quality’
It’s all well and good as long as you don’t mind having a friggin’ leaf blower in ur rig…lol