In their testing, [H]ard|OCP showed that the Windforce cooler is not the limiting factor when overclocking Gigabyte's GTX 1060 G1 Gaming 6G; even at their top overclock of 2.1GHz on the GPU and 9.4GHz on the memory, the temperature never reached 60C. They did hit some obstacles reaching those speeds: the card's onboard Gaming mode offers an anemic boost, and to start manually overclocking the card you will need to install the XTREME ENGINE VGA Utility. Once you have that, you can raise the voltage and clocks to find the limits of your particular card, which should offer a noticeable improvement over its out-of-the-box performance.
"We’ve got the brand new GIGABYTE GeForce GTX 1060 G1 GAMING 6G video card to put through the paces and find out how well it performs in games and overclocks. We will compare its highest overclock with one of the best overclocks we’ve achieved on AMD Radeon RX 480 to put it to the test. How will it stand up? Let’s find out."
Here are some more Graphics Card articles from around the web:
- GTX 1070 Overclocking Guide @ OCC
- Arctic Accelero Hybrid III-140 GPU Cooler @ Kitguru
- OCC's Top 3 Video Cards of 2016 @ OCC
- Benchmarking Radeon Open Compute ROCm 1.4 OpenCL @ Phoronix
I’ll give a quick recap. This 1060 beats the RX 480 Strix at max overclock in everything except Vulkan Doom. The RX 480 generally delivers less performance, draws more wattage, and runs at higher temps than the 1060 tested.
The difference is that the 480 is ravenous for wattage and consumes 117 more watts at max OC! That’s 44% more power to get to nearly the same place frame-rate-wise, and I’m being kind to the 480 here.
For simplicity, assume both cards have equal performance. The 1060 costs $258 after rebate; you aren’t going to find the 480 for 44% less than the 1060.
Over a 3-year period the 480 is going to cost you about $41 more for light gaming (4 hrs/day) at a cheap rate of $0.08 per kilowatt-hour.
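For anyone who wants to sanity-check that figure, here is a quick back-of-the-envelope sketch in Python. The 117W delta comes from the HardOCP numbers above; the hours, rate, and variable names are just the stated assumptions, not anything from the review:

```python
# Sanity check of the "$41 over 3 years" claim.
extra_watts = 117      # extra draw of the 480 at max OC (from the review)
hours_per_day = 4      # "light gaming" assumption
rate_per_kwh = 0.08    # assumed cheap electricity rate, $ per kWh
years = 3

# watts -> kilowatt-hours over the whole period, then dollars
extra_kwh = extra_watts / 1000 * hours_per_day * 365 * years
print(f"${extra_kwh * rate_per_kwh:.2f}")  # -> $41.00
```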
You would have to be a total AMD zealot to buy the 480 over the 1060. Maybe if you live somewhere cold you’ll get some added benefit out of it.
It’s not really zealotry to use the RX 480 at normal clocks and not worry about overclocking the hell out of it. The RX 480 (at lower clocks) still has higher single-precision FP TFLOPS than the GTX 1060, and more titles with DX12/Vulkan support are on the way. The GTX 1060 really needs that max overclock to match or “beat” the RX 480, while the RX 480’s price/performance metric keeps getting better the closer we get to Vega’s release date. Running the RX 480 in its normal clock range under DX12/Vulkan will give that GTX 1060 more competition as time goes on and the games/gaming-engine ecosystem moves fully onto the new graphics APIs.
You do not even attempt to estimate what power the RX 480 may draw running at normal clocks under DX12/Vulkan, with games and engines tweaked to take advantage of RX 480 features like shader intrinsics, primitive discard acceleration, and async compute. The entire gaming and game-engine industry will be tweaking for DX12/Vulkan, which puts the games/engine makers in control of the GPU’s metal.
The RX 480/RX 470 SKUs are going to become even more affordable once the Vega SKUs start to arrive, and Volta will not come to the consumer market until 2018, with Nvidia’s professional Volta SKUs only trickling out at the end of this year. I’d even guess that some dual RX 470 configurations will show very nice scaling figures once DX12 explicit multi-adapter tweaking and development happens, done not by AMD/Nvidia but by the games and game-engine industry, with DX12/Vulkan API-driven multi-adapter gaming under the full control of the game makers.
Your GTX 1060 is not going to be usable in dual configurations if Nvidia has any say in things, while the RX 480/470 SKUs are not limited by AMD for dual-GPU/CrossFire usage even now. And it’s the very inexpensive RX 480/RX 470 SKUs, after Vega arrives, that Nvidia will have to compete with once mainstream Vega-based replacements follow the Vega flagship to market. RX 480/470 owners will continue to see their gaming performance improve long after the GTX 1060 tops out under a heavy OC regimen! There is so much extra compute built into the RX 480/Polaris microarchitecture just waiting to be tapped under Vulkan/DX12 that Nvidia’s Pascal will not be able to benefit from.
So any figures or benchmarks you throw out will have to be revisited many times over the next year or so, just to see whether they hold up in the DX12/Vulkan gaming titles that will be released.
He’s just another anti-AMD zealot. Rest assured, if in this article the 480 had just barely outperformed the 1060 for slightly less wattage, he would STILL be in the comments, doing his best to bash it. He bases his entire sense of self-worth on whether or not Nvidia is better than AMD, so to bolster that false sense of superiority, he HAS to go out onto comment threads and spew garbage and personal insults.
It’s a sad, pathetic little compulsion from a sad and pathetic little boy. Facts and history don’t matter. Whenever you see a comment posted by “Anonymous Nvidia User” just imagine a post full of Nvidia fanboy propaganda, refuse to give him the attention he so desperately craves, and scroll past.
There are many anonymous posters and almost all of them are pathetic team red losers or Nvidia poseurs like the one with dual 980s.
If you enjoy your Radeon cards and are happy with their performance, price paid, etc., more power to you; I am not directing any comments or insults at you. It’s the trolls who initiate personal attacks. If you like AMD products and don’t attack people, I’m cool with that. We’re all fellow gamers and PC enthusiasts, and that’s what matters.
The pathetic team-red losers are the ones who constantly resort to personal attacks because I admit I use Nvidia, rather than attacking any untruths I’ve posted (hint: I don’t post BS figures or fudge numbers). My intent is not to deceive but to inform.
What part of what I posted was garbage and untrue? Read the HardOCP review and show me where anything I posted is BS.
Zealot was a bit of a strong word; maybe I shouldn’t have used it. I’ll just leave it at that.
I don’t derive any self-worth from having an Nvidia card either. I don’t feel I’m any better than a person running an AMD card. You’re the one who has to live with whatever purchase you make, so I couldn’t care less what cards anyone has. Both have their strengths and weaknesses.
FYI, I don’t have to post; I haven’t posted much at all for a while. Unlike you, who feels compelled to reply to and insult almost every post I make. Who needs to feel good about themselves? Hmmm.
How is anything I posted propaganda? I used facts unlike most posters. I even stated the 480 was better at Vulkan Doom.
Do not bother with AMD fanboys… they are nuts… they think the RX 480 is better than the GTX 1060 because the RX 480 wins in only 4-5 AMD game titles plus Doom under Vulkan… that’s all… for them, the hundreds of old and new games where Nvidia always wins don’t matter… no sir… only those 4-5 games matter… too much stupidity on the Red Team side… it always amuses me when I see stupid AMD fanboy comments… AMD always struggles to keep up with Nvidia even at the low-end to midrange level, and at the high end AMD no longer competes with Nvidia at all… until AMD launches Vega, Nvidia has plenty of time to prepare Volta… and again AMD will lag a generation behind Nvidia…
And BTW, what next DX12/Vulkan games? I only see 2 DX12 games listed for 2017 and no Vulkan games :))
https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support
https://en.wikipedia.org/wiki/List_of_games_with_Vulkan_support
As I said… AMD fanboys are nuts… they always talk with no proof to back it up… for developers to make games from the ground up, they need time to learn how the new APIs work and how to build games with them… it’s not easy… we will see ground-up DX12/Vulkan games 2-3 years from now, not today… it’s too early to talk about this. So yeah… all you AMD fanboys can buy that RX 480 card to play DX12/Vulkan games 2-3 years from now :))
This comment takes a very weak stance. The Wikipedia links only list games that have been confirmed by a publisher or developer. We’ve barely entered 2017, and fiscal 2017 hasn’t even started for many publishers to announce games for the year.
Even then, many people buy GPUs to last more than 10-12 months. Taking into account Nvidia’s track record, their cards’ performance holds up very badly beyond a year of support, which is why AMD cards have been the smarter buy for both immediate and long-term performance life.
The RX 480 performs better than the 1060 (6GB) and much better than the handicapped 1060 (3GB). Over 3 years, the RX 480 is going to give better performance life than the 1060 ever will. You’re arguing about saving a couple of dollars on the electricity bill? Pathetic. Stick to consoles for saving on electricity, and buy yourself some candles to rethink your life at night.
I guess this is good news for the Gigabyte owners!!
TWEAKING GAMES really goes a long, long way to a better experience.
(**Some people might want to COPY this for reference. It’s much LONGER than I’d intended at first, so I apologize if it annoys anybody.)
*MOST IMPORTANT PARAGRAPH TO ANY GAMER*:
You need to start with the GOAL of VSYNC ON, VSYNC OFF, Adaptive VSYNC, or possibly asynchronous (adaptive-refresh) support if you are lucky enough to have a FREESYNC monitor for AMD GPUs or GSYNC for NVidia; then TWEAK the settings to reach that goal.
(FAST SYNC for NVidia only works properly if the FPS is at least 2x the refresh rate of the monitor, thus at least 120FPS for a 60Hz monitor. It works by creating each frame ASAP but only physically drawing 60x per second, like normal VSYNC ON. So there is no SCREEN TEAR, but where it differs is that there is less lag/latency, as it only uses the LAST completed frame prior to the time the next frame is drawn, so any button press, enemy approach, etc. is more up to date. It feels LESS sluggish but is NOT the same as 120FPS on a 120Hz monitor; it sits BETWEEN 60FPS/60Hz and 120FPS/120Hz in experience.)
FAST SYNC is nearly impossible for high-refresh monitors to benefit from. At 144Hz you need over 288FPS of GPU output, and if the frame rate drops in and out of that range the experience might vary (and if you can’t reach it, there’s no point in using Fast Sync anyway).
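To make that 2x rule of thumb concrete, here is a trivial Python sketch; the function name and example numbers are mine, only the threshold comes from the explanation above:

```python
def fast_sync_helps(gpu_fps: float, refresh_hz: float) -> bool:
    """Fast Sync only reduces perceived lag when the GPU renders at
    least ~2x the refresh rate, so a newer completed frame is always
    available at each scanout."""
    return gpu_fps >= 2 * refresh_hz

print(fast_sync_helps(130, 60))   # True: 60Hz panel with 130FPS output
print(fast_sync_helps(200, 144))  # False: a 144Hz panel needs 288+ FPS
```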
Once that’s decided, you then need to find the best balance of shadows, AO, anti-aliasing and so on that achieves the goal you’ve set. An overall “MEDIUM” preset may already provide a good balance; tweak a bit more from there.
(FRAPS can show FPS, but for most Steam games I have Steam’s in-game FPS counter enabled. It’s very small and not intrusive, especially at 2560×1440 resolution.)
Freesync only works BETWEEN the minimum and maximum of the range if the max-to-min ratio is less than 2.5x (i.e. 40Hz to 60Hz), but 40Hz to 144Hz works great. The solution to low FPS comes through the drivers using LFC, or “Low Framerate Compensation”. You can then drop below 40FPS, in which case the GPU resends the last frame to stay in asynchronous mode (no added stutter or screen tear), so 39 true frames would likely show as “78FPS” in FRAPS. The result is a SMOOTHER, tear-free experience that can also reduce ghosting a bit (see BlurBusters). Most importantly, dropping in and out of asynchronous mode is JARRING (thus it’s sometimes best NOT used if you don’t have LFC, which many, possibly most, Freesync monitors do not support).

At the other end, you may wish to CAP the upper frame rate. A couple of games require a SPECIFIC FPS (i.e. 60FPS) to avoid physics or other issues. But suppose you had a 30Hz-to-75Hz Freesync monitor: you might want to cap at 74FPS? Apparently that is too close and may not work, so find a max of perhaps 70FPS, then TWEAK the game settings as needed. You may even be able to keep MAXIMUM settings.
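A minimal sketch of that LFC frame-duplication idea (the function and its fixed-multiplier logic are my own illustration; real drivers use adaptive heuristics), showing why 39 true frames on a 40-144Hz panel report as “78FPS” while a narrow range can’t compensate:

```python
import math

def lfc_scanout_rate(fps, vrr_min, vrr_max):
    """Panel refresh the driver would target for a given game FPS.
    Inside the VRR window the panel simply follows the frame rate;
    below it, each frame is re-sent n times so fps * n lands back
    inside the window (Low Framerate Compensation)."""
    if fps >= vrr_min:
        return min(fps, vrr_max)      # normal adaptive-sync behavior
    n = math.ceil(vrr_min / fps)      # smallest duplication factor
    if fps * n > vrr_max:
        return None                   # range too narrow for LFC
    return fps * n

print(lfc_scanout_rate(39, 40, 144))  # 78 -- what a FRAPS-style counter shows
print(lfc_scanout_rate(39, 40, 60))   # None -- a 1.5x range can't compensate
```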
The REASON to cap the upper end is to prevent added STUTTER (VSYNC ON) or screen tear (VSYNC OFF) from the frame rate climbing above the maximum, which takes you out of asynchronous mode. At 144Hz, however, I’d use VSYNC ON: screen tear is gone, just like in normal asynchronous mode, but the LAG/latency is much lower, arguably not noticeable in most games, though in shooters some people will be able to tell. POSSIBLY setting an FPS cap of, say, 130FPS for either FREESYNC or GSYNC would be ideal, but you’d have to experiment to see whether it’s better AND whether you stay in asynchronous mode.
I recently had a GTX 680 and had to carefully tweak newer games. For some I used ADAPTIVE VSYNC, which turns VSYNC OFF or ON automatically (OFF below 60FPS for 60Hz monitors). Aiming for drops below 60FPS no more than 5% of the time is a good start, but it varies; you get screen tear instead of added STUTTER and increased lag/latency.
*Even then, some games have other issues:
– forcing a SLIGHTLY different frame rate (i.e. 61FPS) using NVInspector
– iFPSClamp=60 for Fallout 3/NV (and force the dual-core fix for Fallout 3 to prevent crashes to desktop)
– HALF Adaptive/Dynamic (Dynamic for AMD) for 144Hz monitors to sync to 72FPS where applicable (i.e. bad screen tearing with VSYNC OFF, but can’t achieve 144FPS with VSYNC ON, and trying creates added stuttering)
– other various issues (PCGamingWiki has some fixes)
I’ve seriously had a much better experience with a GTX 680 than some have with a Titan X Pascal card in certain games, especially older or less demanding newer games where I can achieve most or all of the visual fidelity but get a SMOOTHER experience than the better system, thanks to proper tweaking and/or application of fixes.
ANY GAME should be able to run on a GTX 1060 6GB with a good CPU etc. and deliver an excellent experience (with rare exceptions; some games have issues no matter what).
OTHER:
There are a surprising number of games that don’t run well, or at least run noticeably less than ideally, without fixes or tweaking. A good rule I follow when I see stutter is to drop the settings really low to see how that affects smoothness, then try VSYNC ON and OFF.
OTHER:
Most games vary their frame rate significantly, so you may need to ADJUST the settings later on if, for example, a large battle makes the FPS plummet. Sometimes you can’t fix this at all.
OTHER:
Slightly off topic, but the claim that NVidia doesn’t properly utilize DX12 is incorrect. There’s MORE to this answer, and it’s too complex for this post, but one of the MAIN reasons for minimal or no gains for NVidia in DX12 is that the GPU is already at full, or nearly full, usage, whilst AMD’s GPUs may be LESS utilized.
DX12 and Vulkan can minimize some of AMD’s driver inefficiencies to make better use of the GPU. Again, there is more to this, but it’s absolutely a fact. (Also, “GPU usage” can read about 100% without the GPU being 100% utilized, because of how its different areas are used. AFAIK you can have one section busy whilst others wait, especially in DX11, yet the GPU still reports as “100%” used. So it’s actually very DIFFICULT to determine how much processing power a GPU would have left if the GAME ENGINE and VIDEO DRIVERS made full use of all its sections.)
SUMMARY of setup EXAMPLE:
1. Set VSYNC OFF
2. Set rough settings to HIGHEST
3. Set VSYNC ON if screen tear is an issue.
4. Carefully adjust SETTINGS for the optimal balance that achieves the goal (i.e. 60FPS 95% of the time using Adaptive VSYNC)
5. Check for other tweaks, fixes or mods to enhance the experience. (If using MODS with Steam, I make a backup of the game so the “vanilla”, mod-free game can be easily restored.)
The second paragraph above should read “for any gamer” or similar, not just people with asynchronous monitors. Again, I apologize that the LENGTH is far longer than initially intended.
To reflect the anger, let me just remind people that TRUMP is now President… (BTW in the Canadian Military the acronym is for a class of ship and stands for “Tribal Refit Update Modernization Program”).
MISTAKE in the long paragraph:
It should read that “FREESYNC does NOT work if the max/min ratio is below 2.5x”. Sorry, I can’t EDIT the post.
So again, if it’s 30Hz to 75Hz, that WAS the only working range. AMD eventually fixed this with LFC so it works BELOW 30FPS (though it can sometimes have flicker or minor issues, AFAIK).
Over 2.5x (i.e. 40Hz to 144Hz) works almost identically to GSYNC; however, there ARE differences sometimes, such as in OVERDRIVE pixel support, which both AMD and NVidia handle, but AFAIK these kinds of issues happen more frequently with AMD’s Freesync. (Freesync 2 tightens up the specification and requires LFC. I’m not sure if anything else is a requirement, though there may be things AMD ends up unable to do without a specific module.)
LAPTOPS using GSYNC don’t use a module. Some argue this proves there’s no necessity for one on the desktop. NOT. TRUE. With a laptop, the screen is a known quantity, so the driver can be tweaked to optimize for that specific panel. Otherwise it’s very difficult, as the varying response times cause SEVERE difficulty in keeping pixel colors correct: you need to apply OVERDRIVE (voltage) at varying levels according to the current FRAME TIME.
(For this reason I don’t know whether you can connect an external MONITOR to a laptop this way. If not yet, I expect monitor manufacturers to work with NVidia to provide the specific details for their monitors, as they already do for the basic details of refresh, response, etc.)
FREESYNC and GSYNC would then work the same on a laptop, though I expect a GSYNC module eventually.
FREESYNC may end up with a module eventually too, if some of the issues around 4K, HDR, high refresh, etc. become too overwhelming to solve within the current limitations, but only time will tell.
(On a side note, expect GSYNC/GSYNC 2 to eventually drop in price. With a KNOWN module, as with game consoles, monitor manufacturers will bring the price down, probably achieving PARITY with a non-GSYNC monitor or close to it, since the module REPLACES some parts, so the HARDWARE cost and the SOFTWARE R&D, especially with GSYNC 2, might be easier than with FREESYNC 2.)
I hope to see both FREESYNC 2 and GSYNC 2 at some point, including in HDTVs.
(I have a GTX 1080; however, it sounds like FREESYNC 2 is being very seriously considered for HDTVs. I’d like a 4K, HDR HDTV with FREESYNC 2; then I’d get an XBOX ONE or PS4 Pro for the living room.)
The work to do this isn’t that significant; the main work would be on the HDTV side. I would suggest making it work from 30Hz to over 60Hz but enforcing a CAP at 60FPS, keeping games in asynchronous mode at the 60FPS cap needed to match current consoles. Ideally, asynchronous mode would start BELOW 30FPS to avoid LFC as much as possible, if the HDTV panel can be tweaked to allow that.
That will probably come with QD (Quantum Dot) LCD and OLED HDTVs. Perhaps 20Hz to 65FPS+; since high refresh at 240Hz already exists, the high end isn’t much of an issue, so the LOW END and the ability to properly manage COLOR vs FRAME TIME is the work to be done. Maybe it’s ALREADY POSSIBLE? My guess is yes.
The performance of the GTX 1060, the RX 480, and anything comparable is pretty well known by now. For that simple reason, I’m not reading this review. However, I have to make a comment, simply due to the “what the f*ck”-ness of that shroud design. What were Gigabyte’s designers thinking when they made that abomination? It looks like how a 10-year-old with limited drawing skills who’s only ever seen a few pictures of a graphics card would draw one. Wildly asymmetrical and unbalanced design? Check. Off-centre fans making the whole thing look lopsided? Check. Small-ish fans in a huge shroud? Check. Huge overhangs on both sides of the fans? Check (yet they still didn’t manage to make them symmetrical!). Half-hearted attempts at filling blank space with something to make it look less boring? Check. Haphazard and seemingly random angles combined all over the card? Check. Inexplicable cutouts in the shroud to reveal … the DVI port? Check.
I’ve never touched a CAD application in my life, and I’m willing to bet I’d be able to design a better looking shroud (for that specific PCB and GPU) in a day or two.
I have the same card in my PC, and my stable OC at default voltage is 2114MHz on the GPU core and 9308MHz on the memory, with a max of around 60C in gaming. This card is very good for OC. And I got 2180/9800MHz with the voltage (mV) at 100% and the power target at 11%. Killer OC 😀