It has been several months and more than a few driver releases since Hardware Canucks last reviewed the RX 480 and GTX 1060, and in that time more games with at least some DX12 support have arrived. They decided to revisit the performance of these two cards, both stock versions and factory overclocked models, choosing Sapphire's RX 480 8GB Nitro+ and EVGA's GTX 1060 6GB GAMING for the comparison; the results show that the extra work those companies put into these GPUs paid off. They tested a mix of over a dozen games, and the results are interesting: in far more cases than in their first look at these cards, the RX 480 comes out the clear performance winner. However, that performance comes at a high enough cost that the GTX 1060 shows better performance per dollar. Take a look at this revised review if these cards fit your budget.
"More than four months after the launch of NVIDIA's GTX 1060, we take another look at its performance against AMD's RX 480 8GB in more than a dozen games. The results of this one may surprise you…"
Here are some more Graphics Card articles from around the web:
- XFX RX 480 GTR Black Edition @ eTeknix
- ASUS GTX 1050 Ti STRIX OC 4 GB @ techPowerUp
- Gigabyte GeForce GTX 1050 Ti G1 Gaming OC @ Guru of 3D
Well, nothing new here. AMD cards age better. And Hardware Canucks is not even close to an "AMD friendly" site, unless they have changed.
Also, we can see that AMD's involvement in the consoles and the push for Mantle is paying off. Games run better on AMD hardware, and Nvidia's drivers have been having stability problems lately. The problems AMD faced five years ago, when everyone was developing on an Intel+Nvidia system, Nvidia is facing now.
And we have Kyle at HardOCP saying that AMD struck a deal with Intel to license them Radeon tech. GCN everywhere? Intel laptops with FreeSync?
AMD will never license GCN to Intel. There may be some video or other decoding IP in the deal, but no GCN for Intel. AMD will keep GCN and pair it with Zen CPU cores, and goodbye to Intel's crappy SOC graphics on laptops. AMD's Zen does not have to beat Intel's latest CPU cores on the single-core IPC metric! AMD just has to get close in IPC, and AMD's real value-add in any Zen APU is that GCN/Vega IP, which will drive plenty of laptop APU sales. Forget about Intel getting anything close to GCN from AMD; GCN and Zen will put Intel on the outside in the laptop Zen/Vega APU/SOC market!
No more dog food graphics from Intel! So long Intel Smell You Later!
Both companies share a common goal: to throw Nvidia out of business. Nvidia is a real threat to both of those companies.
No slack for monopolies, be they Nvidia, Intel, or M$! AMD should never deal with Intel for any graphics IP; they should tell Intel to F-OFF. Intel needs to be brought down low and kicked to the curb. Never has the laptop market been so subjected to unfair market practices as it has under Intel's nefarious plans, and AMD was never properly compensated by the courts for Intel's illegal market tactics. Intel's Ultrabook scheme has saddled the entire OEM laptop market with crappy performance in the name of thinness over functionality.
Intel is now getting its radios/modems kicked out of phones, and the problems with Intel's router chips still need to be addressed. Intel's gimped-down graphics dog food has made plenty of laptops unfit for some graphics uses without help from AMD (more affordable) or Nvidia. There need to be some Linux laptop OEM products available without M$'s OS, Intel's SOC, or Nvidia graphics, for some monopoly-free laptop computing!
AMD should never license Intel any IP! Never Ever to those Intel bastards!
It’s Zen/Vega with Intel on the outside and better AMD APU SKUs with real graphics and not that expensive Intel dog food!
#BetterRed
The issue with AMD is that their timing is just off again for the real gamers.
For first-time gamers, sure, AMD has the better solution right now for 1080p.
Noise and power weren't part of this review, but I'm guessing AMD is probably still the better buy right now.
Too bad this review doesn't get into the fact that two years from now both of these cards will probably be obsolete. So you are looking at $125 per year of gaming.
A better option would have been buying two years ago in the $320 price range and getting four years out of your card: $80 per year of gaming.
Whatever… I recommend an AMD GPU… you win… and that GPU is the PS4.
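The cost-per-year figures above are simple division; a quick sketch for anyone who wants to check the arithmetic (the $250 price is implied by the $125/year claim, not stated in the comment):

```python
def cost_per_year(price_usd: float, years: float) -> float:
    """Amortized cost of a graphics card over its useful lifetime."""
    return price_usd / years

# A roughly $250 card kept for two years vs. a $320 card kept for four
print(cost_per_year(250, 2))  # 125.0
print(cost_per_year(320, 4))  # 80.0
```

The longer you can stretch a card, the lower the per-year cost, which is the whole argument being made here.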
No, only the 1060 will be obsolete. AMD allows CrossFire, and with 8GB that will keep these cards alive for quite some time. Nvidia has disabled SLI on the 1060, likely for $$$ reasons.
Nvidia, the great market segmenter for profits, with every little feature in the 1060 gimped to force the user to pay for a more expensive SKU. Monopolies do this; just look at Intel and its PCIe-lane and other CPU/SOC product segmentation for $$$$! AMD is the better value, with more SP FP TFLOPS on the RX 480. Two RX 480s have about the same SP FP TFLOPS as a Titan X (Pascal) at less than half of the Titan X's titanic cost! Just look at all those RX 480s on the coin-mining rigs by the dozens!
But NVIDIA isn't really even close to a monopoly; they just have amazing PR. Intel IGP owns the integrated market share, AMD owns the console market and is making big strides in the AIB market, while NVIDIA's share of the AIB market is declining but its server compute business is growing.
They’re really all strong in some areas and weak in others.
By market share Nvidia is a monopoly, and no marketing department is that smart; it's just that the bog-standard gaming git is just that gullible! And GPUs are not all about gaming; they are about compute just as much as gaming.
And now there are the DX12/Vulkan graphics APIs, which can also issue compute workloads to the GPU, and Nvidia lacks total SP FP TFLOPS on its mainstream SKUs, while AMD's GCN/Polaris RX 480 has 5.8 SP FP TFLOPS of GPU compute available for both graphics and non-graphics compute for game makers to use. And the RX 480 puts out those 5.8 SP FP TFLOPS while being clocked much lower than Nvidia's mainstream competition. Nvidia does not have async-compute scheduling and management fully implemented in its GPUs' hardware, so latencies will be higher, and software will never respond to dynamic workloads as quickly as hardware does.
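As a sanity check on the 5.8 TFLOPS figure, single-precision throughput is usually estimated as shader count × boost clock × 2 FLOPs per clock (one fused multiply-add). A rough sketch, using published board specs for shader counts and boost clocks (those numbers come from spec sheets, not from this thread):

```python
# Single-precision throughput estimate: shaders * clock (GHz) * 2 FLOPs/clock (FMA)
def sp_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000.0

rx_480 = sp_tflops(2304, 1.266)    # RX 480: 2304 stream processors at ~1266 MHz boost
gtx_1060 = sp_tflops(1280, 1.708)  # GTX 1060: 1280 CUDA cores at ~1708 MHz boost
print(round(rx_480, 2), round(gtx_1060, 2))  # 5.83 4.37
```

This also shows the point about clocks: the RX 480 reaches its higher FLOPS figure with far more shaders running at a much lower clock than the GTX 1060.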
AMD already has some server/HPC/workstation wins for its Radeon Pro WX SKUs, and that market will take all 5.8 SP FP TFLOPS of GPU compute and more from the other Radeon Pro WX SKUs, including one with its own SSD storage on the PCIe GPU card! That Radeon Pro WX SKU has access to its own SSD with no extra PCIe encoding/decoding hops required, for quicker data/texture access directly from the on-board SSD.
The only area where AMD will be weak relative to Intel once the Zen and Zen/Vega SKUs are on the market is the lack of Intel-style funds, the kind Intel uses to bribe laptop OEMs or to steer the entire OEM laptop market toward an Apple-style, form-over-function Ultrabook market of thin and useless laptop SKUs that thermally throttle their discrete GPUs and SOCs. Intel has been judged by the courts to be an abusive monopoly interest and should have been forced to pay even more damages! So once the Zen/Vega APUs are on the market, Intel and the laptop OEMs need to be closely monitored lest it all happen again!
The AMD CrossFire solution has microstutter, or better described, inconsistent frame times. I own an RX 480 and a CrossFire-capable motherboard, but after doing research I'll never put a second one in.
I disagree with your logic.
Midrange cards go "out of date" at the same rate as high-end cards. If one person buys a card expecting to achieve 60 FPS at 1080p at high settings, and another buys a card expecting 60 FPS at 1440p at ultra, they will both lose the ability to hit those framerates at roughly the same rate.
Said another way: if I bought a GTX 660 Ti to play at 1080p in 2012, and you bought a GTX 680 to play at 1440p, our degradation of experience over time (running new games) would be roughly the same.
I agree with your explanation. However, you are missing that there are development cycles that dictate advances. I was specifically talking about 2014. I could be wrong, as I don't have a crystal ball; however, rumors suggest that by 2018 we are going to need a lot more horsepower.
This continues to confirm what people have said about AMD cards: like my trusty 7970 CrossFire setup, they get better and better with time. This may lend credence to the perception that Nvidia cards get slower as soon as a newer card comes out.
In addition, Nvidia still has problems in the DirectX 12 department and no async shaders, which at the time last year they said was just a driver fix; that turned out not to be true.
For those needing a TL;DR: basically, the RX 480 cleans house.
Quote: "DX12 turns the GTX 1060 over onto its head and while things are still relatively close, the RX 480 has nonetheless managed to extend its lead. AMD's budget-focused GPU wins in almost every single game with gaps ranging from moderate (Warhammer) to almost embarrassing (Quantum Break). The lone exception to this is Gears of War where NVIDIA does put in an impressive showing but that does nothing to stop the avalanche of losses from piling onto the GTX 1060's shoulders."
Speaking of Quantum Break, what is embarrassing is the D3D12 implementation: in D3D11, NVIDIA cards are faster than AMD's cards are in D3D12.
I've also heard some murmurs about the recent RX 480s performing much better than the original batches: lower temps, more stable clocks, and much less power.
However, if that were true, wouldn't AMD relaunch the card as the RX 485 or something? Marketing-wise, I don't understand the "silently improving the process" strategy.
I'm curious though; it could be true.
That always happens as the fab process nodes are improved! Nothing new there, and there will be more of it as the 14nm GF/Samsung process matures and the 14nm performance process tweaks come online. But GCN is also getting better as the DX12/Vulkan APIs and the games/engine software ecosystem take over for gaming; more improvements are coming every day. Both old GCN hardware and the latest GCN hardware are showing more improvement with each passing week!
I noticed that years ago.
For me, when I upgrade, I always wait at least six months after a new card launches before buying.
The CPU used was an Intel 5960X. You're never going to find that in a budget system. Everyone knows Nvidia performs better than AMD when the CPU is weaker.
Also, why no mention of performance per watt? That is the only metric that should matter. Price is changeable through rebates, demand, and other factors; wattage doesn't change much unless the card itself is improved, which is a controlled factor.
And thirdly, taken directly from the Hardware Canucks article, about DX12, where AMD's only real advantage lies: "It does bear mentioning that DX12's benefits right now are anything but tangible and in many (but not all) games where it's included the API doesn't offer any more real-world performance than DX11."
In other words, they basically only win in an API custom-designed for them, DX12. There isn't much, if any, benefit to using DX12 over DX11 in the majority of games so far. Wow, big surprise there. It might improve over time, but by the time that becomes relevant there will be new cards out.
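Performance per watt and performance per dollar, both argued over in this thread, are just ratios of a measured framerate to board power or street price. A minimal sketch, with purely illustrative numbers (none of these are measured values from the review):

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Frames per second delivered per dollar of purchase price."""
    return avg_fps / price_usd

# Hypothetical card: 60 FPS average, 150 W board power, $250 street price
fps, watts, price = 60.0, 150.0, 250.0
print(perf_per_watt(fps, watts))    # 0.4 FPS per watt
print(perf_per_dollar(fps, price))  # 0.24 FPS per dollar
```

As the thread itself notes, the dollar ratio shifts with rebates and demand while the watt ratio is fixed by the silicon, which is why the two metrics can favor different cards.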
Nvidia's "mainstream" GPU SKUs are stripped of FP performance by team greed, with Nvidia ready to charge $$$$$ to put those FP FLOPS back! Just look at what the coin miners use, and DX12/Vulkan will make use of AMD's async compute and the extra SP FP TFLOPS that are fully managed in the GCN GPU hardware.
For AMD, the DX12/Vulkan changeover is happening, with new improved RX 480 performance numbers and more improvements to come as more games start to use the RX 480's 5.8 TFLOPS of SP FP performance for gaming compute acceleration as well as graphics! So put two RX 480s in a multi-GPU configuration using the DX12/Vulkan APIs or CrossFire, and that adds up to around the same total SP FP TFLOPS as a Titan X (Pascal) for one hell of a lot less money. The GTX 1060 is gimped on FP FLOPS and cannot use SLI, and strangely there appears to be little info on the GTX 1060's DX12/Vulkan multi-GPU figures. What is Nvidia up to with that?
DX12 and Vulkan are not custom APIs; they are the new graphics APIs that all games will be using! So where is Nvidia's mainstream GPU hardware that can make use of the new graphics APIs?
I care about gaming performance, not FP performance. If I were a miner it might be different.
Nvidia was straightforward in saying there would be no SLI for the 1060. SLI/CrossFire support is lacking in DX12 and has to be coded in by the programmers.
DX11 is going to be around just as long as DX12, maybe even longer. We still have DX9 support and DX9 games even today.
Enjoy your old gaming experience; the new games and the new graphics APIs will be doing more non-graphics gaming compute on AMD RX/Polaris SKUs alongside the graphics compute, with less latency, while Nvidia's current and past few generations of GPU SKUs fall further behind.
Anonymous Nvidia User is actually being used anonymously by Nvidia as a fool! Enjoy your lack of SLI. And where are the DX12/Vulkan-managed multi-GPU benchmarks for any GTX 1060/1050 SKUs? Is Nvidia gimping that too?
I never had a need for SLI anyway. I'm in the 99% rather than the 1% that uses multiple cards.
I have faith that Nvidia will design cards with whatever features are needed to be the top performers.
The only fools are you team Red guys, who are being taken for a ride because AMD needs all the help it can get to match Nvidia's raw horsepower. I don't think AMD will match their efficiency per watt either.
It's the total SP FP TFLOPS per dollar that the RX 480 provides, for more than just gaming, that makes AMD's GPUs popular. With more SP FP TFLOPS, AMD's RX 480 (5.8 SP FP TFLOPS) is great for compute at a much more affordable price, and that includes ray-tracing acceleration done on the RX 480 for non-gaming graphics work. For AMD's consumer GPUs it's not only about some crappy gaming graphics spat out at high FPS (Nvidia's main focus); it's about really realistic single-image workloads for non-gaming graphics usage, at way more SP FP TFLOPS per dollar spent. Two RX 480s have about the same SP FP TFLOPS as a Titan X (Pascal) for around a third of the cost.
Nvidia's SP FP TFLOPS cost about twice as much, if not more, as they do on AMD's RX 480 and RX 470 SKUs! Just go look at the coin-mining rigs and see the RX 480s in great numbers doing way more compute for the dollar. It's not all about gaming workloads for AMD's consumer SKUs; it's about other uses that Nvidia's costly options cannot provide without breaking the bank!
I run an RX 480 with a 4770K at 4.4GHz. I don't think that is an unusual setup, and I don't think the 5960X is pushing lower CPU frame times than my processor, since most games don't utilize the extra cores.
“That is the only metric that should matter.” For you, I recommend a smartphone. They have really impressive “performance per watt.”
On your third point, I disagree; but regardless, by your logic, if I keep my RX 480 past when “there will be new cards out” I’ll see the benefits.
Yes, the 4770K is a good gaming processor, and I also have one in my system. However, it is only a $330 processor, not a $1000+ one.
You can shove your smartphone recommendation up your red ass.
You disagree that both AMD and Nvidia cards suffer worse performance in DX12 compared to DX11 in most games? Do you not read any reviews that compare DX11 vs DX12? DX12 is a worse flop than DX10, which was unsupported after a short three years or so.
"The CPU used was an Intel 5960x. You're never going to find that in a budget system. Everyone knows Nvidia performs better than AMD when the CPU is weaker."
Fud post of the week LOLOLOL
That's right, folks: you want the worst CPU you can find when you buy Nvidia, because that makes it faster.
What kind of processors do you expect will be in a mainstream or budget system? The worst processors are made by AMD. LOL
You can get a used 5960X for about $820. New, they run from $1000-$1300. I don't think it would make it into any mainstream or budget system.
Well, Zen may change that, and more so for any Zen/Vega APUs. No one expects Zen to outright beat Intel's latest on the single-core IPC metric! AMD's Zen only needs to get close, and AMD's graphics will be what gets the wins, at an affordable price point too. Intel only uses its "best" graphics for show, because Intel will never offer its best graphics on its lower-cost SOC SKUs. AMD will win the price/performance metric for more affordable mainstream gaming.
Some of AMD's current non-Zen SKUs do not perform so poorly when the price is compared to what Intel offers in the same range! And there are some very affordable options that do not break the bank the way Nvidia and Intel mostly do with their unaffordable gaming systems.
They used the top-end CPU to REMOVE any possible CPU bottlenecks; this is a common thing done by many reviewers. Why didn't you mention that?
But by your comment, you are suggesting that a lower-end CPU should be used for testing because it gives Nvidia cards an advantage? LOL WUT? Nvidia User indeed!
Yes, this is what they say, and while it has some merit, it makes someone with a weaker processor think they will get the same performance. Sadly, they will not, so they may not actually be buying the "best" card for their system specs, be it Nvidia or AMD.
When the Fury X first launched, Techpowerup was listed as one of the sites unfriendly to AMD (it didn't receive a review card at launch). This was probably because they used a 4770K in their reviews; the Fury X needed a beefier processor to look more impressive. Since then they have added an extreme Intel processor to their test bed, among other things, to be extremely AMD friendly.
"When the Fury X first launched, Techpowerup was listed as one of the sites unfriendly to AMD (it didn't receive a review card at launch)."
That GPU review-sample nonsense should have received government watchdog attention, and the government/FTC should have done their job and forced a random lottery on the GPU review-sample selection process! The GPU makers should not be allowed to control who in the review press gets samples; that process needs to be reformed, as it is currently very non-objective and unfair.
This is awesome, and add in the extra performance from the new drivers coming out and the 480 will be even further ahead!
Moving forward is indeed #BetterRed
lov'in ma XFX rx480 gtr black.. keep it AMD
^keep it up AMD 🙂
Performance/watt, performance/dollar, noise, and heat only get brought up by Nvidia users when they are losing the FPS war.
The RX 480 is improving through better yields and better drivers. Not so long ago the 980 Ti was king of the roost (and rightly so); now all of a sudden it seems to be having performance issues in the latest AAA games. Coincidence? I think not.
Fingers crossed that Vega makes a big impact in 2017. Admittedly, we have been here before, though 😛
The same things were brought up by AMD fanboys when comparing Nvidia's 400/500 series of cards to the AMD cards of the time.
Vega ought to make a splash, as it will be coming almost a year later than Nvidia's offerings.
DX12 doesn't offer any graphical differences over DX11. There is little reason to run DX12 if you get more FPS in DX11 than in DX12, which is the majority of cases so far, unfortunately.
Anyway, who is losing the FPS war? I see the RX 480, but it cannot hold up to the 1070, 1080, or Titan X Pascal, DX12 or not.
chizow, you need a time-out in the corner for all of the misinformation you've been spewing.
That's chizow; he's mostly MIA at WCCFheck these days. And what happened to N7 Sphincter Elite? How's that jug band you've got with old N7 at the doublewide? They're missing you at the WCCFheck primate-house feces-slinging fest; you were the one with the most feces in flight, both day and night!
You are both wrong. I think chizow wouldn't be shy about using his name. A lot of his posts were hilarious, as were the extreme AMD fanboys'. May I assume you are one of those as well?
There are way more AMD fanboys on any site flinging more misinformation. How's your AMD premium VR at this moment? How are your frame times with those AMD cards?
There are lots of factors in what makes a card better than a simple FPS number. The majority of the discrete market gets this. If you are not part of that group, then I guess you probably don't get it either.
I'm not an AMD fan so much as an AMD technology fan: HBM, async compute, interposer and GPU technology. But AMD is just an F-ing dirty parts supplier to the OEMs, and AMD being down and out and innovating to stay alive is just great for the OEM PC/laptop market! Intel and Nvidia need to be brought down low like any other dirty parts-supplier scum! Intel and Nvidia have too much control and market share for any damn dirty suppliers of CPU/SOC/GPU/APU parts to have over any OEM markets!
IBM had the right idea back in the day: keep Intel in its dirty, filthy parts-supplier place by forcing Intel to cross-license the 16-bit x86 ISA with AMD and others at the dawn of the PC era. IBM knew how to keep a dirty hardware parts supplier in line and not get caught between its CPU parts supply and a hard place. IBM really messed up on the OS part of the PC equation, though, by contracting only with M$ and not sourcing its PC OS from two or more suppliers!
Most of the bog-standard gaming gits of today do not understand what a healthy supplier-market dynamic is, and just what a sycophant market the PC/laptop market has been for 30-odd years now, with that single dirty Intel SOC parts supplier in control of too much of the CPU/SOC supply to the PC/laptop OEMs; ditto for M$ in that respect. That's going to change somewhat with Zen, but the Justice Department is going to have to keep a close eye on that convicted monopoly market abuser Intel!
chizow is even below the bog-standard gaming git in IQ; he is a big monopoly fan and a feckless fool! Parts suppliers are not football teams; there is high technology involved. So props go out to the engineers and the PhDs, but the gaming gits/fanboys really are deplorables and a threat to democracy!
All parts suppliers are damned dirty and can never be trusted, and even the OEMs need to be carefully watched if they get too much market share!
Probably your small brain cannot figure out that:
In DX11, say that you need X resources from the GPU to get 60 FPS at 1080p.
In DX12, due to less overhead and a better API with fewer syscalls, let's say you get a 10% uplift and hit 66 FPS at 1080p.
The guy making the game now has 6 frames per second of headroom to spare; this means he can split that time among the following: more complex objects, more filters, more complex AI, greater resolution, or even more complex cryptography if we are talking about multiplayer games.
So, with a faster API you get better visuals, better AI, better everything. It is up to the developer where the extra time from the better API gets invested, in order to give you a better game.
What you see in the benchmarks is that, with the same visuals, you get more performance out of DX12. If a developer wanted to show you how DX12 can be better, he would make two versions of a game: one in DX11 and one in DX12 with a better renderer, AI, etc., running at the same FPS as the DX11 version.
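The argument above can be put in frame-time terms: at 60 FPS the budget is about 16.7 ms per frame, and a 10% API uplift to 66 FPS frees roughly 1.5 ms per frame that a developer could spend on extra work instead of extra framerate. A quick sketch:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given framerate."""
    return 1000.0 / fps

base = frame_budget_ms(60)    # ~16.67 ms per frame under DX11
faster = frame_budget_ms(66)  # ~15.15 ms if DX12 gives a 10% FPS uplift
spare = base - faster         # ~1.52 ms per frame freed for AI, effects, etc.
print(round(base, 2), round(faster, 2), round(spare, 2))
```

This is why identical-visuals benchmarks understate the case being made: the freed milliseconds only show up as higher FPS unless the developer reinvests them in a richer scene.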
Ideally that is what DX12 is supposed to do. Reality shows a different thing: both camps are usually getting lower frame rates in the DX12 version compared to the DX11 version.
DX12 requires more programming than DX11; it gives developers more control over video card functions, but that means the developer has to do more work and spend more time and money to make the DX12 version.
Maybe as programmers become more proficient and familiar with DX12 that may change, but it has been almost a year and a half with still little success.
Win10 is one of the biggest things holding DX12 back. If they allowed Win8.1/7 to have it, it might help adoption. After all, they made DX11 available on Vista when they said they wouldn't/couldn't do it.
Enjoy your DX12 version, because no matter what I say you are going to ignore it anyway. Closed minds are way worse than small minds.
Clear sign that Nvidia's marketing budget ran out before the end of the year. On the other hand, it is noticeable that AMD's marketing has been more active lately.
The close performance between the 480 and 1060 really gives reviewers a nice opportunity to earn a buck.