On one side of the ring is the RX 480, with 2304 Stream Processors, 32 ROPs and 144 Texture Units. In the opposite corner, with 1280 CUDA Cores, 48 ROPs and 80 Texture Units, is the GTX 1060. The two cards retail for between $200 and $250, depending on the features present on the card as well as any sales. [H]ard|OCP tested the two cards head to head, comparing not just raw performance numbers but also the stability of the GPU frequencies, power draw and temperatures. All games were tested at base clocks and at the highest stable overclock, and the results were back and forth: in some games AMD pulled ahead, while in others NVIDIA was the clear winner. It is worth keeping in mind that these tests do not include VR results.
"We take GIGABYTE’s Radeon RX 480 G1 GAMING video card and pit it against a MSI GeForce GTX 1060 GAMING X video card in today’s evaluation. We will overclock both video cards as high as possible and compare performance and find out what both video cards have to offer in the upper $200 price range for gaming."
Here are some more Graphics Card articles from around the web:
- Galax GTX 1070 EXOC Sniper Review @ OCC
- Zotac GeForce GTX 1050 @ Hardware Secrets
- Gigabyte GTX 1050 Ti G1 Gaming 4 GB @ techPowerUp
- MSI GTX 1050 Ti 4GB Gaming X 4G @ Kitguru
IF YOU CHOOSE THE SHITTY UGLY LOUD HOT AND UNRELIABLE AF, VASTLY INFERIOR IN EVERY WAY, AMD PRODUCT, YOURE A FUCKING REE-REE AND DESERVE ALL THE HASSLES HEADACHES RMAS AND BULLSHIT YOURE GONNA GET. ONLY NOOBS, AND UNEDUCATED WOULD MAKE A STUPID CHOICE, AND GET THE INFERIOR AF GPU. PERIOD.
^^ someone voted S3 Graphics.
I voted for ARM, too, and all I got was 3.5GB VRAM.
Make async compute great again!
I ACTUALLY VOTED FOR MCAFEE.
Damn you all! I am deleting these posts until Sweetcheeks McGlueSniffer gets bored and wanders off back to 4chan but this is too funny to remove.
You might notice what I am now doing to the ones that don't end up in the electrical wastebin, though.
I am going to go out on a limb and say you are making the font italic?
Giving that knucklehead airtime because you think it’s funny is only going to encourage him.
Get rid of the anon accounts and this all pretty much goes away.
McAfee was unironically the best. He was the only one who stood up to the NSA, knew anything about technology, and actually understood and cared about freedom. I’d have voted for him as well if I remembered how to spell his name
"I voted for Matrox, too, and all I got was 3.5GB VRAM." Best comment on Pcper!
Can't argue that … but I also have to change it.
What is a ree-ree by the way … apart from a horror movie sound?
Oh, oh… we not allowed the “T” word here?
Those of us who live in the great northern country of CANADA enjoy our wood-burning GPUs just fine; they may not have the best graphics, but they do just fine.
Have a nice day, EH?
This is not a political site, this is a technology site. You want to argue politics there are plenty of other places to go. Any political mentions will be modified.
From the way the chap was talking I’m guessing you just need to add “tard” on the end and you have your answer
Should have used the same version of both. No reason they couldn’t have gotten an MSI Gaming X 480.
That would be best, since there is a little scandal around the G1 Gaming RX 480 having half the cooling capacity compared to the GTX 1060 G1 Gaming. There is a v2 RX 480 G1 G.. though, that I haven’t looked at.
sigh, one day PCPer will wake up and get rid of that shared Anon account.
Yes they will, and one day there may not be a shared anon account, but not here and not now!
P.S. One day AMD will not be forced to rely on gamers to stay in business, as gamers are ruining the quality of graphics processing with their FPS obsession and their need for far too many ROPs relative to the number of shaders, at the expense of computational power. AMD appears to be making some very nice Radeon Pro WX SKUs for graphics work and accelerator work. The real revenues for AMD are with Zen and the server/HPC/workstation professional market: Zen plus Radeon Pro WX GPUs, and Zen/Vega for the pros.
I didn’t know that website was still working.
I AM THE SHITTY UGLY LOUD HOT AND UNRELIABLE AF, VASTLY INFERIOR IN EVERY WAY … PERIOD.
FUCKIN RIGHT ON MAN!!!!!! I AM ANGRY BECAUSE PEOPLE ARE DOING THINGS, NOT NECESSARILY THINGS I DISAGREE WITH BUT THINGS IN GENERAL. PEOPLE SHOULD BE PUNISHED FOR DOING THINGS. WHY IS THAT DOG LOOKING AT ME? FUCK THAT DOG. RAM-BUS SHOULD STILL BE AROUND. WHO DO I NEED TO TALK TO TOO GET SOME FUCKING TANG IN THIS COUNTRY. DEATH TO ALL SQUIRRELS!!!!!!!!!!!! SEA MAMMALS SHOULD GO BACK WHERE THEY CAME FROM!!!!!!
EXCETERA!!!!!!!!!!!!!!!!!!!!!!
Haha. Nice job man.
Nice to see articles highlight the almost non-existent performance difference between the cards that most people will be buying.
Been very happy with my Sapphire Nitro+ RX 480 so far.
Yeah, in frame rate they are similar. However, that 480 Nitro eats more electricity than a 1080 to get 1060-level performance. It consumes 82 watts more than a 1060 system, which is around 40% more.
http://www.kitguru.net/components/graphic-cards/zardon/sapphire-rx-480-nitro-oc-4gb-8gb-review/30/
Here’s Tom’s take on efficiency, 480 vs 1060.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1060-pascal,4679-7.html
Not even close. However you do have the best red team card so far.
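For what it’s worth, here is a quick back-of-the-envelope check of how those two figures fit together; the wattages below are implied from the quoted 82 W and ~40%, not taken from either review:

```python
# Rough sanity check of the "82 W more, which is around 40% more" claim.
# The wattages are implied from those two numbers, not measured values.
extra_watts = 82
extra_fraction = 0.40

gtx1060_system_w = extra_watts / extra_fraction    # implied ~205 W baseline system draw
rx480_system_w = gtx1060_system_w + extra_watts    # implied ~287 W with the 480 Nitro

print(f"Implied GTX 1060 system draw: {gtx1060_system_w:.0f} W")
print(f"Implied RX 480 system draw:   {rx480_system_w:.0f} W")
print(f"Extra draw: {extra_watts / gtx1060_system_w:.0%}")
```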
The RX 480 has way more single-precision FP FLOPS than the GTX 1060, and the RX 480 runs at lower clocks to get that higher SP FP FLOPS figure. So games alone cannot be used for any accurate GPU workload-efficiency numbers. There are other workloads besides gaming that get AMD’s GPUs sold, such as coin mining with any new algorithms that are not yet implemented in ASIC form! Miners will most likely go for AMD’s extra SP FP performance per dollar until the ASICs can be made.
Gaming may not make use (yet) of all that extra compute in AMD’s consumer SKUs, but the other use cases for AMD’s GPUs will take all that extra compute and use it.
So it’s easy to see why the RX 480 uses the power it does: it’s in that extra FP/async hardware, which is used by more than just games for some workloads! And as more Vulkan- and DX12-optimized games are released, those gaming benchmarks will have to be redone.
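To put a rough number on that FLOPS gap, here is a minimal sketch using the usual shaders × 2 × clock estimate, assuming reference boost clocks (board-partner cards run higher):

```python
# Back-of-the-envelope peak single-precision throughput:
# FLOPS ~= shader count * 2 ops per clock (FMA) * clock speed.
# Reference boost clocks are assumed here, not partner-card clocks.
cards = {
    "RX 480":   {"shaders": 2304, "boost_ghz": 1.266},
    "GTX 1060": {"shaders": 1280, "boost_ghz": 1.708},
}

for name, spec in cards.items():
    tflops = spec["shaders"] * 2 * spec["boost_ghz"] / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPS single-precision")

# Prints roughly 5.8 TFLOPS for the RX 480 vs ~4.4 TFLOPS for the GTX 1060,
# which is the compute gap the comment above is pointing at.
```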
Got a Nitro+ 480 8GB for $225 just now. Can’t complain.
Don’t get the hate on AMD when they have future-proofed their new cards really well.
AMD’s definition of future proofing doesn’t include VR obviously.
I think most of what you are talking about (“doesn’t include VR obviously”) steams from the common game engine used. I am in no way making excuses, but when a certain GPU manufacturer has its fingerprints in the engine, you can expect alternate GPU choices not to run as well.
Sorry: stems, not steams.
I assume you’re referring to Unreal Engine. Take a look at the VR leaderboard averaging all VR content rated so far by the site. It might open your eyes a little, or will you still call the source material biased?
http://m.hardocp.com/article/2016/11/11/amd_nvidia_gpu_vr_perf_please_state_your_name/6#.WCzttk5OlpU
However, even in games designed with AMD’s LiquidVR, they still aren’t faster than Nvidia.
http://m.hardocp.com/article/2016/10/21/amd_nvidia_gpu_vr_performance_serious_sam_vrtlh/4#.WCzsX05OlpU
Most of VR isn’t even using technology Nvidia built into their cards yet. When companies start supporting it, look for the gap to get wider.
And AMD’s fingerprints aren’t all over Microsoft’s DirectX 12 because of AMD graphics in Microsoft’s Xbox consoles? Yeah, LMAO.
The future still holds that for as long as there is a DX12 there will be DX11. It’s still needed for out-of-the-box multi-GPU support and is easier to code for. I’m sure you don’t need a reminder who is better in that API.
Please State Your Name – Unreal Engine 4
“However even in games designed with AMD’s liquidVR,”
http://m.hardocp.com/article/2016/10/21/amd_nvidia_gpu_vr_performance_serious_sam_vrtlh/4#.WC0W3eQo47J
IMO the average frame times don’t look that bad (no dropped frames). Plus the GTX 980 Ti and GTX 1070 are faster cards than the Fury X to start with. Close, but not apples to apples. Is this game DX11, DX12 or Vulkan?
“And AMD fingerprints aren’t all over Microsoft’s directx 12 because of AMD graphics in Microsoft’s Xbox consoles.”
So Microsoft should gimp their console with an NVidia-biased API?
GTX 10XX series cards seem to fare very well with AMD’s fingerprints. That said, DX12 merely “helps to” even out the overhead discrepancy between AMD and NVidia. It does not gimp NVidia. AFAIK there is no performance loss in DX12 for NVidia (barring unforeseen problems handling compute). Seems strange for someone to be doing damage control for the leader.
“Most of VR isn’t even using (their software) technology.” Unreal Engine 4 says hello.
“I’m sure you don’t need a reminder who is better in that API.” You’re right, I do not need a reminder of NVidia’s affiliation with Microsoft in DX11.
Got your facts a little wrong. DX11 was also created with AMD in mind, as they had the Xbox 360 console at the time. The only DX that favored Nvidia was DX9, because they had the original Xbox; it’s why Nvidia beats AMD so much in Blizzard games, which still use DX9. AMD had dominance in DX11, as Nvidia didn’t even have a card with tessellation out at release; AMD enjoyed many months of dominance before Nvidia put out a card that was fully compatible. Don’t hate Nvidia because they did DX11 better by designing better and AMD got lazy/cheap.
The AMD VR game is Serious Sam. That other link was to show VR leaderboard of all VR content up to this point.
You haven’t been around long or done any research. DX12 sometimes even hits AMD with negative performance as well, just not as much as Nvidia. DX12 is much harder to code for and, well, it often doesn’t work as well as one would think.
And yes, Pascal cards do well because Nvidia upped the compute greatly, since DX12 was going to utilize compute more, given that AMD cards do it well.
DirectX should be as vendor-agnostic as possible, supporting the features of both cards equally. I don’t worry, because Nvidia’s next generation of cards will do DX12 so well (if history repeats itself yet again) that they will have AMD begging Microsoft for DX13.
Seems strange a lot of AMD fanboys are mindless followers posting the same old tired sh*t about Nvidia. I am paid by no one and don’t have a leader. I post to inform or to counter misinformation because I want to.
Why do you keep moving the goal posts?
“Got your facts a little wrong. DX11 was also created with AMD in mind” Please correct me if I am wrong, but parts of DX11 (11.1, 11.2) were created with AMD “in mind”, yet they are not proprietary.
You are arguing two lines of thought that are at odds with each other.
1/ That AMD sucks because it does not do well with software/APIs/etc. biased towards NVidia.
2/ That it’s unfair to use software/APIs/etc. that favor AMD.
” Don’t hate NVidia because they did dx11 better by designing better and AMD got lazy/cheap.”
1/ My NVidia cards run in a Win 7 environment. My AMD cards run in a Win 10 environment. It is what works best for each. NO hate there. AMD moved on from TeraScale (microarchitecture); I do not think it was lazy. Mistimed, maybe.
You brought up consoles not me.
“The AMD VR game is Serious Sam. That other link was to show VR leaderboard of all VR content up to this point.”
Yes, I easily caught that inflection. But the majority of that list favors NVidia-leaning software solutions.
“You haven’t been around long or done any research. DX12 sometimes even hits AMD with negative performance as well, just not as much as Nvidia. DX12 is much harder to code for and, well, it often doesn’t work as well as one would think.”
If 6 years isn’t long, then no. But have I researched and observed, as best I can? Yes I have. But do you see me questioning your background? No; our opinions clash, though.
Note my words: “GTX 10XX series cards seem to fare very well with AMD’s fingerprints.”
Yes the learning curve with DX12 is steep.
“And yes, Pascal cards do well because Nvidia upped the compute greatly, since DX12 was going to utilize compute more, given that AMD cards do it well.”
Early on I caught lots of flack on compute threads for suggesting NVidia now works well with compute.
“DirectX should be as vendor-agnostic as possible, supporting the features of both cards equally. I don’t worry, because Nvidia’s next generation of cards will do DX12 so well (if history repeats itself yet again) that they will have AMD begging Microsoft for DX13.”
But it is not entirely agnostic. It already has some NVidia-only (perhaps too strong a word) adaptations: DX 12.1.
“Seems strange a lot of AMD fanboys are mindless followers posting the same old tired sh*t about Nvidia. I am paid by no one and don’t have a leader. I post to inform or to counter misinformation because I want to.”
I responded as I read and came across this last. No fanboy here so it kind of goes back to you.
If you did LMAO (fanboyish). Perhaps a good surgeon could reattach the lost part.
To continue: I’m walking away from this, as you’re drawing me down to your level. I’ll live happily with GPUs from both vendors.
Later dude
So it’s Nvidia’s problem that companies are making VR content that favors their cards, and not AMD’s, who are too busy pushing DX12? AMD is starting to add DirectX 12.1 support for rasterization, as they added it in the RX 480. A video card is more than just compute. It’s fine if that’s all you’re looking for.
Are you implying that Nvidia isn’t good in Win 10? It still has backwards compatibility with DX11 and DX9 as well.
It’s good that you have two systems to play whatever games are better for them.
Obviously you seem to favor AMD, with your more modern system being theirs and the tone of your comments. That’s OK. If you want to call me a fanboy, fine; IDC. I like and support their products, but I don’t lie or post falsehoods as to the capabilities, or lack thereof, of AMD’s. I’m not saying you did either.
A very nice GTX 1060 from Gigabyte got my green paper. 😀
And good luck using more than one GTX 1060 in SLI! And I’ll bet that Nvidia will do some more gimping at the driver level to make the 1060 unable to perform well with any DX12/Vulkan non-CF/SLI multi-GPU adapter mode in these graphics APIs as well. Nvidia wants you to pay to play, and pay you will for Nvidia’s overpriced kit!