What is old is new again
AMD is back with some more Polaris.
Trust me on this one – AMD is aware that launching the RX 500-series of graphics cards, including the RX 580 we are reviewing today, is an uphill battle. Besides battling the voices on the hills that whisper “reeebbrraannndd”, AMD needs to work with its board partners to offer total solutions that compete with NVIDIA’s stronghold on the majority of the market. Just putting out Radeon RX 580 and RX 570 cards with the same coolers and specs as the RX 400-series would be a recipe for ridicule. AMD knows this and is being surprisingly proactive in telling its story to consumers and the media.
- If you already own a Radeon RX 400-series card, the RX 500-series is not expected to be an upgrade path for you.
- The Radeon RX 500-series is NOT based on Vega. Polaris here everyone.
- Target users are those with Radeon R9 380 class cards and older – Polaris is still meant as an upgrade for that very large user base.
The story being told is more compelling than you might expect. With more than 500 million gamers using graphics cards that are two years old or older, based on Steam survey data, there is a HUGE audience that would benefit from an RX 580 graphics card upgrade. Older cards may lack support for FreeSync, HDR, higher refresh rate HDMI output and hardware encode/decode support for 4K content. And while the GeForce GTX 1060 family would also meet those criteria, AMD wants to make the case that the Radeon family is the way to go.
The Radeon RX 500-series is based on the same Polaris architecture as the RX 400-series, though AMD would tell us that the technology has been refined since the initial launch. More time with the 14nm FinFET process has given the fab facility, and AMD, opportunities to refine the design, so the new GPUs can scale to higher clocks than they could before (though not without the cost of additional power draw). AMD has also tweaked its multi-monitor efficiency modes, allowing idle power consumption to drop by a handful of watts thanks to an adjusted pixel clock.
Maybe the most substantial change with this RX 580 release is the removal of any kind of power consumption constraint on the board partners. The Radeon RX 480 launch was marred by issues surrounding the amount of power AMD claimed the boards would use compared to how much they actually DID use. This time around, all RX 580 graphics cards will ship with AT LEAST an 8-pin power connector, allowing overclocked models to use as much as 225 watts. Some cards will have an 8+6-pin configuration to go even higher. Considering the RX 480 launched with a supposed 150 watt TDP (that it never lived up to), that’s quite an increase.
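As a rough sketch of where that 225 watt ceiling comes from: the PCI Express slot itself is specified for 75 watts, a 6-pin connector for another 75 watts and an 8-pin connector for 150 watts. The connector limits below are the standard PCIe specification values, not AMD-published board figures.

```python
# Back-of-envelope PCIe power budget using the standard spec limits:
# x16 slot = 75 W, 6-pin connector = 75 W, 8-pin connector = 150 W.
PCIE_SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_ceiling(connectors):
    """Maximum in-spec board power for a card with the given aux connectors."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_ceiling(["8-pin"]))           # 225 W -- baseline RX 580 boards
print(board_power_ceiling(["8-pin", "6-pin"]))  # 300 W -- 8+6-pin partner cards
print(board_power_ceiling(["6-pin"]))           # 150 W -- the original RX 480 reference design
```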
AMD is hoping to convince gamers that Radeon Chill is a good solution for some specific instances of excessive power draw. Recent drivers have added support for games like League of Legends and DOTA 2, adding to The Witcher 3, Deus Ex: Mankind Divided and more. I will freely admit that while the technology behind Chill sounds impressive, I don’t yet have enough experience with it to confirm or dispute its claimed ability to cut power draw without sacrificing the user experience.
Finally, though we are focusing on the Radeon RX 580 8GB today, AMD is in fact launching an entire RX 500-series family. The Radeon RX 570, RX 560 and RX 550 will round out a collection of products that scales from $279 (overclocked RX 580 models) down to the $79 of the RX 550.
While we plan to have a review of the RX 570 on the site soon, we’ll gauge user interest in the RX 560 and RX 550 before getting any hardware in. The RX 550 looks particularly interesting as it is the only new GPU in the mix – a smaller chip designed specifically for IGP replacement.
The Radeon RX 580
Let’s talk about the Radeon RX 580 under the microscope today. It will be available in both 8GB and 4GB models, with the 4GB models starting at $199 and the 8GB models starting at $229. To be frank, based on the limited availability of RX 480 4GB options, I would expect similarly limited coverage of the 4GB RX 580. That model exists largely so AMD can claim a $199 graphics solution.
It’s also worth noting that though AMD does have partners building reference-esque RX 580 8GB cards at the $229 price point, I do not expect those to be the most common option. Instead, companies like ASUS and MSI are going to spend their allocation on their own custom coolers, added power draw and higher clocks. The MSI RX 580 Gaming X 8GB that AMD sent us for testing will have an MSRP of $245 – more in line with where I expect the RX 580 family to sit.
That will be an important distinction as we go to talk about the competing NVIDIA cards in the GTX 1060 family.
| | RX 580 | RX 480 | GTX 1060 |
|---|---|---|---|
| GPU | Polaris 20 | Polaris 10 | GP106 |
| GPU Cores | 2304 | 2304 | 1280 |
| Rated Clock | 1340 MHz | 1266 MHz | 1506 MHz Base / 1708 MHz Boost |
| Texture Units | 144 | 144 | 80 |
| ROP Units | 32 | 32 | 48 |
| Memory | 4GB / 8GB | 4GB / 8GB | 6GB |
| Memory Clock | 8000 MHz | 7000 MHz / 8000 MHz | 8000 MHz |
| Memory Interface | 256-bit | 256-bit | 192-bit |
| Memory Bandwidth | 256 GB/s | 224 GB/s / 256 GB/s | 192 GB/s |
| TDP | 185 watts | 150 watts | 120 watts |
| Peak Compute | 6.1 TFLOPS | 5.8 TFLOPS | 3.85 TFLOPS (Base) / 4.4 TFLOPS (Boost) |
| Transistor Count | 5.7B | 5.7B | 4.4B |
| Process Tech | 14nm | 14nm | 16nm |
| MSRP (current) | $199 (4GB) / $229 (8GB) | $199 | $249 |
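If you want to sanity check the peak compute and memory bandwidth rows, they fall out of two textbook formulas: FP32 throughput = stream processors × 2 (one fused multiply-add per clock) × clock speed, and bandwidth = effective memory clock × bus width ÷ 8. A quick sketch using numbers straight from the table above:

```python
def peak_tflops(shaders, clock_mhz):
    """FP32 peak: 2 FLOPs (one FMA) per shader per clock."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

def bandwidth_gbps(effective_mem_clock_mhz, bus_bits):
    """Effective memory transfer rate times bus width in bytes."""
    return effective_mem_clock_mhz * 1e6 * (bus_bits / 8) / 1e9

print(round(peak_tflops(2304, 1340), 1))  # 6.2  -> RX 580 (listed as 6.1)
print(round(peak_tflops(2304, 1266), 1))  # 5.8  -> RX 480
print(round(peak_tflops(1280, 1506), 2))  # 3.86 -> GTX 1060 at base clock
print(round(bandwidth_gbps(8000, 256)))   # 256 GB/s -> RX 580
```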
The RX 580 has the same number of compute units and stream processors, and the same size memory bus, as the Radeon RX 480. The only changes are the higher rated clock speeds and the higher starting TDP. On the subject of clock speeds, there is still a disconnect between how AMD and NVIDIA represent them to consumers. NVIDIA is very upfront that the “base clock” on its GPUs is a speed the silicon should never drop below in normal gaming scenarios. The “boost clock” is a typical clock speed you should see in real-world gaming and, in fact, in nearly all of our tested scenarios the NVIDIA partner cards and reference card exceed it.
AMD, on the contrary, rates its GPUs at a single “boost clock” that represents a more-or-less maximum clock speed you would see on the card in an ideal situation. AMD will freely admit that hitting or sustaining that clock speed is rare and that users will find their cards floating somewhere below it. With this particular release, AMD is also quoting a “base clock”, but it does not represent a minimum clock speed the GPU is guaranteed to run at. Instead, the base clock here is more of a “bad case typical” – it’s all very confusing, I admit. In general I think NVIDIA has the right methodology here to avoid confusion and complaints from consumers about what clocks are promised and what are delivered. Under-promise and over-deliver is a great mindset here.
The other significant change is in power consumption. I already mentioned that while the RX 480 claimed a TDP of 150 watts, in practice the card did not operate at it. AMD stretched things beyond spec, and reference cards would easily pull 180 watts without overclocking, all while requiring only a single 6-pin power connection on the card. For the Radeon RX 580 launch AMD is alleviating that concern by simply opening the floodgates. The reference TDP of the RX 580 is 185 watts, though you will see board partners going well beyond that; our tested MSI card pulls just over 200 watts. Thanks to the move to an 8-pin power connector we no longer have any system safety/stability concerns to deal with, but it does mean that any semblance of a performance-per-watt comparison to NVIDIA’s GTX 1060 should go out the window.
The Competitors
For our review today we are looking at a set of three overclocked cards: one RX 580, one RX 480 and one GeForce GTX 1060 from EVGA. I will be the first to tell you that 99 times out of 100 I would prefer to test reference models first, but for this launch none were made available. That means that, in order to be as fair as possible, all the cards in our set needed to be overclocked by about the same amount relative to their base models.
First on the block is the MSI Radeon RX 580 8GB Gaming X model with a $245 MSRP. This card has a rated clock speed of 1393 MHz, an increase over the 1340 MHz rated by AMD for the RX 580. Memory is also slightly overclocked at 8100 MHz, compared to 8000 MHz reference. This card uses MSI’s custom cooler design with a pair of fans that can run in a silent mode when the card is idling in Windows.
Next up is the ASUS RX 480 Strix 8GB card currently selling for $239 on Amazon.com. This card also uses a custom cooler and has an overclocked out of the box speed of 1330 MHz, an increase over the 1266 MHz rated clock.
Finally we have the EVGA GeForce GTX 1060 6GB SC model currently selling for $249 on Amazon.com. This card is also overclocked, with a base clock of 1607 MHz and a boost clock of 1835 MHz, up from the reference speeds of 1506 MHz and 1708 MHz. The EVGA card carries a slightly higher overclock than the AMD cards, but it is the lowest overclocked option in EVGA’s GTX 1060 6GB family, with a couple of options going well above it. And with pricing where it currently stands, these comparisons are spot-on.
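To put those relative factory overclocks in numbers, here is a quick sketch comparing each card’s shipping clock to its rated reference clock (base clocks only for the EVGA card). It backs up the point that the NVIDIA sample carries a slightly larger overclock than the two Radeons.

```python
# (card clock MHz, reference rated clock MHz) as quoted in this review
cards = {
    "MSI RX 580 Gaming X": (1393, 1340),
    "ASUS RX 480 Strix": (1330, 1266),
    "EVGA GTX 1060 SC": (1607, 1506),  # base clocks
}

for name, (oc, ref) in cards.items():
    print(f"{name}: +{(oc / ref - 1) * 100:.1f}%")
# MSI RX 580 Gaming X: +4.0%
# ASUS RX 480 Strix: +5.1%
# EVGA GTX 1060 SC: +6.7%
```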
Well I mean, at least the 580 seems like a decent deal compared to the 1060.
It's definitely not a BAD deal, but I was kind of hoping AMD would undercut the RX 480 pricing enough to put more pressure on the NVIDIA GTX 1060.
Once again I didn’t wait long enough to buy a new GPU. While the 1060 I have from EVGA is performing well enough, I could have saved a little money and gotten a card that runs a few percent better (albeit with higher power draw, but I digress).
Ahh the joys of PC building.
Very true, very true…
file:///C:/Users/blpri/Pictures/Saved%20Pictures/Unigine_Heaven_Benchmark_4.0_20180313_1929.html
That’s about what I expected from the 580. I was wondering if their new LPP or whatever would be a little more power efficient, but it seems like it just supports higher clock rates. My old PC has GTX 770s in SLI, which each pull up to 250W under load. Those are basically rebranded and overclocked GTX 680s. The more things change, the more they stay the same?
It’s a shame there are no Vulkan benchmarks though.
About game selection, it’s another case where you can pick the game tests so you can write the conclusion you want.
The worst offender is TechReport.
Contrast TechReport with HardwareCanucks.
One site used 3 games, most heavily favoring the GTX.
The other used 14 games, each at multiple resolutions…
TechReport lost all its credibility long ago 🙁
I'm not going to be overly critical of anyone, but testing takes time, especially if you do it correctly. For example, I still ONLY test GPUs with hardware-based capture systems – what we call Frame Rating and what NVIDIA calls FCAT. It's more work and takes longer, but gives us much more accurate results.
The only disappointing thing is when your testing finds no variation…then all the work proves that everything works fine. 🙂
Counting frame times does find problems with individual games better, but I still find myself hunting down the average framerate as an easy way to compare hardware.
I only pay close attention to the frame times if I happen to be interested in actually playing the game being benchmarked. Funnily enough, I don’t. These days, it’s mostly Overwatch, but everyone seems to think that its hardware requirements are too low to make a good benchmark. I do wonder why Rise of the Tomb Raider always has such low framerates, even though it’s a console game port. But then again, I don’t play that game, so the numbers don’t mean much to me. As long as half the games do a few % better, and the other half only do a few % worse, I figure they’re about equal. Still, if Vulkan ever catches on, the Radeon cards do put up some really good numbers in Doom. I wonder if that advantage will also carry over into Quake Champions.
I would like to know their relative strengths in compute though. It isn’t a good measure of gaming prowess, but if I ever get the urge to fold some proteins, I’d like to know if I should even bother. Judging by the TFLOPS, the Radeon cards should be pretty decent.
I still applaud the effort you and your team put in for Frame Rating. Concept, debugging, and getting it to work on multiple platforms must have been a huge task. Now you’ve moved on to actually measuring power draw! What other sites are doing that? Not many. That extra effort will always bring my page views and clicks.
i have to agree with you on this.
Well it looks like I might need to go team green next build. I was hoping for power consumption optimization but it does not look like it.
Hard choice being an AMD fan (Not a crazy fanboy)
Ryzen 1600 + Nvidia 1060 + 16GB Ram.
Humm, that’s 666. I guess it could be my Evil computer.
Unless Vega has some magical TDP for a cheap price!
Check this review. Also, I’m not sure the GTX 1060 will age that well.
And unless you are bitmining 24/7, at idle/normal load it seems both cards are equivalent.
By that I mean the extra 20 watts or so during gaming should be “invisible” in your power bill.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/75127-amd-rx-580-8gb-performance-review-20.html
I was hoping for a more power efficient card at a more considerable discount. Like 20W less power draw and $200 for the 8GB and $160 for the 4GB.
This is disappointing even for a rebrand.
I don’t understand why you care about 185w of power draw. You will save like $2.00 in electricity over the life of the card by saving 20w. As long as it has adequate cooling at an acceptable noise level, power consumption isn’t that important.
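For anyone who wants to run the numbers themselves, the savings scale with hours of use and electricity price. A rough sketch, assuming 3 hours of gaming a day at $0.12/kWh (both assumptions, not figures from the review or the comment above):

```python
def yearly_cost_usd(extra_watts, hours_per_day=3, price_per_kwh=0.12):
    """Annual electricity cost of an extra load, under the assumed usage."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(round(yearly_cost_usd(20), 2))  # ~2.63 USD/year for an extra 20 W
print(round(yearly_cost_usd(65), 2))  # ~8.54 USD/year for the 65 W TDP gap (185 W vs 120 W)
```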
For me the power draw matters because I want a quiet system.
Power = More Heat = Louder system.
This is completely false. As long as the cooling capacity of the card is equal to or greater than the TDP of the part, a quiet system can be achieved. Yes, there are practical limits to this – you aren’t going to have a 20 pound heatsink on a card or fans the size of a dinner plate – but there is no technical reason why a part with a TDP of <200W can't be quiet.
Anything can be quiet, but – given the same cooler designed for dissipating ~200W – a card with 80W lower TDP is going to be a whole lot quieter, especially if the PC is placed somewhere with limited airflow, as is the case with mine.
Are you serious?
More W means more heat in the room. While it is nice in Winter, it is not in Summer.
Also, given the fixed size of VGA cards (they can’t be as tall as some CPU heatsinks), higher power consumption roughly translates into higher fan RPMs and therefore more noise.
NVIDIA GTX 1060s consume so little power they often stop spinning their fans, even while playing games.
The RX 480/RX 580 do not.
http://www.anandtech.com/show/11278/amd-radeon-rx-580-rx-570-review/16
Well, I have to eat my hat for that.
Blower style cards are inherently noisier than open air designs.
Power = More Heat = Louder system.
This, I completely agree with. It is especially true for people who like to game in the living room and thus use tiny systems because they don’t want their living room looking like a student room. Because of this I’ve gone the NVIDIA way since Hawaii.
Now my eye is back on AMD because I really hope some TV manufacturer will make a FreeSync-compatible TV soon. My guess is that this will happen before there is a G-Sync TV, mainly because FreeSync will be available in the Neo and Scorpio.
This usage case (small form factor cases) is pretty much the only reasonable argument in my opinion for power consumption being a primary concern when buying a card (within reason).
I will be the first to admit that I am an NVIDIA user, but obviously I still want AMD to succeed and bring some competition to the table to push the market ahead. This is another disappointing product launch, though.
I pretty much mirror Lucidor’s comments.
It’s surprising how much game selection can influence the conclusion of a review. In other reviews, the RX 580 has an 8% lead over the 1060, and here it is the other way around, an 8% lead for the 1060.
It might be interesting to do a big write-up of all the game engines and their strong and weak points for different GPUs and CPUs.
Anyway, I like your review and test methodology; it is an example of how it should be done!
Thank you sir! It's always interesting to see how things vary.
Which other reviews show the opposite results? Curious.
I’ve read many reviews today and it’s very hard to compare across sites, as your testing, graphs and data are flat-out different (better) than all but maybe 1 other site, not to mention settings and resolution differences. Most sites are still stuck in the world of average and min FPS, unfortunately.
It's tough. There are some metrics and reporting that other outlets use that I would like to integrate but just haven't had time to do.
It’ll be very interesting to see how the 580 overclocks with the additional available power.
A 10% overclock could put it even with the 1060.
Agreed. I have some results I didn't have time to put in – more likely 3-4% overclocking headroom with our sample.
I don’t think it will overclock much; power consumption could go out of control.
The RX 580 already consumes around 100W more – almost double the power of the similarly performing GTX 1060.
3-4%! That sample was quite at its limit.
Seems unlikely, based on the TDP increase and relatively small clock increase, that it will overclock anywhere close to 10%. The 480 didn’t have a ton of headroom and this is basically just a factory OC that’s probably using up most of the additional headroom vs the 480.
personally would love to see a 550 review; brand new (to an extent) silicon is always interesting, and it could be interesting for a cheaper workstation – going from an older dGPU or an iGPU to it and seeing the delta.
just my thoughts
Yup! Those launch a bit later. Hopefully I'll get hands-on with one!
So nothing new; it is basically just a factory overclock masquerading under a new model number… AMD even calls it “Polaris 20” according to PcPer’s chart. Seems a bit overzealous for an 80 MHz OC.
I’m not even sure what AMD is trying to do with this refresh. The RX 560 and 550 are new, so I guess AMD took the opportunity to do a line-wide rebrand to help sell a bunch of low end chips? Is that worth the cynicism that’s inevitably generated by the irrelevant upgrades to the 480/470? I feel like this launch says something about where Vega is right now, because if it were me I would have held off on the Polaris re-brand until Vega dropped in order to leverage Vega hype across the brand. How long will AMD go without a high end card? Nvidia has been uncontested above $250 for a year now.
I think the answer to your question is that they were somewhat backed into a corner and had to rebrand, as it’s clear they don’t have the resources to launch a full top-to-bottom lineup of chips in a reasonable time frame anymore. If they didn’t create the 500-series brand they would have had 3 new chips (Vega 10, 11 and Polaris 12) in the same product naming family as 2 older chips (Polaris 10 and 11). Basically the best of bad options, with the RX 580 being pretty shameless while the RX 570 and RX 560 at least offer a bit more compelling value this time around.
Surprising that so many are critical of AMD’s rebranding of Polaris. Intel launched a “7th gen” CPU that was essentially a slightly overclocked 6th gen CPU. There were some people calling Intel out on the rebrand, but nowhere near to the degree people are doing with the RX 500 series.
I’m not saying I approve, but this just seems to be the norm now. Not sure why AMD has to be the company singled out the most.
No argument from me that the difference between Skylake and Kaby Lake is so minimal that it is effectively a re-brand. The difference is that stock performance increased for zero power increase. That tells you two things may be going on – they’re getting better bins, or they actually made a process change. The RX 580 gives you a small clock speed bump at a tangibly increased TDP… i.e. they just overclocked the card and called it a day.
Consider their relative positions as well. Intel is in a position of strength, and has been for a long time. Criticize them for not pushing the envelope, but unfortunately that’s what you get when R&D is expensive and competition is nonexistent. AMD is the clear underdog and has been for years now (speaking about GPUs now). They need a win, or at least something to claw back mind share, and this is what they give us? After a year of Pascal crushing the high end? They should have been primed to drop a big new card with the node change because everyone knew there would be a huge wave of people upgrading old 28nm cards. OK, so they led with Polaris because Vega on 14nm was a bit too risky, but it’s been a year now. How many sales have been lost to people who waited 6 months and gave up?
Having this crappy re-brand come before Vega feels like a blunder because it just highlights how slow they’ve been to bring out something on the high end. They should have launched the 560/550, gotten us ready for Vega launching under a 5xx moniker like NV did with the 750 Ti, then brought the 580/570 rebrands out at the same time. Makes me wonder if Vega is still so far out that that strategy would not be viable.
Well, I really see no point in buying an RX 480/RX 580 while the GTX 1060 6GB has a 5% performance advantage, consumes a lot less power, has a more sophisticated suite of drivers and costs roughly the same. And if you manage to find one of the newer 9Gbps GTX 1060s, that’s the jackpot.
It depends on which games you want to play. I’d read reviews that test the games you want to play, as this is a fairly small sample size and doesn’t include some very popular titles like BF1, the Total War series, Civilization 6, Deus Ex: Mankind Divided, etc.
Obviously every review will have a unique sample of games to test, but I would counter that the Total War games, and possibly Civ 6, don't count as "very popular titles."
Again, to each their own of course. The more data out there, the better.
So I understand they may be less popular, but it would be nice to see at least one strategy game in there. You hit a lot of other marks; something with a lot of small units, such as a late-game Civ benchmark, would be an interesting addition (I’d say Ashes, but I worry a bit that they put their thumb on the scale for AMD).
For that matter, some sort of compute/creative benchmark would be interesting, too.
http://store.steampowered.com/stats/
As I write this, Dirt Rally, Rise of the Tomb Raider and Hitman are not even in the top 100 titles currently being played on Steam. There are 4 Total War titles and 2 Civilization games on the list, though, so I’m not sure what data you are looking at. Also there isn’t a single game on the list from either EA or Ubisoft. I’m not saying that including more titles will change the overall conclusions, because I have no idea, but forming them from a sample of 6 is IMO a bit too small.
Point taken.
However, with Origin and Uplay out there, the metrics aren't covering all bases.
To the commenter above: I did.
Techpowerup tested 22 games while Anandtech tested only 9. Aside from 1 or maybe 2 titles, we always see the RX 480/RX 580 behind the GTX 1060 6GB – in GTA V, by a large margin. So nothing has changed since the launch of the RX 480, and I can make an educated guess that nothing will change in the future, either.
But I don’t really care about the exact size of the performance deficit (does it really matter if it is 4%, 7% or 9%? not for me), so I roughly translated it into a 5% deficit. I rounded it to avoid making AMD fans cry, and to make things easier.
What matters is that the RX 480/RX 580 don’t deliver the same FPS as the GTX 1060 6GB. And, as they cost the same amount of money (again, a $5 or $10 difference doesn’t really matter) while consuming a lot more power, I really see no point in choosing them over a shiny, brand new GTX 1060 6GB.
Haven’t seen any 9GB 1060s, but $750 will get you this 8GB 1060 – https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601205646%20600494828
Newegg got the product details wrong: it is a GTX1080.
I know this is a lot of work, Ryan, but is there any chance we can add more titles to the testing in the future? I think you made good choices of very popular titles, but it’s hard to say if this is a representative sample of the performance gamers will find off the beaten path, so to speak. Also, AFAIK Dirt Rally is the only title in your suite that didn’t have heavy AMD/NVIDIA involvement in development.
I honestly couldn't care less about who was "involved" in the games. To some degree it's good to know for anecdotal information, but we look at games that are popular, fit a niche (getting some first person, third person, racing, etc. games in there) and push the GPU a bit more than others.
But to your point, Hitman was definitely an AMD-centric title, as was Dirt Rally.
I was really hoping this iteration of 14nm could squeeze out a few more MHz without the TDP increase. I thought at the time that the RX 480 could have been hindered by process challenges that could be resolved in this v2 of the chip. Sadly, it seems Polaris 10 (20) is at its limits.
P.S. Just noticed – no more anonymous posts? (I like it)
It's still possible the process is partly responsible, but it just hasn't been refined enough for us to see any differences.
i appreciate the re-branding, even if it is only a small deal, as it helps me to know what I am buying
rather than guessing whether this is a revised 480 or the original when I go to buy, i like being able to know which revision i am buying, so when i get a 580 i know what i am getting
so much more civil and pleasant without the anonymous posts
thanks so much
You make a good point as to product differentiation. While products based on the same core can be dismissed as 're-brands', it is also fair to say that providing a clear differentiation to the end-user is valid. I think the question becomes whether introduction of a new series (5xx from 4xx) is required for faster SKUs, or would a variant product name (RX 480X or RX 485 for example) be enough?
sebastian, even V2 would have been sufficient
i recall mobos used to have version numbers, which was essential to know
but i don’t fault them at all as i believe lisa and raja are about the most humble and honest and brilliant people amd has had running the place for quite some time, and i don’t get that they are being deliberately deceptive with the naming scheme
it’s just fuckin numbers anyway, but i guess the community takes it all very seriously
lg and samsung release a new tv each year, usually with only marginal changes from the year before, but they refer to the models by year, as you know, and I certainly appreciate knowing that
it is reasonable for the change to 500 to distance it from all the bad publicity the 480 got with the power distribution issue, don’t you think?
i just appreciate that companies like amd and intel and nvidia and the rest make such cool shit for me to enjoy
i am so fucking lucky as far as that goes, so they can call the shit whatever they want as long as i know what i am getting
i recently decided to try an induction cooktop, single burner, because i hate using the fancy electric range in my very high end apt in austin, and once again, all i can feel is how crazy cool it is that I can own such awesome tech, and for only $125 or so
that goes for all my stereo equipment and my japanese ceramic knife, which is my favorite tech of all
so with all this amazing goodness which i don’t deserve, how can i fault these folks for some numbers
The big question really at this point is whether this card is a valid upgrade path. For example, I currently have an R9 Fury from Sapphire which I got on sale at a great price point. Assuming I wish to stay with AMD, is the 580 an option for an upgrade? From the power usage side of things one would say yes; it’s hard to say about performance.
Again, great article, and I truly enjoy the site!
If you didn’t think the 480 was a worthy upgrade, then the 580 isn’t either. If you want a GPU that occupies the same place in the product stack that Fury occupied when it was initially released, you can’t get one from AMD and should wait for Vega if you feel compelled to stay AMD
Yeah, in your case, if you aren't considering NVIDIA, you should wait.
After looking into many reviews of this Polaris refresh, I’ve gotta say I’m extremely disappointed with AMD for this one. I fully understand the market they are chasing with this refresh/rehash of GPUs, but I scratch my head at re-brands 99% of the time; I sorta find these cards somewhat misleading. Take an RX 480 and compare it to an RX 580 – they are pretty much the same thing with a slightly higher base clock on the core. At a slightly lower price point it is a nice upgrade for someone still using a ~380X-class GPU (AMD or NVIDIA). Perhaps I’m disappointed because these GPUs wouldn’t be a worthy upgrade for me… I suppose the point I’m trying to make is: please do not release a re-brand (literally) and instead focus on the good stuff, aka Vega.
Awesome review as always and have been a long time reader. So GG Mr. Shrout.
Thanks!
I am 100% in agreement that had this refresh come with a $30-40 price cut, they would sell a lot more of them and possibly cut into the market share of NVIDIA's 1060 product line.
I would wager a guess that with a $30-$40 price cut they would be selling these cards at basically break-even. At <20% market share, that is something you might have to do over the short term to stay relevant or die. At ~30% market share, sustainable profitability becomes the driving factor.
No overclocking? I'm trying to get a feel for whether 1500 MHz will be close to the average OC.
Radeon Chill can drastically help with power usage, but I get that many people probably don’t care, or don’t play a game it supports.
As far as the 580 vs 1060 goes, I’d probably still stick with the 580 for the bit of extra VRAM, but mostly for the FreeSync cash savings.
Although don’t a majority of the games you tested here lean towards NVIDIA cards in general?
Following that, it's sad to not see DOOM Vulkan – a 480/580 gets near a 1070 in that game. DX12/Vulkan needs to get here sooner if they can all be that well optimized.
While it is great that you have the gear to measure the power draw of the card precisely, I would still like at-the-wall measurements. NVIDIA’s driver puts a much heavier load on the CPU, so you aren’t getting quite the full story just measuring the power consumption of the card. I don’t care too much about load power anyway, and it isn’t that hard to get some at-the-wall measurements. I care a little bit more about idle power, but electricity is cheap. I replaced three 75 watt incandescent bulbs in my kitchen with LED bulbs at 15 watts each, so I saved 180 watts on lights that are on for hours every day. I didn’t notice any meaningful difference in my electricity bill.
Apparently the MSI comes “over-volted”.
It can actually run at 1393 MHz at lower voltages, pulling less power!
https://youtu.be/MQ9ro5pwfXY
Also, it would be nice to test the benefits of Chill.
Let’s assume that a gamer with a FreeSync monitor with a 75Hz refresh rate doesn’t gain anything from a card pushing 100 FPS instead of 80 FPS. Assuming that the play experience would be the same with Chill locked at 75 FPS, how much would Chill reduce the consumption?
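The review doesn’t include Chill measurements, so any answer here is speculative. As a very rough upper bound, if you assume GPU power scales in proportion to the frames it renders (a big simplification that overstates the savings, since idle overheads don’t scale down), a 75 FPS cap on a scene that would otherwise run at 100 FPS looks like this:

```python
def capped_power_estimate(uncapped_watts, uncapped_fps, capped_fps):
    """Crude estimate: assumes board power scales linearly with rendered frame rate."""
    return uncapped_watts * min(capped_fps / uncapped_fps, 1.0)

# Hypothetical numbers for illustration only -- not measured in this review.
print(capped_power_estimate(200, 100, 75))  # 150.0 -> at best ~25% lower board power
```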
Ryan & Jeremy,
I just got my tower back from the shop with an RX 580 8GB installed, along with 2 10TB WD Golds and a complete checkup. It is a Sandy Bridge with Cougar chipset, i7 2600K CPU, Intel motherboard, never overclocked, 32GB memory and 2 256GB 850 EVOs for OS partitions 3 and 4. The extra one is for a foray into BSD/TrueOS. The other 6 are for a custom system recovery/boot master, Main: W 8.1 Pro, Test: W 8.1 Pro, WIP Fast: W 10.0 Pro latest build and WIP Preview up to date 1703, all x64. The PC is fast and unencumbered by post-Broadwell shenanigans or UEFI/Smart Boot.
I really don’t like playing computer games; if anything I prefer to watch others play. But I do love trains, so I am into Dovetail’s Train Simulators, which is where the GPU upgrade became necessary. The Radeon 6880 was just not cutting it anymore.
Have I missed something? I watch you and Jeremy and the boys on my 55″ HDR Sony TV every week and read your blogs, so I was shocked to see the lack of W 8.1 x64 drivers. I see W 7 and W 10, which I have justified over W 7. I do not recall anything mentioned about this limitation. WTF?
Is there any guidance you or the boys can give me in this matter? I am still setting things up, but I have managed to get the April drivers in, in place of the MS generic ones, which was a bear to accomplish compared to yesteryear! I tried out the 2 DT sims very briefly last night and there was a phenomenal difference in the graphics.
Best Regards,
Crysta
Do you mean the graphics card driver? They can be found here http://support.amd.com/en-us/download
Hi Jeremy,
Did you look at that page closely? I have been looking there and on MSI’s site; they are both the same:
Radeon™ RX 500 Series
Windows 10 (64-bit)
Windows 7 (64-bit)
RHEL / Ubuntu
Latest Windows Optional Driver
No W 8.1 x64!!!
Thanks for responding however,
I also gather, this is the first you have heard of this conundrum too.
Best Regards,
Crysta