Power Consumption and Conclusions
The Radeon R9 285 card we used in our testing ran 19 watts lower than our tested Radeon R9 280 card. Keep in mind that because ALL of these cards (including the R9 280X and R9 280) are overclocked slightly out of the box, they might have had their voltages adjusted as well. AMD did NOT ship reference R9 280 boards, nor did it send out reference R9 285 cards. As a result, we aren't 100% sure these results would be the same on reference platforms, but this is how it shakes out in the real world.
Also note that the R9 280 uses a few more watts than the R9 280X – we repeated this test about five times to verify the result. Even so, seeing our Sapphire Radeon R9 285 use noticeably less power while performing *nearly* the same as the R9 280 is a good signal for AMD's GCN architecture.
Pricing and Availability
AMD claims the Radeon R9 285 will be available today at all the normal retail and e-tail establishments you usually buy hardware from. In fact, a couple of models are already available on Amazon.com, both from Sapphire.
- Radeon R9 285 2GB – $249
- Radeon R9 280 3GB (EOL sales) – $219
- Radeon R9 280X 3GB – $289
- GeForce GTX 760 2GB – $244
If you already have a single Radeon R9 280, now might be the perfect time to start looking for a second card for CrossFire action, if you are interested in that kind of thing. Once they are gone, you are going to be out of luck. The R9 280X would be a compelling option with its added performance and larger memory size if we didn't already know, in our heart of hearts, that it is on the way out the back door as well, to make room for a FULL Tonga GPU in the not-too-distant future. The GeForce GTX 760 has a lot of great features and lower power consumption, but in terms of raw performance per dollar, it can't hold up to any of these three options from AMD.
Never Settle: Space Edition Update
The AMD Never Settle bundle program continues to roll along and with the release of the Radeon R9 285 the company has improved things yet again. Added to the list of available games that Radeon buyers can select are Alien: Isolation, a Star Citizen Module pack, Habitat and a host of other older titles. Buyers of the R9 285 will qualify for the Gold Reward level, allowing selection of three of the titles.
Just a note: the Star Citizen inclusion here is centered on Arena Commander and Murray Cup Race Series modules. We aren't talking about the full game, which has a yet-to-be-determined release time frame anyway.
NVIDIA continues to battle in the bundle department, recently adding the new Borderlands: The Pre-Sequel to its quiver.
Final Thoughts
AMD's Radeon R9 285 is an evolutionary design in a few ways. It is a further enhancement of the GCN architecture, even compared to the design of the Radeon R9 290 and 290X, with improved 4K video support. AMD was able to tweak the ISA and improve compute efficiency with the Tonga chip. It's evolutionary for the product line as well: support for XDMA CrossFire, TrueAudio and FreeSync technologies gives the R9 285 a much more formidable feature set than the Radeon R9 280.
Some users might see it as a step back with the return to a 256-bit memory bus width and 2GB of frame buffer. Those changes likely force some small performance penalties on the GPU itself, but the truth is the Radeon R9 285 performs basically identically to the R9 280 it is replacing. Yes, it loses in a couple of games (BF4 and Crysis 3), but it wins in a few as well (BioShock Infinite and Metro: Last Light), making that aspect a draw.
We will likely see several other GPU replacements coming down the pipe as 2014 progresses, and though the Radeon R9 285 is a fine graphics card, it just doesn't do anything exciting or revolutionary. We don't have a big price drop (the R9 280 was already selling for $250 or lower before this release); we don't see a big power consumption drop (though there is a small one); we don't see dramatically higher performance. The only weight behind the release comes in the form of those features that were missing from the R9 280-series GPUs, like TrueAudio and FreeSync. TrueAudio will appeal to some gamers, but others just won't care about audio as much as they do graphics. The FreeSync support is great and probably the most important "feature" of this particular launch.
As unexcited as this conclusion might sound, the Radeon R9 285 2GB still looks to be the best graphics card you can buy for $250. But AMD already held that crown, making this seem more like a maintenance release than a high-impact technology launch. If you value the technology of Mantle, TrueAudio or FreeSync and are in the market for a graphics card in the $210-280 budget range, then this is the one for you.
Best Performer at $250








Ryan – is the 250W TDP listed in the table on page one for the 280 correct? I thought it was 200W?
As soon as the Asus Strix version of this hits, I think I’m finally upgrading.
Ooops, yep, you’re right!
A few days back I was really confused about the power consumption of the 280, with some mentioning 250W and others 200W. From a little searching, the 280 is indeed 250W, the 7950 Boost was 225W, and the first 7950 was 200W.
Maybe have another look about these numbers?
AMD’s slide
http://cdn.videocardz.com/1/2014/03/AMD-Radeon-R9-280-4.jpg
Also in the same table, the values listed for "Peak Compute" seem potentially misleading. The quoted values for Tahiti GPUs are for the minimum/base core clock speeds, not the "Rated Clock", boost, "up to", max, or whatever you want to call those clock speeds. I would question using "Peak" to classify these values. Do you happen to know if the compute value for the R9 285 is also at its base/minimum clock speed? If so, what is its minimum clock speed?
If we interpret “Peak Compute” to be theoretical single-precision FLOPS at max/”up to” clocks, then a Tahiti PRO, aka. 7950/R9 280, @827MHz = 2.964 TFLOPS, but @933MHz = 3.344 TFLOPS. Similarly, a Tahiti XT, aka. 7970/R9 280X @850 MHz = 3.481 TFLOPS, but @1000MHz = 4.096 TFLOPS.
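For reference, the arithmetic above follows from the standard peak-compute formula: 2 FLOPs (one fused multiply-add) per stream processor per cycle, using the published shader counts (1792 for Tahiti PRO, 2048 for Tahiti XT). A minimal sketch (the helper name is just for illustration):

```python
def peak_sp_tflops(stream_processors, clock_mhz):
    """Theoretical peak single-precision throughput in TFLOPS:
    2 FLOPs (one fused multiply-add) per shader per cycle."""
    return 2 * stream_processors * clock_mhz * 1e6 / 1e12

# Tahiti PRO (7950 / R9 280): 1792 shaders
print(round(peak_sp_tflops(1792, 827), 3))   # base clock  -> 2.964
print(round(peak_sp_tflops(1792, 933), 3))   # boost clock -> 3.344
# Tahiti XT (7970 / R9 280X): 2048 shaders
print(round(peak_sp_tflops(2048, 1000), 3))  # boost clock -> 4.096
```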
This points to a bit of detail that is not usually provided in GPU reviews even after the whole R9 290X/290 launch drama over thermal/power throttling, UBER mode, etc. This was a real problem with my Sapphire 7950 Boost card too. Do you know if the R9 285 was throttling during testing or if it was able to maintain boost clocks throughout? Do you know what voltages are being used?
My 7950 Boost BIOS pumped 1.25V into the GPU in boost mode which caused obvious clock speed throttling under stock settings. Maxing out Power Tune cured most of the throttling but it still ran hot due to the (excessive) voltage. Switching to a non-Boost BIOS allows me to specify a lower 1.169V and run 150MHz over stock (1075MHz) without any throttling. This also lowers max temperatures and noise.
http://www.techpowerup.com/reviews/Sapphire/R9_285_Dual-X_OC/26.html
Tonga is very inefficient.
Is it? According to GURU3D, it consumes 52 Watts less than the 280x despite being almost as fast.
http://www.guru3d.com/articles_pages/amd_radeon_r9_285_review,5.html
This is the problem with power consumption numbers. Depending on the game and HOW you measure the power draw, the variance in results can be HUGE.
I think AMD model numbers and re-branding are as much to blame as anything for the confusion. The R9 285 (190W TDP spec) performs on par with (and is meant to replace) the R9 280 (250W TDP spec). A 60W drop in TDP would appear to be a vast improvement in efficiency, but reviews seem to imply that the difference at the wall is much smaller (~20W) once you account for OC vs stock clock versions.
In general, TDP is proportional to electrical power used, not equal to it. Could this difference in TDP spec compared to power at the wall imply that Tonga generates proportionately less waste heat rather than consumes less electrical power compared to Tahiti?
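A quick back-of-the-envelope check of that gap (the 20 W figure is the approximate wall-power difference mentioned above, not a spec):

```python
# Spec'd TDPs (AMD) vs. the approximate wall-power difference seen in reviews.
TDP_R9_280 = 250          # W, AMD spec
TDP_R9_285 = 190          # W, AMD spec
MEASURED_WALL_DELTA = 20  # W, approximate, varies by review and workload

tdp_delta = TDP_R9_280 - TDP_R9_285         # 60 W on paper
realized = MEASURED_WALL_DELTA / tdp_delta  # fraction of the spec'd gap seen at the wall
print(f"TDP delta: {tdp_delta} W, realized at the wall: {realized:.0%}")
```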
Few review sites have comparable numbers for all incarnations of Tahiti-based products. For example, there is a noticeable difference in power usage between a 7950, a 7950 Boost, and an R9 280 due to voltages and clock speeds. At least they are all based on the same silicon with the same amount of VRAM so power comparisons are easier to make. Now add Tonga to the mix which has a similar die size, 700M more transistors, 33% smaller memory system, and requires lower voltages to achieve higher clock speeds from what I have seen. There are many variables. Some should reduce power needs and some increase power needs.
AMD seemed in no hurry to release the R9 280 in the first place. With Tonga, AMD is essentially back-filling its product stack to bring feature parity (FreeSync, XDMA, TrueAudio, etc.) to this price point. It is disingenuous for AMD to promote these features and then keep re-branding old GPUs that don't support them. Being stuck at 28nm for this long probably forced some awkward/bad release cycles. Tonga feels like a careful balance of compromises which are unfortunately late to the party. At least it provides a way to fill some potholes and test out additional tweaks to the GCN architecture.
Tom's Hardware reports around 178W in typical gaming, and it stays below 200W when overclocked.
Why isn't DP 1.2a support listed???
DisplayPort 1.2a was the thing for AMD's Adaptive-Sync, which NVIDIA won't support.
While it might be slightly more power efficient, it’s nowhere near what Nvidia achieved with the 750 Ti. Granted these are two different performance classes so it remains to be seen if Nvidia will achieve the same in a performance class card.
The bandwidth efficiency and tessellation performance increase is nice to see, and promising for future cards. But if AMD can’t improve power efficiency I don’t think that’s going to help much.
That said, Nvidia’s offerings at this price point suck pretty bad, so I guess 285 wins by default. I say get a 280 while there’s still stock left. I’m guessing they’ll also bring out the full Tonga 285X soon that’ll beat the 280X in a similar way.
Unless you are dying for a GPU now, it's likely best to wait for NVIDIA's new cards to come out and see what performance comes out of them before making a choice. Worst case, it drives prices down some.
3 years later and we are still getting the same 28nm parts. Meanwhile Intel has stopped increasing in performance because AMD can offer nothing anymore. This has got to be the worst stretch of stagnation for the PC hardware world I can remember…
It is true that Intel and AMD have stopped reducing chip sizes, but this is mostly due to the fact that around 8-10 nm quantum tunneling occurs and chips become more and more inefficient. Until they come up with another material like graphene to replace silicon, we will not see chips much smaller than 10-15 nm. That being said, AMD still has some way to go before 18nm. The main reason they have not done so yet is that AMD chips (GPU and CPU) are underrated by most people, meaning they don't have as much funding for R&D, and NVIDIA as well as Intel don't have a reason to push R&D while AMD is still trying to catch up. (AMD made 83 mil last year in the CPU and GPU market; Intel made 10 bil.) Though NVIDIA is planning an 18nm architecture for 2015-16 ish, this being Maxwell. IBM is doing R&D right now for graphene chips, although it seems a far way off, 2020 maybe.
sorry for all the shitty grammar.
intel is working on smaller die’s AMD well not so much they rely on big fabs to do it.
Dude, double patterning is hard; give them time. It is not a conspiracy, it is just that hard to manufacture at that level when key technologies such as EUV lithography are not available.
Even Intel will hit the gigabucks process-node wall before the laws of physics put an end to Moore's law/observation. It is costing almost geometrically more with each new process-node shrink, so the R&D cost curve, versus the ability to amortize those costs over total units sold, is rapidly approaching the untenable, even for Intel's big wallet. Why do you think Intel is slowing the tick-tock? Well, that and some lack of competition in the x86 desktop market. Going below 14nm without a large enough market for its x86-based parts in mobile makes those amortization curves say no; so do Intel's internal bean counters/quants in the accounting division and the big institutional investors that hold Intel's stock (investors accustomed to high dividends at the expense of R&D, who will otherwise leave for greener pastures).
Intel is coping with an x86-based ISA ecosystem that has hardly any share of the mobile devices market, and licensed Power8s are coming in the server market. Watch Intel's share price once Google announces a firm commitment to the Power8 systems it is currently evaluating, with ARM server SKUs also beginning to arrive and compete with Intel's Avoton (discontinued/rebranded). Intel is facing a loss of market domination (more so because of the different ISAs used in mobile, plus future high-power non-x86 competition coming to the server room along with densely packed low-power ARM-based server SKUs), and the x86-based unit sales that allowed Intel to spend its way to process-node leadership are now stagnating; see the empty chip fab buildings.
Intel's x86 still dominates the PC/laptop market, but the ISA that will continue to dominate the netbook/Chromebook and tablet market is non-x86 and comes with better graphics. Apple, Nvidia, and soon AMD will be introducing more powerful custom wide-order ARMv8 SoCs with better graphics; the Tegra K1's graphics is currently the leading example of why Intel will not be able to sell enough x86 parts to make the investments below 14nm pay off as quickly as 22nm, or 32nm before that. The entire silicon economy is rearranging around dedicated foundries providing fab services for the whole industry, and Intel is one of the last CPU makers still holding its own fab capacity, while the affordable process-node lead is drying up fast without the total unit sales to keep those expensive chip fabs (uber expensive at 14nm and below) running at full capacity. The GPU makers will get the most out of 28nm before they go to 20nm, as the costs of going below 20nm will have to be shared by more companies than just AMD or Nvidia to be cost effective. Look for more FinFET and die stacking, and less shrinking, going forward.
zzzzZZZZzzzZZZzzzzzzzZZZZZZZzzzzz
missing old Anand
bzzzzZZZ DIE AMD just die frrrrrfrrrr
AMD will never die; they will just go three ISAs: ARM, x86, and Power8 (Rory still has that IBM time under his belt). And SeaMicro (AMD-owned) sells Xeon server SKUs to go along with the Opteron (ARM- and x86-based), and SeaMicro will be selling Power8-based systems too, once those licensed Power8s start hitting the market. Most likely AMD will license the Power8, like ARM, and profit more than three ways. AMD's SeaMicro selling Xeon-based kit! Business is funny that way, when that's what the customer wants and there's money to be made.
Thanks for the review. I was just looking for a card to upgrade from my old HD 7770, and this seems like a reasonable choice, good value for the money. But I'm a bit concerned about the somewhat poor performance of this card with Mantle that I've seen in other reviews; I hope it is only a driver issue that will be resolved. Have you done any testing with Mantle?
Wait for Nvidia Maxwell GTX 960 in October.
Well, rumors say the end of this month, if that holds true.
http://www.fudzilla.com/home/item/35657-amd%E2%80%99s-freesync-only-for-new-gpus
Ryan Shrout
You do not know how to ask the right questions
” AMD said that the only the newest GPU silicon from AMD will support FreeSync displays ”
” the Hawaii GPU that drives the Radeon R9 290 and 290X will be compatible with FreeSync monitors ”
What does "compatible" mean???
http://support.amd.com/en-us/search/faq/219
All AMD Radeon™ graphics cards in the AMD Radeon™ HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes. The AMD Radeon™ R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.
AMD APUs codenamed “Kaveri,” “Kabini,” “Temash,” “Beema” and “Mullins” also feature the necessary hardware capabilities to enable dynamic refresh rates for video playback, gaming and power-saving purposes. All products must be connected to a display that supports DisplayPort Adaptive-Sync.
Tonga will support it too.
I asked AMD.
Customer Service says it does not support DP 1.2a.
The video card can connect to a screen over DP 1.2a,
but it does not work with the full standard.
What does "compatible" mean???
Sitting on a chair is compatible.
What Lenovo laptop is Ryan using?
Intel makes approx 60% margin on chips; why eat into that to salve a few noddy enthusiasts worth at most 100m in sales?
They have to fill the coffers to pay big fines.
Are you guys gonna test this card in crossfire?
Ryan, don't forget that, according to AMD, the R9 280 will have DirectX 12 support and more hardware support for future updates. AMD was implying something more was coming during the interview with ORIGIN PC and AMD.
Direct X what???
Just sayin…..
Direct X?? lol haha *cough *ahem ahhhhh lol
Edit: I'm happy that devs are finally gonna START using DX11. Thanks M$, Sony, AMD, x86, and the entire console ecosystem as a whole lol. Preciate the love… FINALLY
Sorry, the R9 285 will have DirectX 12 support, and more goodies are coming.
FreeSync…the Sync that Sunk
Ryan,
I was wondering if you are thinking of doing a review of the AMD R9 285 with two cards in XDMA mode,
to compare similar cards in CrossFire and NVIDIA SLI, when you get more samples of the R9 285.
And whether there are going to be any monitors with FreeSync, or at least any rumors of them.
Also, I've seen a few photos of a new AMD socket & CPU at 5.233 GHz with 6 cores due next year:
6 MB L2 and 24 MB L3 cache, rated at 190 W with thermals at 24 C, and it is not a Piledriver.
Is this just a rumor or is it in the works?
What should I buy: the R9 270X at $169.99 or the R9 285 at $249.99?
I play Diablo 3 and I am upgrading from an HD 5770 🙂
Please help.
Thanks
My R9 285 is constantly dropping GPU core clock and GPU usage when gaming or testing; is this normal?