A Powerful Architecture
AMD has done the impossible and put a pair of Hawaii GPUs on a single PCB to create the most powerful video card we have ever tested.
AMD likes to toot its own horn. Just take a look at the not-so-subtle marketing buildup to the Radeon R9 295X2 dual-Hawaii graphics card, released today. I had photos of me shipped to…me…overnight. My hotel room at GDC was also given a package which included a pair of small Pringles cans (chips) and a bottle of volcanic water. You may have also seen photos posted of a mysterious briefcase with its side stickered with the silhouette of a Radeon add-in board.
This tooting is not without some validity, though. The Radeon R9 295X2 is easily the fastest graphics card we have ever tested, and that says a lot given the last 24 months of hardware releases. It’s big, it comes with an integrated water cooler, and it requires some pretty damn specific power supply specifications. But AMD did not compromise on the R9 295X2 and, for that, I am sure many enthusiasts will be elated. Get your wallets ready, though: this puppy will run you $1499.
Both AMD and NVIDIA have a history of producing high-quality dual-GPU graphics cards late in a product life cycle. The most recent entry from AMD was the Radeon HD 7990, a pair of Tahiti GPUs on a single PCB with a triple-fan cooler. While a solid-performing card, it was released at a time when AMD CrossFire technology was well behind the curve and, as a result, real-world performance suffered considerably. By the time the drivers and ecosystem were fixed, the HD 7990 was more or less on the way out. It was also notorious for some intermittent, but severe, overheating issues, documented by Tom’s Hardware in one of the most harshly titled articles I’ve ever read. (Hey, Game of Thrones started again this week!)
The Hawaii GPU, first revealed back in September and selling today under the guise of the R9 290X and R9 290 products, is even more power hungry than Tahiti. Many in the industry doubted that AMD would ever release a dual-GPU product based on Hawaii, as the power and thermal requirements would simply be too high. AMD has worked around many of these issues with a custom water cooler and by placing specific power supply requirements on buyers, all without compromising on performance. This is the real McCoy.
At the heart of the AMD Radeon R9 295X2 is a pair of full-specification Hawaii GPUs. These are not neutered in any way; each chip includes the same shader count as an R9 290X. With its two GPUs, this card pushes 5,632 shaders to a theoretical peak performance of 11.5 TFLOPS. A combined 12.4 billion transistors is quite impressive.
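If you want to sanity check that peak number yourself, it falls straight out of the shader count: each stream processor can retire one fused multiply-add (two floating point operations) per clock. A quick sketch, assuming the card holds its full “up to” clock:

```python
# Back-of-the-envelope peak compute for the R9 295X2 (both GPUs combined).
shaders = 5632            # 2 x 2816 stream processors
flops_per_clock = 2       # one fused multiply-add = 2 FLOPs per shader per clock
boost_clock_ghz = 1.020   # "up to" clock; sustained clocks can sit slightly lower

peak_tflops = shaders * flops_per_clock * boost_clock_ghz / 1000
print(f"Theoretical peak: {peak_tflops:.2f} TFLOPS")  # ~11.49 TFLOPS
```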
Each GPU has access to 4GB of GDDR5 memory and is connected to a 512-bit memory bus running at 5.0 GHz; again, this is the same memory clock as the single GPU card.
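For those keeping score, that works out to the same per-GPU memory bandwidth as the R9 290X. A quick sketch of the math (5.0 GHz here is the effective GDDR5 data rate):

```python
# Per-GPU and combined memory bandwidth for the R9 295X2.
bus_width_bits = 512
effective_data_rate = 5.0                                # GT/s, effective GDDR5 rate

per_gpu_gbs = bus_width_bits / 8 * effective_data_rate   # bytes per transfer x rate
print(f"Per GPU:  {per_gpu_gbs:.0f} GB/s")               # 320 GB/s
print(f"Combined: {per_gpu_gbs * 2:.0f} GB/s")           # 640 GB/s
```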
The GPU clock speed is labeled as “up to” 1020 MHz, and I can say that in this instance the R9 295X2 runs much closer to that top clock than the original reference R9 290X did. The rest of the specifications show the expected doubling of the single GPU card – 352 texture units, 128 ROPs, 512 Z/stencil units, etc. Obviously, as with all dual-GPU cards, performance isn’t going to scale 2:1; we have CrossFire scaling efficiency to deal with.
The GPUs used on the Radeon R9 295X2 have some different properties than those used on the R9 290X. For one, they use a lower target temperature of 75C versus the 95C we saw with the first wave of Hawaii parts. Partly because of this, GPU clocks should be much more consistent, and the card-to-card variance we saw with the reference R9 290X/290 should be a thing of the past.
To facilitate XDMA PCI Express communication on the R9 295X2, a PLX bridge chip sits next to the primary GPU. This allows each GPU to get a full x16 allotment of bandwidth for chip-to-chip communication while requiring only a single x16 connection from the host PC. Again, this has been common practice for dual-GPU cards in recent memory.
What is not common practice is the cooling solution used by AMD. The R9 295X2 has a 500 watt TDP and, as such, a standard two-slot copper cooler was not going to cut it. We have already documented the problems with AMD’s reference cooler for the original R9 290X and R9 290, and the company wanted no part of a repeat performance. For this card AMD has gone with a water cooled solution – a first for a reference design out of either AMD or NVIDIA.
On each GPU sits an Asetek combination water block and pump, connected in series to each other and then routed to a radiator and fan through a sealed water loop. Enthusiasts have been using configurations like this on their processors for quite some time and the idea is essentially the same. The closed loop means installation is easy and the system is maintenance free, while the water blocks allow for MUCH better thermal dissipation than you would get with standard copper heatsinks. The fan on the radiator is controlled automatically and adjusts based on the temperature of the fluid in the cooling system.
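AMD hasn’t published the exact control logic, but the behavior is what you would expect from any liquid-temperature-driven fan curve: the duty cycle ramps between an idle floor and full speed as the coolant warms up. The sketch below is purely illustrative; the temperature breakpoints and duty cycles are assumptions, not AMD’s actual firmware values.

```python
# Hypothetical illustration of a coolant-temperature based fan curve.
# Breakpoints and duty cycles are assumed values, not AMD's firmware settings.
def radiator_fan_duty(liquid_temp_c, idle_temp=30.0, max_temp=65.0,
                      idle_duty=0.20, max_duty=1.00):
    """Return a fan duty cycle (0.0-1.0) from the coolant temperature."""
    if liquid_temp_c <= idle_temp:
        return idle_duty
    if liquid_temp_c >= max_temp:
        return max_duty
    fraction = (liquid_temp_c - idle_temp) / (max_temp - idle_temp)
    return idle_duty + fraction * (max_duty - idle_duty)

print(radiator_fan_duty(45.0))  # ~0.54 at a mid-range coolant temperature
```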
Some users, and other companies, might lament the fact that this cooler is more complicated than any graphics card brought to market before it. While true, the differences between installing a standard card and installing the R9 295X2 are minimal. All that really changes is a new set of four screws required to mount the radiator to the back of your case. Because the water blocks are already mounted on the PCB and even the fan is pre-mounted to the radiator, this is about as simple as water cooling gets. Any user looking to spend this kind of cash on a GPU should be more than willing to install it.
To help cool the memory and power delivery on the R9 295X2, AMD has included some heatsinks and a central fan. That fan’s speed is controlled by the current going into the power regulation circuitry.
The card itself is quite long – longer than the GTX 690, longer than the HD 7990 and about on par with the MASSIVE ASUS ARES II card. At 307mm in length, you are definitely going to be limited on case installation if you are trying to squeeze this into any kind of MicroATX or Mini ITX chassis. The radiator is 64mm deep with the fan installed as well, another dimension to keep in mind. Water tubing length is 380mm, so you’ll be able to stretch the radiator to the back of most cases (or the top, should you go that route) without an issue.
At the far end of the card you’ll find just a pair of 8-pin PCIE connections to get this card up and running. That seems oddly…optimistic. More on that on the next page when we discuss installation requirements.
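Why optimistic? The PCI Express spec nominally rates the x16 slot at 75 watts and each 8-pin connector at 150 watts, which puts the on-paper budget well short of this card’s 500 watt TDP. AMD is clearly counting on each connector (and the 12V wiring behind it) delivering far more than the spec minimum. A rough sketch of the gap:

```python
# Nominal PCIe power budget vs. the R9 295X2's rated board power.
slot_w = 75             # PCIe x16 slot, per spec
eight_pin_w = 150       # each 8-pin auxiliary connector, per spec
board_power_w = 500     # the card's rated TDP

spec_budget_w = slot_w + 2 * eight_pin_w
print(f"Spec budget:   {spec_budget_w} W")                      # 375 W
print(f"Over spec by:  {board_power_w - spec_budget_w} W")      # 125 W
print(f"Per connector: {(board_power_w - slot_w) / 2:.0f} W")   # ~212 W each
```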
In terms of looks and styling, the R9 295X2 is much better than the all-plastic design of the Radeon HD 7990. AMD has also included some little touches that help it at least LOOK like it is worth the dough you shelled out for it. The metal cover for the coolers and heatsinks, along with the rivet styling, gives the 295X2 an industrial look that rivals that of the GTX 700-series cards. The back plate on the card helps protect the components but also continues the “high class” look.
The Radeon logo up top glows red, as does the fan in the center, and when powered on in a dark room or case, the red light definitely demands some attention.
Solid as Hell..
But….Where is the temperature page?
That was the first page I was looking for … lol, how embarrassing.
The problem is that GPUZ didn't recognize the 295X2 at all so we couldn't monitor in real time. And using the AMD control panel to monitor it seems like a sketchy idea. Once the software updates we will include it.
I suppose this is the reason we see so many thermal imaging cameras used with early gpu reviews these days.
Nvidia’s frame times in BF4 are surprising; not so long ago it would have been AMD who looked that bad. Now AMD looks good and smooth while Nvidia looks wanting.
What a turn around.
Well done AMD.
That was the article I first went to, wow… AMD has stepped up its game with its drivers (thanks to Ryan’s Frame Rating analysis and sites like techreport).
I remember the reports by tech websites of countless headaches with drivers crashing during testing when it came to any dual-GPU solution by AMD.. that seems to have been sorted out now.
Great to see the results of this beast both for its hardware visual eyecandy and test results. Thanks for the review
I will admit.
I WANT THIS.. I would sell my R9 290x and GF 680 and few SSD’s to get this… Can’t wait till it is “available” in Poland. Though I expect another price war between Nvidia vs AMD. As both cards are overpriced.
Still I’m really happy that the cooler on the R9 295X2 is not holding the card back, especially on noise level (would be nice to compare this to the stock R9 290X, not the ASUS with aftermarket cooler ;))
Just look for the Asus R9 290X review and see it compared to the stock cooler
is this going to be the prize for the NCAA bracket challenge winner?
I want to see this card under water for real, that would be a much more compelling option than the integrated radiator which just creates clutter in the case(not that I would ever buy a $1500 card). I would like to see if AMD could have gotten away with a triple slot card like Nvidia did with the Titan Z. Triple slot with triple fan like say the lightning cooler surely would have been capable of cooling it. Other than those comments on the card, it was an excellent review but definitely out of my price range
Maybe I missed it, but is there a list somewhere for the quality settings used in these benchmarks, type of AA or level of quality type thing?
Looks like it was left off the BF4 page. Ultra preset.
2 295 being cooled by that tiny rad o_O
GPUs, in nature don’t produce as much heat as a CPU. This is why a 500W GPU can be run on such a rad. A 500w CPU would need a rad 9 times that to even make it functional (if it didn’t boil the coolant dry as soon as it gets to the CPU).
Would love to see this released with integrated waterblock for those of us who like home grown water systems not closed loop. I could put this on a 200x200mm radiator and noise would be even lower
I was thinking the same. But regardless, well done AMD!
This reminds me of the last ares II. Remember the dual 7970 with air/liquid hybrid cooling? The problem with that one turned out to be not with the design but with crossfire. I was expecting Asus to do a ares III but it looks like amd beat them to the punch.
no mention of mantle? Was it tried in BF4?
Mantle isn’t going to make much of a difference with high end GPUs.
Very Good review Ryan !
I prefer seeing graphs of performance to simple bar charts for the Games – was a bit hard to get used to initially though.
Best 295 X2 review out today !
I don’t think the power requirements and the need to understand how your psu outputs power are that big a deal. I’m not in the market for a monster like that but I know off the top of my head that my corsair tx750 has a single 61 amp 12 volt rail. Looking forward to those temperature numbers and I want to see this thing put in to a mini itx system.
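(For reference, the back-of-the-envelope math on why that single-rail amperage matters: the card’s 500 W TDP alone pulls roughly 42 A from the 12 V rail. The PSU figure below is just the one named above; the rest is simple arithmetic.)

```python
# Rough 12 V rail budget with an R9 295X2 (illustrative sketch).
card_tdp_w = 500                        # rated board power from the review
rail_amps = 61                          # e.g. a Corsair TX750's single 12 V rail
card_amps = card_tdp_w / 12             # ~41.7 A for the card alone
headroom_amps = rail_amps - card_amps   # what's left for CPU, drives, fans
print(f"Card: {card_amps:.1f} A, headroom: {headroom_amps:.1f} A "
      f"(~{headroom_amps * 12:.0f} W)")
```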
Agreed, although. I am one of those that modded a late Powermac G5 1kw PSU to be ATX compatible. So what I know is different to others. I know of people who just wants to buy things and just be able to use it, then they come to someone like me, and be like, can you please make this work.
man the distance AMD made in these last short months, i remember the nightmare of temps, noise, throttle, frame times.
really great job from AMD here, no doubt the best gaming card for the up coming 4k gaming Boom.
i am angry at Nvidia though, their announcement of 3000$ TitanZ probably sent up the price of the 295x 200-300$ more, which will be probably the price drop after 790Ti comes out about 1600-1800$.
i just hope miners stay away from the 295 or they will push it’s price to catch up the titanZ.
also it’s about time for dice and eidos to add mantle crossfire support
Why is this being compared to 780Ti SLI but not 290X CF (or even 290 CF)?
This goes for performance graphs, temps, and price comparisons in the last section. No mention of 290(X) CF, which would be much cheaper even at MSRP if you’re willing to deal with the loudness.
Yeah, I wondered that, but then again, this card is exactly 2x290x, so a crossfire setup would be equal to this one, except in temperature and noise. But a final conclusion of “This card is great for what it is, but if you ‘need’ this level of performance it would be much cheaper to buy two 290X cards and put them in crossfire yourself. However, if space is limited to two slots and you want to own a special edition part, the 295X2 is unbeatable.” would seem to fit.
Nvidia got owned even while raising the temp on their cards with the new driver so they can run faster, hotter & louder.
http://www.hardocp.com/article/2014/04/08/amd_radeon_r9_295x2_video_card_review/12#.U0RdNGdOWUk
Lets see if Ryan looks into it like he did the AMD drivers. I wont be holding my breath.
Why would I care. 1019 vs. 1006? You do realize it’s free, causes no failures (yet?…LOL), and is merely a driver update away right? You do realize 13mhz out of 1000 is not even 1% right? It’s like a 8th of a SINGLE percent you are whining about. Do you really think anything under a percent (or in this case 1/8 or so of a percent) is going to ruin your world or something?…ROFL.
Fanboys never cease to amaze me. We are talking about almost NOTHING here. As long as they run at most the same temps AMD’s cards are set to why do you care? They appear to be just leveling the playing field a bit on temps. AMD is NOT below 87c for temps right? Then I kindly ask you to go away quietly troll 😉
Note to NV – Please raise it to exactly what AMD is and then say you did it because well, they did it before us. 🙂 All sides are guilty of crap over the years, get over it. At least this affects ALL games, not just a few MANTLE games (right now TWO, and one just plain sucks IMHO, even thief isn’t too good depending on who you ask).
Hey Ryan, I have just benched Grid 2 @ 4K with SLI Titans (stock Asus run) and a 3930K @ 3.8Ghz and my averages were 86.47. That seems very low for the 780Ti’s in SLI, of which I consider faster than the Titans. Any reason your frames were so low, and only managed 75 on the Ti’s?
I copied your settings and ran the in-game bench.
Am i the only person that thinks the power requirements are fine?
I actually think that’s really cool you have to figure out if your PSU is competent enough to handle this.
It kind of plays to the enthusiast/tweaker types. As well as gives off the impression that it is extremely powerful, without even using it. Perhaps that is just me.
This also just got me WAY more excited for 4K. AMD and NVIDIA now have a single card that can quite comfortably play the most demanding games, with the highest settings, at 4K.
On another note. Holy crap, do those frame times look fantastic. AMD has done a really great job improving them. Still really sad to see Skyrim is such a mess though. It would be really interesting to see driver comparisons on how drastically they have improved over time.
There still isn’t a single card that can play the most demanding games on the highest settings at 4K. If you are happy to turn down AA, then you can get playable frames but with some games and everything set to the highest, the 295×2 or the Titan Z will not cope.
i still can get 2x R9 295X, crossfire, for the price of that TitanZ and it will cope just fine around 80fps everything maxed out, on a 4k monitor at 60Hz.
most cases offer back and bottom 120/140 fan slots.
i dont know if the cooler cable is long enough to crossfire 2 of them at the 240 slot at the top of cases.
but anyhow this card is still the best for 4k to me
Well this is quite contentious, as you will have the grunt to run 4K with 2*295×2’s but will you have the VRAM?
http://forums.overclockers.co.uk/showthread.php?t=18592360
Over 4.3GB being used in 3 of the 6 games tested at 4K.
so according to your point of view, the R9 295X is a no brainer 4k choice compared to 780Ti SLI with 3GB?
thx for pointing that out
Not really no. The 295X is a fantastic card for those that have limited room but the wiser choice would be a pair of 290’s and save a ton of cash but for those that seriously want 4K maxed settings… You will need a whole lot more than 4GB per GPU of VRAM. 6GB 780 perhaps?
well doesnt really save a lot of cash, crossfire 290x for 1200$, thats trading 300$ for noise, cold temps, single pcb, and overall better perf according to benchmarks.
or sli 780Ti for 1400$, here also not much of a saving, 100$ for same noise, lower temps and also single pcb, and better perf especialy on 4k.
but if like you picking up a card for 4k is according to memory size, there is also the Sapphire Vapor-X R9 290X 8GB, cheaper than the 780 6GB, and better perf.
seriously no matter how i put it, i still see more value On AMD cards, beh maybe thats because i am a fan boy i guess.
Are we talking about the same thing here?
I’m looking at Metro at 4K and this card doesn’t dip below 30 FPS. For a single player experience that is quite playable. It’s not fantastic, but playable.
Are you aiming for a constant 60 or 120 FPS?
i dont see any 4k monitors with 120hz refresh, so yes aiming for 60-80fps average is obtainable with crossfire 295x, which is twice as fast as the TitanZ in theory for the same price, just quieter, colder, 2 pci slots instead of 3.
and still better than sli 780ti with 3GB that are roughly at the same price.
and with the Mantle perk for a couple of coming years until DX12 is out, honestly by far this is the most adequate solution for 4k, a bit expensive, but still i can see the value for the money that i can somehow justify, not like dropping an extra 1500$ out of thin air like the TitanZ (which i still see more adequate pricing around 700$ for the Titan, not 1000$, and 1500$ for the TitanZ, not freaking 3000$)
Does it come with an electrician to upgrade your in-house wiring to 240v ?
yes they install a mini nuclear power plant in your basement, just incase you decide to overclock it.
more seriously, not that many gamers care about power consumption, as long as it’s manageable.
anyone who is gonna sli 780ti or crossfire 290x, will get something from 850-1050-1200watt Psu, so whatever you take you are still gonna have extra free power not used at all, like you will find ppl with Single Gpu and using with a 300 draw using 600-650 Psu
let’s make a game, guys: tell me your PSU and what config you are running, and let’s see how much wasted power everyone has..
Would this PSU be good enough to run this beast of a video card?
http://www.superbiiz.com/detail.php?name=PS-800GAD3&c=CJ
logically 850 would do depending on what ppl have installed, and if OC cpu and such, 1000watts would be more adequate i think.
beside AMD will post minimum PSu requirements when it’s out.
probably 1000watts single, and 1500watts crossfire.
if a complete system with a gtx 780 can be run stable on a single Corsair RM450, which is possible, i do not think you need 1000watt for a single one of these. Im currently housing a Corsair 860 watt plat certified psu and im quite positive that it will be more than enough for the entire system if i got one of these, despite OCing my i7 haswell to 4,5ghz.
But then again a PSU should be the last thing you save money on.
“Anyone that debates the value of the storm created by PC Perspective and other sites about smooth, low variance frame times needs to be taken off the Internet.”
LMAO…