You just never know what's going to come your way on Facebook on a Friday night. Take this evening for me: there I was, sitting on the laptop minding my own business, when up pops a notification about new messages to the PC Perspective Facebook page. An anonymous user asks, very simply, "do you want pictures of skylake and r9 fury x".
With a smirk, knowing that I am going to be Rick-rolled in some capacity, I reply, "sure".
Well, that's a lot more than I was expecting! For the first time that I can see, we are getting a full view of the upcoming AMD Fury X graphics card with the water cooler installed. The self-contained water cooler that will keep the Fiji GPU and its HBM memory at reliable temperatures looks to be quite robust. Morry, one of our experts in the water cooling field, is guessing the radiator thickness to be around 45mm, but that's just a guess based on the images we have here. I like how the fan is inset into the cooler design so that the total package looks more svelte than it might actually be.
The tubing for the liquid transfer between the GPU block and the radiator is braided pretty heavily, which should protect it from cuts and wear as well as help reduce evaporation. The card is definitely shorter than other flagship graphics cards, and that allows AMD to route the tubing out the back of the card rather than out the top. This should help in smaller cases where users want to integrate multi-GPU configurations.
This shot shows the front of the card and details the display outputs: 3x DisplayPort and 1x HDMI.
Finally, and maybe most importantly, we can see that Fiji / Fury X will indeed require a pair of 8-pin power connections. That allows the card to draw as much as 375 watts in total, but that doesn't mean that will be the TDP of the card when it ships.
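For context, that 375 W ceiling is simply the PCI Express spec budget: up to 75 W from the x16 slot plus 150 W per 8-pin connector. A quick sanity check on the arithmetic (these are spec limits, not measured draw):

```python
# PCIe power-budget sanity check (spec ceilings, not measured draw)
SLOT_W = 75        # PCIe x16 slot can supply up to 75 W
EIGHT_PIN_W = 150  # each 8-pin PEG connector is rated for 150 W

budget = SLOT_W + 2 * EIGHT_PIN_W  # two 8-pin connectors on Fury X
print(budget)  # 375
```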
Also, for what it's worth, this source did identify himself to me and I have no reason to believe these are bogus. And the name is confirmed: AMD Radeon Fury X.
Overall, I like the design that AMD has gone with for this new flagship offering. It's unique, will stand out from the normal cards on the market, and that alone will help get users' attention, which is what AMD needs to make a splash with Fiji. I know that many people will lament the fact that Fury X requires a water cooler to stay competitive, and that it might restrict installation in some chassis (if you already have a CPU water cooler, for example), but I think ultra-high-end enthusiasts looking at $600+ GPUs will be just fine with the configuration.
There you have it – AMD's Fury X graphics card is nearly here!
Integrating the card in an existing water cooling system would require either an aftermarket block from EK or similar, or jury-rigging the existing block into your loop. I’m not sure that would be sufficient cooling, especially if the present setup is some kind of pressurized system using a particular type of coolant.
Why would the system be pressurized?
Hmm… this might very well end up being my next GPU.
I will state that I’ve been trying to avoid news about the upcoming Fiji for fear that the hype would be overblown, and the criticism equally excessive.
But I have serious concerns if the card is water-cooled only. That implies heat issues, and then there’s the difficulty of finding space in the case for potentially two radiators (1 CPU, 1 GPU). Compound that with 2x 8-pin connectors…
D@MM!T. I was hoping this would be the first in a series of successive “wins.” Instead – by nature of its design – AMD has already limited the potential buyers to the top 2% of the gaming community.
Would you say that you are feeling fear, uncertainty, and/or doubt?
Yes, but that’s completely unrelated to this topic…and how did you know?
Exactly my thought. The energy that you draw for these things has little option but to be converted into heat; that’s just basic physics. This is a performance move, not a power-efficiency one, unless they’re after charging you more for coolers. So you will likely have to deal with yet another AMD record for power draw.
The 390X series and below are rehashes of the 200 series.
If the Fury X has a 300 W TDP or more, the card’s size prohibits the use of large air coolers. Now, if AMD allows custom PCB designs from card makers, then they will enlarge it to allow bigger air coolers.
Also, the Fury X will only come with 4 GB when it releases, and all the leaks/rumors suggest the GPU’s performance isn’t any better than the 980 Ti or Titan X. That means with a $600 price tag, only 4 GB of VRAM, and a possible TDP of up to 375 W, the Fury X is a garbage option when you have a GPU (980 Ti) that has 2 GB more VRAM and is up to 40% more efficient per watt.
Nvidia’s efficiency is overblown. In reality it is much closer to AMD’s cards.
Nvidia’s efficiency per watt is not overblown. Do you actually know how to look at benchmarks?
Look at the 290X using 295 W at peak vs., say, a GTX 970 using 180 W; the performance of these cards is on par overall. There is a ~40% difference in power usage between them.
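For what it's worth, taking the quoted peak wattages above at face value (they are the commenter's figures, not measurements), the gap works out like this:

```python
# Power-draw gap using the wattages quoted above (claimed peaks, not measured data)
r9_290x_w = 295  # quoted peak draw, R9 290X
gtx_970_w = 180  # quoted peak draw, GTX 970

gap = (r9_290x_w - gtx_970_w) / r9_290x_w
print(f"{gap:.0%}")  # prints "39%", roughly the 40% claimed
```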
They are not on the exact same process, even though they are both still 28 nm. I have seen things indicating that some, if not most, of Nvidia’s lower power consumption is due to the difference in process tech, not anything inherent to the design. AMD will be using a different process for upcoming parts.
I don’t think it’s because of different process tech. Nvidia was showing Maxwell’s power improvements back with the 750 Ti.
It is hard to find specific information, but I have seen some claims that Nvidia is using a different variant, or at least a slightly customized process. TSMC offers several different 28 nm variants.
Anyway, I suspect a lot of the power savings from Nvidia actually came from cutting out FP64 hardware. The 290X still has FP64 at 1/8th FP32; the Titan X only has FP64 at 1/32 FP32. This is probably fine for gaming, but it means they probably will not be selling it as a compute card. It is foolish to think that AMD would not do the same thing to save power and die area with their completely new design. FP64 hardware takes up a huge amount of die area, and I am not sure it’s reasonable to try to power-gate it off when not in use; that would cause significant leakage for the 290X and variants. Fiji should be significantly more power efficient for FP32, but FP64 will probably be reduced to levels similar to Nvidia’s 980/980 Ti/Titan X.
Typical garbage from nvidiot shills. GM204 != 165 W. 185 W stock is the minimum power draw if you run a board at minimum voltage, and then it won’t boost. Most AIB cards are >200 W; an AIB 290X draws 250 W. People wonder why there’s a turn-off on bought-and-paid-for “HW review sites”. Nvidia ditched compute flexibility in Maxwell, so they perform well in current game workloads. How well they do under Mantle (er, DX12…) with a simplified command processor & 2 threads will be apparent soon, I guess.
Mantle? You mean that proprietary API AMD was working on that was closed source so no one else could use it? The same BS everyone attacks Nvidia over? No, it never was open source as AMD claimed it would be.
Stop calling people shills. AMD has taken heat from every site, rightfully so. AMD has said and claimed so many things over the years only to end up NOT living up to their claims. Instead of being an “Another Moron Deciple” (if you want to go down the name-calling route).
Yes, that closed thing that is so closed it is now called Vulkan, and thanks to it you now have DX12 less than 2 months away.
PS: Deciple? At least you know how to spell Moron, moron.
Welcome to autocorrect, smartass; maybe you’ve heard of it? It tends to be wrong at times.
LOL. Don’t cry.
I’ve never seen an autocorrect replace a misspelled word with another very badly misspelled word, except when the replacement misspelled word was intentionally added to the autocorrect dictionary.
By the way, the word you’re looking for is “disciple”.
That’s the most foolish thing I’ve read.
So how is working PR at Nvidia?
The thing I hate about these types of designs, especially with a radiator that thick, is that you have to find special cases if you even want to think about CrossFiring it. It’s not like you can grab any old mATX case and throw two in (Titan X style). You just can’t do that when your radiators are that thick, or when you have two GPU radiators to mount plus one for your CPU.
IMHO, water-cooled graphics cards shouldn’t be released by AMD or Nvidia themselves, but by select AIBs.
If you are building a custom system, then it is your responsibility to get components that work together. If you can’t do that, then you should buy something from a company that does it for you.
Yes, yes, yes, but think about it from a business standpoint for a moment. Tell me what sells your card better: the ability, right off the bat, to CrossFire in any case, or only in select cases?
I just hope the air cooled version is actually cheaper, but with everything else the same as the water cooled version. That way, I can custom water cool it without much extra cost.
Off with the plate! I wanna see what’s inside.
POS space heater anyone?
Brain-dead Nvidia marketing sock-puppet, anyone?
I am kind of wondering if the black top cover is replaceable for different looks. If you are water cooling the whole thing, then you can put any cover on it you want.
They’re re-using the “Fury” moniker from a 1999 AGP video card:
http://img19.imageshack.us/img19/2807/uvnn.jpg
That picture was posted on pcper several days ago. I don’t really like “Fury” or “Fury X”. Perhaps it should have been “Radeon Serious Edition” and “Radeon Super Serious Edition”. It is what it is regardless of what they call it.
I am wondering what the dual-GPU card will look like. Wild speculation, but is it possible that this is the dual-GPU card? The radiator design is different. If the pump is in the radiator, then it may be plausible to stack 2 PCBs in there. I would expect more than two 8-pin connectors for a dual-GPU card, though.
Here’s to hoping for the best and fucking the rest. Please don’t let us down again.
Hey Ryan, FCAT says hello… Funny how things drop out of favour (Unigine Heaven, anyone?) when it doesn’t suit Team Green anymore. You calling for “balanced coverage” is a joke. (https://twitter.com/ryanshrout/status/609437983134089216)
Reap what you sow…
FCAT only has its use in CF/SLI setups, showing if a frame was skipped or dropped because it took too long. I am sure Ryan could still use it if he wanted to, but it probably won’t yield any results worth the time to do it.
If review sites don’t point out BS, then it would be the buyers who suffer, seeing it only after they buy the product. The reason Nvidia doesn’t get flak for much is that they don’t do anything hardware-wise to get flak for. When they say their card or hardware works in such and such a way, it pretty much works that way. They don’t have a bunch of idiot PR marketing guys putting their foot in their mouth, making stupid claims every time they open their mouth. (If you want to point out the GTX 970 thing, don’t; that is 1 thing vs. the 20-odd things AMD has claimed and been wrong about.)
I literally use FCAT / Frame Rating in every single GPU review I do. And I think I’m the only outlet to do so. Lol
Hello Ryan,
on the topic of FCAT / Frame Rating:
did you ever plan a review of “Lucid Virtu MVP” with FCAT testing?
a couple of years ago, before the FCAT methods, this Lucid Virtu tech was included as a selling point on some motherboards … if I remember correctly, it was supposed to use the IGP to ‘help’ the GPU output more frames, or in better sync with the monitor
at that time, this “virtual vsync” could not really be tested, because FPS reporting was showing the virtual frames instead of the real ones … so no objective conclusions were made about this tech (or at least, I haven’t seen any)
with all these new testing methods, I wonder what results might show up when putting the Lucid Virtu tech on an FCAT bench
the tech is probably no longer relevant, if it ever was, but it is still topical (with some imagination):
– help with overhead (similar to DX12/Vulkan?)
– some FPS/sync magic (similar to G-Sync/FreeSync?)
I would love to see this topic revisited with current testing methods … if only to make me stop wondering about it 🙂
kind regards
The cover on the front of the card still looks weird. I would expect it to say “Radeon Fury” at a minimum. Perhaps this is still some pre-release card and the actual shipping product will look different. I also have to wonder if all of these leaks are just planned marketing.
Now that would be funny marketing, handing Ryan a set of spy shots, but not sampling him a card.
Chances are they simply hadn’t settled on the final branding when locking down the physical design.
If Ryan has some of these cards for testing, then he is certainly under NDA and cannot post any of his own pictures until the embargo expires. If someone else leaks them, then he can post them without violating his NDA. If AMD leaked the photos themselves, then it is just part of some marketing plans.
The weird peg-board like cover on the front of the card still looks strange. I am still wondering if there is a surprise underneath. Rumors have indicated that we may get water cooled and air cooled versions of this card with different clock speeds. Also, I would expect a dual card rather quickly since the interposer is so small and PCB is very simple. The PCB only has power delivery, PCI-e, and video outputs. AMD may want to show off the small size of the water cooled version to emphasize how different their interposer design is. I expect an air cooled version will need to be significantly larger due to the need of a large air cooler. We haven’t seen any leaks of these possible other versions.
It looks like there could be room for FURY along the side also. There is some black covering over it, but there is some space between the “RADEON” logo and the two 8-pin connectors. This makes me think of how we often see shots of new car designs being tested with covers or other things obscuring the actual look of the car.
People who are complaining about 4 GB: from now on, DirectX 11 will probably give way to DirectX 12. In CrossFire, DirectX 12 will be able to use all the memory from the two or more cards for different processes. The bandwidth of this new memory will probably compensate for the lack of more gigabytes; it is much faster. Most scenes keep extra data buffered in memory, just waiting to be called, to reduce latency, so with memory this fast we should see much less popping of textures in games.
For video editing it will be a problem, though.
Sapphire:
https://www1.sapphiretech.com/productdetial.asp?pid=D40475DB-8BD0-40F6-8C33-F12D63272AEC&lang=eng
ALL models are online except the Fury / Fury X, with the two Fury cards confirmed on Sapphire’s page, but the pages are blank.
At what point do we consider AMD just thumbing their collective noses at Scott by not releasing a 380X?
I will wait for some reviews of the performance, and for cards with waterblocks for custom water loops.
However, looking at the pictures, I am worried about how AMD proposes to fit the radiator within a case. With the coolant tubes coming out the back of the card, the logical place would be the front of the case; I’m not sure the tubes are long enough to reach the standard fan position at the rear of the case (and the size of the radiator could be a problem there as well). I can see problems with fitting it at the front of the case: either you have the fan drawing air into the case (nicely warmed up by the radiator, not ideal) or you need to use the front of the case as an air outlet (works well as a space heater!).
Looks like 40-50 cm of tubing to me, which should fit in many a case, and if you’re looking at the dimensions of the radiator, they seem to have engineered it so you can put multiples side-by-side in places that normally allow for a single 240 mm rad. (depth of the case permitting, of course)
And it’ll be equally good (or bad) a space heater regardless of where in the case you’ll end up putting the rad, as the same amount of energy gets dumped into the room either way.
Martin: the tubing looks to be roughly 2x the length of the card, so the real question is how long is the card? If mini-ITX size, that would be very nice.
I do not think the radiator is a clever design, just clever photography: no pictures show the barbs. I suspect it is just a standard thick 120 mm radiator, so the idea of getting two in the space of a 240 mm radiator is probably a no-go.