You just never know what's going to come your way on Facebook on a Friday night. Take this evening for me: there I was sitting on the laptop minding my own business when up pops a notification about new messages to the PC Perspective page of FB. Anonymous user asks very simply "do you want pictures of skylake and r9 fury x".
With a smirk, knowing that I am going to be Rick-rolled in some capacity, I reply, "sure".
Well, that's a lot more than I was expecting! For the first time, as far as I can tell, we are getting a full view of the upcoming AMD Fury X graphics card with the water cooler installed. The self-contained water cooler that will keep the Fiji GPU and its HBM memory at reliable temperatures looks to be quite robust. Morry, one of our experts in the water cooling field, guesses the radiator thickness to be around 45mm, though that's just an estimate based on the images we have here. I like how the fan is inset into the cooler design so that the total package looks more svelte than it might actually be.
The tubing for the liquid transfer between the GPU block and the rad is braided pretty heavily, which should protect it from cuts and wear as well as help reduce evaporation. The card is definitely shorter than other flagship graphics cards, and that allows AMD to route the tubing out the back of the card rather than out the top. This should help in smaller cases where users want to integrate multi-GPU configurations.
This shot shows the front of the card and details the display outputs: 3x DisplayPort and 1x HDMI.
Finally, and maybe most importantly, we can see that Fiji / Fury X will indeed require a pair of 8-pin power connections. That allows the card to draw as much as 375 watts in total (150 watts per 8-pin connector plus 75 watts from the PCI Express slot), but that doesn't mean that will be the TDP of the card when it ships.
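For reference, that 375 watt ceiling falls straight out of the PCI-SIG spec limits rather than anything AMD has announced. A quick sketch of the arithmetic (the function name is just for illustration):

```python
# Spec limits per the PCIe CEM specification: these are maximum in-spec
# draws, not measured consumption.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # per 6-pin auxiliary connector
EIGHT_PIN_W = 150  # per 8-pin auxiliary connector

def max_spec_power(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(max_spec_power(eight_pins=2))  # Fury X layout -> 375
```

The same math gives 300 W for the common 6+8-pin layout, which is why a dual 8-pin card always invites speculation about TDP even though cards rarely run at the connector limit.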
Also, for what it's worth, this source did identify himself to me and I have no reason to believe these are bogus. And the name is confirmed: AMD Radeon Fury X.
Overall, I like the design that AMD has gone with for this new flagship offering. It's unique, it will stand out from the normal cards on the market, and that alone will help get users' attention, which is what AMD needs to make a splash with Fiji. I know that many people will lament the fact that Fury X requires a water cooler to stay competitive, and that it might restrict installation in some chassis (if you already have a CPU water cooler, for example), but I think ultra-high-end enthusiasts looking at $600+ GPUs will be just fine with the configuration.
There you have it – AMD's Fury X graphics card is nearly here!
That fan looks very interesting indeed…
It looks exactly like Nidec Gentle Typhoons, it even has the little support ring from the high-RPM typhoons. An excellent fan with enterprise build quality and a price to match, but as far as I know, Nidec hasn’t produced them in awhile. Looks like AMD isn’t messing around.
because when I think video card, I think $30 case fan attached to a radiator.
I’m not really sure what’s the problem with it. It’s a great fan whether on a case or on a radiator, and if using a $20 fan instead of a $5 fan makes the card cooler and/or quieter, then why not? Should last awhile as well.
Bought 2 OEM GT1850s at only $14 each. For AMD, maybe $5~$6 each.
I’ve read that the fans just aren’t sold to consumers directly. If you have a corporate account they are still available to order.
They are being made. Check with Cooler guys to buy the OEM version.
Defiantly isn’t a GT.
definitely*
Thanks auto correct.
Nidec has always produced them 😉 they are the Gentle Typhoon High Speeds, it’s just that they are no longer sold at retail since Scythe pulled the contract. You can still buy the whole Gentle Typhoon range from Nidec if you order a few thousand posing as an OEM
That fan is a Cooler Master Silencio FP 120, the one on the Nepton 120XL, 240M and Silencio 652S chassis, which is a high static pressure fan.
or maybe not… 🙂
Here’s my Gentle Typhoon AP29 for comparison:
http://www.overclock.net/t/1560118/pcper-amd-radeon-fury-x-graphics-card-pictured-full-on-pictures-uses-2-x-8-pin-power/0_50#post_24031727
No, it is a high speed Scythe Gentle Typhoon (http://www.scythe-eu.com/en/products/fans/gentle-typhoon-120-mm-high-rpm.html). Look at the support ring, and at least one photo of the Fury X with a low speed Typhoon exists (http://www.guru3d.com/news-story/amd-radeon-fury-x-photos-appear-online.html).
So it will be at least a 3000 RPM fan.
I’ll just leave this link here from the AMD Announcement from E3 2015
http://cdn.videocardz.com/1/2015/06/AMD-Radeon-Fury-radiator.jpg
Sure looks like the final will have a Cooler Master Silencio FP 120 😉
1st pic. Why is his shoe blocked out ? Did he have a hole in it exposing his toes ?
3rd pic. I’m guessing the leaker’s name is Barry Allen? Explains the hole in the shoe.
Shoes are used to identify suspects in crimes. All shoes are very unique and can expose a person, especially the soles of the feet and the print they leave…
This guy blocked his shoes so they couldn’t ID the leaker. He could lose his job instantly for opening one of the boxes.
No VGA port? How will I use this with my CRT? Active adapters don’t even support the full res and refresh rate of a good CRT
Nobody cares. VGA is dead, CRT is dead, get with the times. Only a very small minority of customers for this card would be using a CRT.
And only a very small minority purchase cards like these. What’s your point?
My point is that a small minority of a small minority is a very small number of people. There’s basically no reason for AMD to put VGA on this card, because most customers are going to use digital interfaces, mainly DP.
And very few people truly care about graphics and displays (just look at how many people run rubbish 1366×768 TN panels with integrated graphics). This card is aimed at the few who do.
I would like to get with the times, but first, monitor manufacturers need to give me something worth buying. After 15 years, there are still no monitors that are clearly better than the GDM-FW900. Everything involves significant compromises somewhere.
I could understand no VGA if they made this card single slot, but as it is, AMD has an entire slot utterly wasted. It isn’t even being used for exhaust. Nvidia manages to have VGA support even on air-cooled cards that benefit from exhaust ports.
And what would need to be exhausted from there? All the cooling is being done by the radiator, and it’s not as if those holes in the bracket are gonna make any actual difference anyway….
Contemporary GPUs already don’t support D-SUB over their DVI ports, they support digital only, so this is not something you can bash the Fury for.
That was only really the 290(X). The more recent AMD 285 and all current Nvidia cards still support DVI-I.
also no S-VIDEO/Composite out… very disappointing
Crazy how small that looks, just glancing at my GTX770. Looks like it might be time for an upgrade.
My body is ready.
Shit, does this mean you don’t have one in house for testing? I was hoping to see a review the day of the release and figured you were under an NDA. Could it be that sharing these photos is somehow not precluded under the NDA, which is what I am hoping for?
Fuck Yeah!!!!!!
Single slot water cooled card for custom loops.
I love how AMD always has single-row I/O on flagship cards. You can always get a single-slot bracket.
Hell Yeah!!!!
A tiny graphics card with a giant radiator is a ridiculous product.
Traditionally, card sizes are determined by heat output: a more powerful GPU generates more heat, hence a bigger cooler to cool the chip down, and a bigger cooler leads to a bigger overall card.
Another factor is memory chips. High-end cards usually have more memory, hence more memory chips on the PCB, so they usually have a bigger PCB.
Those two factors make high-end cards HUGE.
But Fury doesn’t have this problem: all its memory is mounted on the same package as the GPU, making the PCB smaller. And having a standalone radiator eliminates the need for a big onboard cooler.
That’s why it’s so small.
Get your facts straight before complaining.
I don’t think the cooler size dictates the graphics card size much; the cards are usually crowded with components. High-end cards usually have a lot of memory chips which take up a lot of space, as you said. The other component that consumes a lot of board space is the power regulation circuitry. The extra power connectors supply 12 V, which needs to be converted to the low voltage/high current required by the GPU. The VRMs needed to supply ~300 W will take a lot of space. This is what will be on the Fiji board: the interposer package and the power delivery circuitry.
I have the facts on this card:
4GB of RAM on a flagship card while the competition has 6-12
Small card with a massive water loop, making it take up more space than conventional vapor chambers
2×8 pin power meaning it probably uses plenty of power
This is the new 7970. GP100 is the next GK110. AMD has 3D RAM first. Intel (Knights Landing) and Nvidia aren’t rushing to release it until it’s ready.
Having a tiny PCB with a 70x70mm chip that requires a huge radiator is kind of pointless, like I said. Get your engineering and design right.
1. Vram is all there is to a video card I take it?
2. And also more silent and better cooled. Almost any case has room for a single 120mm radiator, especially the cases bought by the audience for this product.
3. So what? This isn’t a laptop. You’re buying a top of the line GPU, I’m not assuming you can’t pay your electricity bills
4. The 7970 was a great card, so what’s your point?
5. Thanks for sharing your knowledge, oh great and knowledgeable engineer with years of experience in the field and who obviously knows more than the people who’ve been working on it.
/rant
First, VRAM hasn’t been used on GPUs for about 15 years. It’s GDDR, or HBM in this case.
And no, it’s not all there is to it. I never said it was.
My point about the 7970 is that it was a big chip with a 384-bit bus like GK110, but when GK110 came out a few months later it was significantly more powerful. I think something similar will happen here: AMD did 3D RAM first, Nvidia will do it better.
You’re welcome.
It’s been stated that Nvidia will be using HBM from SK Hynix, the very memory modules that AMD co-designed. I guess HMC is off the cards for Nvidia, at least for the time being.
You’re welcome.
p.s. There’s being a fanboy, being smug and then being both at the same time. All over some logic gates… you go you!
I was replying to someone being smug, so I don’t care about being polite.
What was I wrong about exactly?
Laughable really. Compare that to the beautiful GTX TITAN.
Would that be the same card that thermally throttles and has a loud cooler?
When it was AMD with the card that thermally throttled while using a really loud reference cooler, Nvidia fanboys laughed and mocked.
But now that it’s Nvidia’s card that throttles under a loud reference cooler, it’s perfectly okay.
Yea for sure, but not half as ridiculous as your comment.
Where are the skylake pics then?
Where is the Skylake part?
The Skylake image(s) I have didn't turn out to be nearly as exciting… Just a bare processor.
Bad thing about 2x8pin: for AMD that could mean the card’s power draw can be between 300 watts and 600 watts. Case in point, the R9 295X2, which has 2x8pin as well.
Might need to check your figures again boss. (1×75W + 2×150W)
He is a known Nvidia fanboy, he doesn’t have to check anything. He just throws mud.
It’s called knowing my hardware, smartass. You call me an Nvidia fanboy, but it’s sad I know more about your AMD hardware than your AMD fanboy ass does.
What is sad is you’re an Nvidia fanboy that trolls multiple forums with the exact same post.
Logic and Truth are on my side, what do you have? AMD Marketing?
Logically speaking, 375 is between 300 and 600, so your basic math is correct. Good job.
They may be an Nvidia employee. I have worked at tech companies where some of the sales guys admitted to getting on forums and stirring up FUD about the competition in their spare time. It isn’t just fanboys with some misguided sense of brand loyalty.
As I pointed out, if you could read, AMD has used 2x8pin to power a 600-watt GPU. So just because the spec says 150 watts, AMD has pulled a lot more from it.
If you can count, you see 2x8pins on that card. That card draws at least 500 watts under load, and can draw as much as 600 watts under max load.
http://images.bit-tech.net/content_images/2014/04/amd-radeon-r9-295×2-review/r9295x2-7b.jpg
That is a dual gpu card which makes your point completely irrelevant.
Nah, he’s actually correct in pointing it out. AMD has ignored PCIe power specs before, so taking the connectors on the board and coming up with max power from that doesn’t work for them anymore.
BS. He is trying to imply that Fiji could take up to 600 W, which is ridiculous FUD. It is pretty obvious to me that there is no way that this single GPU is going to consume anywhere near the amount of power that the 295X2 can draw. I would not be surprised if Fiji gets close to 375 though. The Titan X is a 250 W card with 3072 shaders. Fiji is supposed to have 4096 shaders. With 1024 more shaders, it isn’t going to be lower power than the Titan X. It also looks like it has a lot more TMUs, ROPs and other hardware than the Titan X. I would say most gamers will not care about 250 W vs. 375 W if the performance is there.
You’re the BSer here.. Fiji CAN take up to 600w, and there have been plenty of single GPU cards in the past that draw 450-500+ watts when overclocked and running on a “we don’t give a shit about power consumption or PCI-E standards” mentality like AMD does. Fiji will do more than get “close” to 375 watts, that will be its BASE TDP, as in, once overclocked and put under load it’ll hit MORE than 375w. Even if its TDP is only 300, which would be physically impossible for AMD to do as their stream processors are nearly 50% less efficient than a CUDA core, and powering 4096 of those inefficient things will easily draw a 325-350 watt base TDP at least, with 375 being a fairer number. And don’t forget this is the reference board; custom AIB designs with more power phases etc., like a Vapor-X or an XFX Double D, will be using over 400w in some cases. Plus you have to factor the wattage of an AIO liquid cooler into the TDP of the card as well, so that’s another few dozen watts added under stressful load.
This card could easily hit 450-500w when overclocked under full load. You AMD fanboys seem to forget that TDP != load wattage; TDP is simply the baseline reading for a moderate load on the card. For example, the GTX 970 has a TDP of ~160w, but when you put it under full load, even with no overclock you hit 250-300w; when OC’ed that goes up to 325-350ish. Whereas the TITAN X/980 Ti can hit ~400w under OC and heavy load. So even if the Fury X hits 500-600w with max load it’s not any different than AMD’s past TDP designs.
you’re*
Prove it. I have seen no evidence that Nvidia’s design is actually inherently lower power. The Titan X is not on the same process tech as the 290X was, so they are not directly comparable. Current rumors say around 300 W with the new process for Fiji. If you are overclocking, then you obviously are not staying in the preset limits by definition.
They’re not both TSMC 28nm?
I have read that Nvidia did work with TSMC to optimize the process. I am not sure about the specifics; the information is hard to find. I have seen some indications that the 290X uses 28HPM while the 980 uses the 28HPC process variant. I believe there are 4 or 5 different 28 nm process variants available from TSMC, but not all of them would be suitable for building a GPU.
To be honest that works out at about £30 a year for 25 hours a week. If the difference in price was, say, £200, then you would be looking at around 7 years to recoup the difference, which hardly seems a financially viable thing to do?
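The recoup math in that comment can be sketched quickly. The 150 W power difference and £0.15/kWh tariff below are illustrative assumptions, not figures from the thread (they happen to land near the quoted £30/year):

```python
# Rough yearly running-cost delta between two cards. All inputs are
# assumptions for illustration: delta_watts is the extra draw of the
# hungrier card, price_per_kwh is a sample UK tariff.
def annual_cost_gbp(delta_watts, hours_per_week, price_per_kwh=0.15):
    kwh_per_year = delta_watts / 1000 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

yearly = annual_cost_gbp(150, 25)   # ~£29/year at these assumptions
years_to_recoup = 200 / yearly      # ~7 years to cover a £200 price gap
```

The conclusion is robust to the exact inputs: at any plausible tariff, the electricity savings take the better part of a decade to offset a meaningful price premium.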
2X8pins = 600W
Logic is in your side LOLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL!
Haha yeah that guy really tries to look smart but the stupidity is just stronger
LOL So much for accurate measurements! Tom had it running for much less http://www.tomshardware.com/reviews/graphics-card-power-supply-balance,3979-5.html
500W PSU http://www.bequiet.com/en/powersupply/527
I have seen the R9 295X2 with 2x 8-pin power connectors hit over 425 watts of power draw, measured directly at the GPU: https://pcper.com/image/view/54167?return=node%2F62575
But nothing like 600 watts.
Well, if you want 4096 shader cores (compared to 3072 on the Titan X), you're going to have to be willing to deal with the power requirements and heat output to run them. Comparing this card to the 295X2 is ridiculous though. The 295X2 has two large GPUs (2816 shaders each) plus 4 GB of GDDR5 each. There is no way that a single Fiji board will consume that much power. I doubt a single Fiji board will go over the 375 W limit. I suspect the water cooler is needed because of the small size of the card. The Titan X is gigantic so it has a lot of space to get the heat out into the air. This card looks like it might be half the size. It would be hard to get an air cooler in such a small space that could dissipate even 250 W. If you mounted a blower the size of the one on the Titan X, the blower would take up most of the space and only leave a very small area for heat exchange.
I wouldn’t compare cores from different architectures.
The water cooler is probably a decision made after seeing what happened with Hawaii. Hawaii was killing Nvidia cards in performance, but the GREEN/D NVIDIA press found the excuse they needed to direct readers’ attention to the cooler. In a high-end card’s review this was THE FIRST TIME that performance was something secondary. Completely and utterly ridiculous.
The double standards the GREEN/D NVIDIA press has been using the last two years are probably the reason why AMD chose to show the card first to the public and then give it to reviewers. Reviewers that are crying foul now all over the internet because we are talking about AMD. When we are talking about Nvidia, fake specs are not important and miscommunication FOR 6 MONTHS in a company like Nvidia is completely logical and expected and justified.
I think the water cooler is necessary because of the size of the card. It will be interesting to see comparison pictures with this next to a Titan X or 980 Ti. If you used a blower like Nvidia does, it would take up most of the space on the card. There would be very little space for the actual cooling fins in a two-slot card. How big is a CPU cooler that can dissipate ~300 W? The 980 Ti is a huge card. If you look at it without the shroud, the blower takes up about half the card with the other half being the heat exchanger cooling fins. An air cooler simply would not have worked on this small a card. They might be able to do it by extending the blower beyond the end of the card and only having cooling fins on top of the card. At this price range though, water cooling makes a lot of sense. It is quieter, more flexible for case size, and it cools better.
Yes. But AMD could easily show an air cooled one as a reference card, and yes by extending the cooler.
AMD has changed the way they look at their reference cards. In the past they were giving a just-OK version, so as to leave their partners room to create something that would not only look better but also perform much better. That way their partners could charge a little extra and have better margins. That was typical in graphics cards for ages.
I believe, after seeing how the press used the cooling system’s mediocre performance as a way to make the best performing card look inferior to other cards, they changed their attitude, something we already saw with the 295X2. AMD now makes a top card as a reference card, not just an OK card. That probably limits profit margins for their partners, but it is preferable to have lower profit margins on something you can sell than bigger profit margins on something that got a very negative response from the press.
PS. Did you notice how those two 8-pin connectors made it easily into the title? You have a new small form factor, water cooled high-end card that is meant to go into cases with PSUs of many hundreds of watts if not 1000+, and what is the most important thing in the world? Those two 8-pin connectors. There are GTX 980 cards with two 8-pin connectors to offer more power and at the same time more room for overclocking, but here we treat those two 8-pin connectors as something negative, like we are talking about a mid-range card.
It's in the title because it's the only NEW information really provided in these photos.
But you are correct, I hope, that AMD has been rethinking its reference designs and is building something higher quality and with more care than they did on the 290/290X launch.
10 days ago we had this picture that shows 2X8pin connectors
http://cdn.videocardz.com/1/2015/06/AMD-R9-FURY-vs-GTX-980-Ti-PCB-comparison-900×752.jpg
The next was posted at least 3 days ago and shows the two 8pin connectors clearly.
http://i.imgur.com/OI8HOfA.jpg
Anyway, the fact that you are a hardware site doesn’t mean that you spend 24/7 googling everything, so probably you haven’t seen those.
Waiting for your review. I know it will be well written, I hope it will also be fair.
One last thing
https://pcper.com/news/Graphics-Cards/CES-2015-EVGA-Shows-Two-New-GTX-980-Cards
EVGA Classified Kingpin
2X8pin connectors + one 6pin connector for a card that doesn’t need more than 2x6pin connectors. IF(big if) Fury beats 980 Ti, I am expecting to see many reviews with OCed 980 TIs in their charts.
It’s not even just the manually overclocked 980 Tis I would worry about. Ever since Nvidia introduced their GPU Boost tech, there have been cases where I rolled my eyes at benchmark comparisons. A really good benchmark comparison site is Anandtech Bench, but if you compare a 290 or 290X to a 970, the 290/X cards are reference cooled/clocked while the 970 is an EVGA FTW card with an out-of-the-box boost clock of 1410MHz with stock BIOS (at least for Anandtech’s review sample). Granted, the reference cooled Hawaii GPUs don’t have a lot of overclocking headroom unless you get a golden chip, and testing an Nvidia card with boost left on is completely fair because that's how the card works out of the box. I find AMD’s PowerTune technology to be really good and probably more advanced than Nvidia GPU Boost, but they really need to adopt the overclocking nature of GPU Boost into PowerTune. Even though it ruins some of the fun of overclocking, I think AMD really needs to go that direction.
Hi Ryan!
Out of curiosity, what proof is there that AMD’s reference cooler for Hawaii isn’t as good as Nvidia’s? Yes it revs more, but it is only natural as it is quite a bit smaller than for example the GK110, and at the same time draws more power.
That being said, I haven’t seen a test that confirms that Nvidia’s cooler would do a better job at that (cooling Hawaii, that is). One might argue that the third-party coolers succeed in keeping it cooler, but they are larger, and the trend is still there: it’s harder to keep Hawaii’s temperatures down.
Other than that, keep up the good work! 🙂
I don’t see why you would want an air cooler. How big are CPU coolers that can dissipate ~300 W? Water cooling just makes good design sense for GPUs. Due to the current form factor, GPUs are limited to a tiny area while CPUs have a huge area with a case exit fan right behind it. Nvidia has already shown their future plans for a mezzanine-type connector rather than a slot connector. We will get new form factors, but the current form factor is still skewed towards CPUs being the most important component, which is no longer the case. For gaming, I would not even bother overclocking the CPU unless I was trying to use a really low-end CPU.
There is almost certainly some marketing at play here. AMD wants this to be seen as a revolutionary product, which it is. Extending the cooler to do air cooling would make it look just like every other card on the market. This design, with its unique look and small size gets the point across that this is something different.
“the GREEN/D NVIDIA press found the excuse they needed to direct reader’s attention to the cooler. In a Hi End card’s review this was THE FIRST TIME that performance was something secondary. Completely and utterly ridiculous.”
I fear you might be misremembering, maybe you should check out a certain Mr Shrout’s articles on the matter?
What happened was that AMD shipped a reference card with insufficient cooling to retain the initial clock speeds for any length of time. Hence, performance suffered if the card was used for more than a few minutes at a time, and any testing that enabled the card to cool down between benchmark runs produced results not indicative of real game play.
As such, performance was very much at the centre of the issue.
You mean like Titan X throttling?
I can’t seem to remember the Titan X being a thing when Hawaii hit the market in late 2013.
I think the point was that the Titan X, today, is experiencing the same issue that Hawaii experienced in late 2013 – thrashing the competition but running really hot and thermally-throttling under a very loud reference cooler – and yet the tech press is either glossing over that fact or ignoring it completely, instead of making it the entire focus the way they did with Hawaii reviews.
IOW: for Hawaii it was, “OMG look at how hot and loud this thing is (regardless of performance)!”
For Titan X it’s, “OMG look at how powerful this thing is (regardless of the heat and noise)!”
And this is not the only example of double standards from the press.
Nvidia recently had many problems with their drivers, beta AND WHQL, going from new version to new version, even posting a fix for one of their latest drivers. Not to mention 700 series performance issues with the latest drivers, after version 347. Did anyone read about that anywhere?
On the other hand, the fact that AMD hasn't produced a WHQL driver, only betas, has been promoted as a major issue by many websites, like for example TechPowerUp. Of course the fact that there were no stability problems with AMD's beta drivers was ignored. The only issues reported were with Nvidia GameWorks, and those were about performance in specific, Maxwell-friendly titles.
For one more time the press used double standards. They ignored the main thing about drivers, which is stability, and tried to drive readers' attention to the title: whether it had the word Beta or WHQL in it.
That's why I am SCARED of the press with the Fury card reviews, and that's why I believe AMD chose to show the card FIRST to the public and later to the press.
AMD’s “new” stance on drivers is ridiculous. They haven’t had an update in 6 months, and nobody can tell me that 6+ months old drivers are properly optimized for games coming out today.
I didn’t consider providing regular updates in form of properly tested and released drivers optional when I bought my R9 290X.
Performance is one thing, stability another. If you prefer Nvidia’s approach, where they offer performance enhancements for the 900 series complete with bugs and instability even in the WHQL drivers, and only bugs and instability for 700 series or older cards, fine.
But I prefer losing a couple of fps here and there while having a 100% stable driver in my system that doesn’t throw BSODs, doesn’t lock the system or turn the screen black.
Also, AMD hasn't gone six months without a driver. The last driver is less than a month old. So what you say about 6 months without drivers is a complete lie.
The Beta in the driver's title means nothing when that driver is totally stable. The same is true for WHQL. WHQL means nothing when there are instability problems all over the place.
“WHQL means nothing when there are instability problems all over the place.”
Agreed, this is not a performance vs. reliability thing, because even 14.12 isn’t stable and has crashed numerous times for me over the past 6 months just browsing the web.
Beyond that, I prefer my GPU vendor to offer continued driver support for the hardware I bought. When I purchased my 290X AMD was offering just that, only to go back on that promise full-speed last December.
And no, releasing a couple of beta drivers over the past 6+ months doesn’t qualify as hardware support. I know first hand how crappy AMD’s fully tested and properly released drivers are, so I can only imagine the fun the beta ones are, if AMD can’t be bothered to have them tested and released regularly.
I’ve seen the Titan X being measured uncomfortably loud under load, does it really throttle below its minimum clock, too?
And as far as coolers are concerned, I learned my lesson with the GTX 8800: If it’s got a blower, I won’t buy it. Blowers are evil.
The nvidia gimp boy strikes again.
Nvidia trolls are so sad when they come bashing to AMD forums. I guess they feel threatened and scared that AMD might come up with a better product.
it’s a brick… a goddamn brick. I wouldn’t buy it just because of how damn ugly it is.
Maybe there will be aftermarket pimp kits for mono-brows like yourself to dress it up to meet your irrational expectations. It’s a GPU, not a trophy; that brick will fit into much smaller form factor cases. I’ll take a few of those bricks, and the gameplay will be beautiful. It’s not how it looks, it’s how it plays, for what is paid.
just don’t buy it and VOILA!! done… you don’t have to scream it to the world, kiddo
Yeah, so many things made by humans are just square. You would think we could be a bit more imaginative.
I guess there is the trash-can Mac Pro from Apple…
Nice. Finally it arrived. Potential for CrossFire with a true one-slot config. With DVI gone, it’s finally a possibility. Hooray for that! No more wasted slots. To be honest, I’m bloody sick and tired of gargantuan 280mm+ VGA cards, which are just a disaster waiting to happen. It’s ironic, but I own an MSI TF3 GTX 580, probably the longest single-GPU card ever produced, yet it’s more than enough for my gaming. With these AMD beauties I will most likely switch, with the proviso that the drivers are working, which is a real pain of biblical proportions with AMD (I have a few “Red” VGAs).
Josh do something about it! 😛
Considering the water cooling and 2x 8-pin power, this is going to be one hungry beast.
I wouldn’t expect much overclocking headroom either. Knowing AMD, they’re already pushing it with “up to” performance, with the water cooling required to prevent the card from burning up.
Replace it with a custom loop, and there may be overclocking headroom.
It’s one thing to put a water cooler on a card because you can. It’s kind of another to water-cool a card because you have no choice, to prevent it from frying itself or throttling. For AMD, the ladder is likely the reason for the water loop. Kind of says power draw is gonna be pretty high.
So Nvidia can only compete on power consumption? That would make sense considering that Fiji has a lot more resources than a Titan X.
What’s the reason for the “ladder”? For you, it’s the ladder you use to reach Mom’s cookie jar and swipe the cash to get the overpriced Nvidia kit. You are a paid troll for your green overlords, and only for a few dog biscuits. Good doggie, good boy! Pat, pat, here, go fetch!
No one pays me a dime for anything.
I don’t think air cooling was much of an option. If you look at the size of this card, a blower the size of the one used on the Titan X would take up almost all of the card. Same thing with a fan. You could only put one fan on there, and it would take most of the space. You wouldn’t have enough space for the actual heat exchanger.
Who cares how it looks? Third-party sellers will likely spruce it up, hopefully.
I do hope you are right, but I fear that the cost of this cooler and the rumors about limited availability will mean that it’s reference-only for quite some time.
I’ll be interested to see how they price this thing competitively and STILL make money… They really need to make money…
What’s that fan? It looks nothing like the one ChipHell showed on their unboxed sample.
What happened, Shrout, did you not insinuate Fiji does not exist only a few months back when AMD refused to show it to you? Or was it your oversized ego overpowering your limited intellect, making you come to stupid conclusions as usual?
Are you jockeying for the head troll position? Is that why you mention the horses?
I am wondering if I could integrate that card into an existing water-cooling system :-/
There will probably be water-cooled versions for custom loops from other manufacturers. I don’t know if they will make an air-cooled version, though. The card is so small that building a sufficient air cooler in that size would be difficult unless they extend the cooler beyond the end of the card. Perhaps they could just cover the card with cooling fins and put on a blower that extends out beyond the end of the card. They could make the blower portion removable if you are using water cooling, for a combo card (water or air).
Looks like a beast of a card. Two 8-pins is reasonable in light of running both the GPU and the entire liquid-cooling system.
As for the fans, if you check Cooler Guys you can still buy the OEM version of the Gentle Typhoon in various speeds. Nidec makes them.
I have 12 AP15 GT fans in my custom water cooling system and they are superb, robust fans.
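[Editor's note] Since a couple of the comments above lean on the 2x 8-pin connector setup, here is the arithmetic behind the 375-watt ceiling mentioned in the article, as a minimal sketch. The wattages are the standard PCI Express spec limits for the slot and each auxiliary connector, not AMD's actual TDP figures:

```python
# Spec-limit power budget for a dual 8-pin graphics card.
# These are PCI Express spec ceilings, not measured draw.
PCIE_SLOT_W = 75    # a PCIe x16 slot can deliver up to 75 W
EIGHT_PIN_W = 150   # each 8-pin PCIe power connector is rated for 150 W

max_board_power = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(max_board_power)  # 375
```

That 375 W is the most the connectors are rated to supply; the shipping card's TDP can sit well below it, with the extra headroom left for overclocking.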