[H]ard|OCP used a slightly different configuration to test the new R9 Fury X: an i7-3770K on an ASUS PB287Q, as opposed to an i7-3960X and an ASUS P9X79. The SSD is slightly different but the RAM remains the same at 16GB of DDR3-1600. [H] also used the same driver as we did and ran into similar difficulties with R9 2xx cards, which is why that card was tested with the Catalyst 15.5 Beta. In The Witcher 3 the GTX 980 Ti came out on top overall, but it is worth noting the Fury's 70% performance increase over the 290X with HairWorks enabled. Their overall conclusions matched what Ryan saw; read them for yourself right here.
"We review AMD's new Fiji GPU comprising the new AMD Radeon R9 Fury X video card with stacked chip technology High Bandwidth Memory. We take this video card through its paces, make comparisons and find out what it can do for us in real world gameplay. Is this $649 video card competitive? Is it truly geared for 4K gaming as AMD says?"
Here are some more Graphics Card articles from around the web:
- AMD's Radeon R9 Fury X @ The Tech Report
- AMD R9 Fury X Review; Fiji Arrives @ Hardware Canucks
- AMD Fury X @ HardwareHeaven
- AMD Radeon R9 Fury X 4 GB @ techPowerUp
- MSI R9 390X GAMING 8G @ [H]ard|OCP
- MSI R7 370 GAMING 2G Review @ Neoseeker
- PowerColor PCS+ R9 390 8GB Review @ OCC
- PowerColor TurboDuo R9 290 4GB OC @ [H]ard|OCP
- EVGA GTX 980 Ti SC+ 6 GB @ techPowerUp
- EVGA GTX 970 SSC @ HardwareHeaven
Hello Jeremy,
I’d like to know whether or not there might be any truth to this post on reddit: http://www.reddit.com/r/pcmasterrace/comments/3b2ep8/fury_x_possibly_reviewed_with_incorrect_drivers/
Although it was Ryan who actually wrote the review, I'd like to know: if this post is true, does it impact your opinion of the Fury X?
We aren't done testing it; there are several remaining questions, with drivers being one area of investigation.
AMD debunked the “wrong driver” rumor.
https://www.reddit.com/r/hardware/comments/3b518c/amd_rep_denies_wrong_driver_rumor/
Plus or minus 5fps per game is still pretty much an even match and will get even better with driver maturity. Give it a month or two and revisit. I'm positive you will see the card beating the 980 Ti.
Fury X running near silent at 50C full load is worth the negligible fps difference. In comparison, the 980 Ti runs hot and is loud imo.
Is it silent? I have not really seen any data supporting that. I currently have a dual (CrossFire) 7970 setup that is LOUD.
If I could get the same performance out of an R9 Fury X as that, and substantially less noise, I would jump at it.
Compared to 2x 7970 it's much quieter.
http://techreport.com/r.x/radeon-r9-fury-x/noise-load.gif
Well, not by much according to this.
http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,13.html
The 980 Ti Hybrid is a totally different card, AND A FAIRER COMPARISON.
It costs about the same as a normal Ti now because of the Nvidia price drops, and it overclocks reliably to 50% over stock, so 1000MHz to 1500MHz! And it DECIMATED THE FURY X that no one can buy, lol.
Newegg confirmed it received 100 units for launch. In Australia the biggest retailer, Umart Online, hasn't even made a listing as it's not here AT ALL, lol.
Everyone is ignoring the $100-200 you’ll save with Freesync vs. GSync when you go Fury. There is value there.
Yes, but you give up frame multiplication when the framerate dips below the lower end of a monitor's VRR window. Until AMD implements this on their cards, G-Sync will remain the better solution.
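For readers unfamiliar with the term, here is a minimal sketch of the frame-multiplication idea being discussed, assuming a hypothetical 40-144Hz VRR window; the function and thresholds are purely illustrative and not taken from any vendor driver:

```python
# Illustrative sketch (assumed numbers, not vendor code): when the game's
# framerate falls below the VRR window's floor, each frame is repeated
# enough times that the panel's refresh rate lands back inside the window.

def effective_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Return the refresh rate the panel would actually run at for a given framerate."""
    if fps >= vrr_min:
        # Inside the window: refresh simply tracks the framerate.
        return min(fps, vrr_max)
    # Below the floor: multiply (repeat) frames until we are back in the window.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

if __name__ == "__main__":
    # Example: a 40-144Hz window with the game dipping to 28 fps.
    # Each frame is shown twice, so the panel refreshes at 56Hz and stays
    # inside the window instead of dropping to fixed refresh or tearing.
    for fps in (110, 45, 28, 17):
        print(fps, "fps ->", effective_refresh(fps, 40, 144), "Hz")
```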
My point is not to compare them. People will likely be looking at total outlay and will be updating monitors. $100-200 matters.
You have to compare them; the difference in the quality of the experience is well worth it. Quite a few reviews have shown that when FreeSync drops below the VRR window, the transition is quite jarring.
I don't really think that is a very valid argument. If you have the horsepower to drive a high-quality display such as an ROG Swift, it would seem to me that you shouldn't be dropping below its window very often if you've tuned your settings properly.
I've had a FreeSync monitor for about a week now (Acer XG270HU) using an R9 290 and I haven't noticed any harsh transitions, though I think I've only been playing in the range of ~60-144fps in games like GTA V and Payday 2.
Precisely. The point that seems to elude funandjam is that both technologies improve the gaming experience markedly, and thus, if you're going to have to pick a lane, the benefit with FreeSync/Fury is the $100-200 savings PER MONITOR. That is significant. funandjam, you can play the snob card here and say that it is worth it and that is your opinion, but LOTS of folks will argue that it isn't. If your goal is to run triple monitors, then you just about bought yourself a second Fury with the savings.
It *IS* an important point and I think it is negligent on the part of the reviewers to fail to bring it to the forefront.
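A quick back-of-the-envelope check of that per-monitor arithmetic, using only the figures quoted in this thread (the assumed $100-200 G-Sync premium and the $649 Fury X launch price), shows the triple-monitor claim roughly holds:

```python
# Illustrative arithmetic only; the premium and price are the thread's numbers.
gsync_premium_low, gsync_premium_high = 100, 200   # extra cost per G-Sync monitor
monitors = 3                                        # triple-monitor setup
fury_x_price = 649                                  # Fury X launch price

savings_low = gsync_premium_low * monitors          # $300
savings_high = gsync_premium_high * monitors        # $600

print(f"FreeSync saves ${savings_low}-${savings_high} across {monitors} monitors")
print(f"That covers roughly {savings_high / fury_x_price:.0%} of a second Fury X")
```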
You can pick up G-Sync monitors for under $300. The price isn't as astronomical as you're making it out to be.
Quality means a lot to some people, and most people already spending $600-1200 on a GPU and monitor alone will consider the extra $100-200 for a noticeable increase in quality. A friend of mine replaced his 290 with a GTX 970 after borrowing my 970 for just a few days. He noticed something I can't relate to; I haven't had AMD since my HD 4670, which I LOVED. Whenever I hear people defending AMD nowadays I can only think they have never tried the other side.
All those people that complain in the GeForce Forums certainly aren’t defending AMD.
Compare the AMD and Nvidia forums and Nvidia's is overflowing with complaints such as BSODs, artifacts and hard locks.
So would people that spend this kind of money care about the savings they can get when buying monitors?
That's a good point. People in this market tend to just go for it and not care about the money. However, a very friendly number such as a 100 or 200 dollar saving can sway people toward AMD's side, because they're thinking they are saving 100, 200, 300, or 400 dollars depending on the number of monitors they buy. These are very easy numbers to grasp and may have a big enough impact.