Lo and behold! The dual-Fiji card that we previously dubbed the AMD Radeon Fury X2 still lives! Based on a tweet from AMD PR dude Antal Tungler, a PC from Falcon Northwest at the VRLA convention was using a dual-GPU Fiji graphics card to power some demos.
Prototype Tiki from @FalconNW powering #htcvive with dual Fiji @AMDRadeon at the #vrla pic.twitter.com/2gCxgzucB5
— Antal Tungler (@coloredrocks) January 23, 2016
This prototype Falcon Northwest Tiki system was housing the GPU beast, but no images were shown of the system's interior. Still, it's good to see AMD at least acknowledge that this piece of hardware still exists, since it was initially promised to the enthusiast market by "fall of 2015." Back in October we had hints that the card might be coming soon, after some shipping manifests leaked out to the web.
Better late than never, right? One theory floating around inside the offices here is that AMD is going to release the Fury X2 along with the VR headsets coming out this spring, with hopes of making it THE VR graphics card of choice. The value of using multi-GPU for VR is interesting, with one GPU dedicated to each eye, though the pitfalls that could haunt both AMD and NVIDIA in this regard (latency, frame time consistency) keep the technology's real-world value up for debate.
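For what it's worth, the per-eye idea is easy to sketch out. The snippet below is a conceptual toy in plain C++, not any real driver or headset API; render_eye_on_gpu and submit_to_headset are made-up stand-ins. It only illustrates why two GPUs map naturally onto two eyes, and why the slower GPU still dictates latency and frame-time consistency.

```cpp
// Conceptual sketch of per-eye multi-GPU rendering for VR.
// render_eye_on_gpu() and submit_to_headset() are hypothetical
// stand-ins, not part of any real driver or headset API.
#include <algorithm>
#include <chrono>
#include <iostream>
#include <thread>

struct EyeFrame {
    int gpu_index;     // which GPU produced this eye's image
    double render_ms;  // simulated render time for the frame
};

// Hypothetical: pretend each GPU renders one eye and takes some time.
EyeFrame render_eye_on_gpu(int gpu_index, double simulated_ms) {
    std::this_thread::sleep_for(
        std::chrono::duration<double, std::milli>(simulated_ms));
    return EyeFrame{gpu_index, simulated_ms};
}

// Hypothetical: the headset can only present once BOTH eyes are ready,
// so the slower GPU sets the effective frame time.
void submit_to_headset(const EyeFrame& left, const EyeFrame& right) {
    double frame_ms = std::max(left.render_ms, right.render_ms);
    std::cout << "Presented stereo frame; effective frame time "
              << frame_ms << " ms\n";
}

int main() {
    EyeFrame left{}, right{};

    // One GPU per eye, rendering in parallel.
    std::thread left_thread([&] { left = render_eye_on_gpu(0, 9.5); });
    std::thread right_thread([&] { right = render_eye_on_gpu(1, 11.0); });

    // Both eyes must finish before the compositor can present,
    // which is where frame-time consistency between GPUs matters.
    left_thread.join();
    right_thread.join();

    submit_to_headset(left, right);
    return 0;
}
```

In a real engine that join point is where frame pacing and any cross-GPU work would have to be handled, which is exactly where the latency and frame-time pitfalls mentioned above live.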
Like others have stated in one of the article's comments: how the hell do they plan on cooling the damn thing?
Closed loop, most likely, just like the 295 X2 and Fury X. Why would they do something different?
The dual Fiji card is rumoured to have a TDP 75W less than the R9 295 X2's, so AMD's existing cooling solutions should be more than sufficient.
Correct me if I'm wrong here, but I was led to believe the Fury X2 is going to have a TDP around the 375W area, which is 125W lower than the 295X2, not 75W.
OK you are corrected 🙂
From the following article and several others, it appears the TDP of the R9 295X2 stated by AMD was 450W, and it actually measured slightly lower than that during testing. I think the 500W value came from the card's cooling solution, which is rated as being able to cope with a 500W TDP.
http://www.tomshardware.co.uk/graphics-card-power-supply-balance,review-33071-5.html
In an AnandTech article back in December, AMD gave a quote saying they had delayed dual Fiji to align with the launch of VR headsets.
With the Crossfire performance the FURY cards have shown and Async performance for reduced lag, I have to agree the Fury X2 looks to be the PERFECT card for VR.
Maybe not for desktop use, but certainly for VR, where 2 GPUs is actually better than 1!
'Cause nothing says PC master race like a $1,200 graphics card in a system hooked up to a $600 peripheral.
Bwahahaha
Nonono… $1,200 ppppfffffftttt! Cheapskate! 2x Titans for $2,000!
And people think Apple stuff is too expensive, ROFL!
If Arctic Islands weren’t coming out this summer I’d have definitely got a Fury X.
I don’t know about the Fury X2 as some of the main games that I play don’t optimize for multi-GPU configs, but I’d love to see what kind of power this beast has when it’s launched.
Will the Fury still be the top of AMD's line-up after the release of the 14 nm parts? I kind of expected that they would release low- and mid-range parts this year, but I was expecting the mid-range die to come close to the current top end due to the jump in process tech. I guess even if the Fury is outperformed by upcoming parts, the dual Fury may still be on top. Even after the release of the Fury, it was often outperformed at a similar price by the Radeon 295×2.
I wouldn’t pick up a dual-GPU card or setup for the initial crop of VR titles: simply nobody outside of Nvidia (and now AMD) has demonstrated an actual functioning VR dual-GPU demo. And unlike with current SLI/Crossfire, dual-GPU cannot be ‘retrofitted’ for VR; it needs to be designed in and optimised by the engine and game developers to ensure job dispatch is performed in a way that actually reduces latency rather than increasing it. That’s going to take time.
I will pass until GPUs move down to 14 nm parts and HBM2. I like what AMD are trying to do with GPUs.
Maybe in fall 2016, when I can put one in a custom water loop.
Shouldn’t dual GPU for VR be relatively easy? One GPU, one screen. Admittedly I know nothing about how the software side works.
Come to me.)))
Nice card, I'm in love.)