Synthetics and Closing Thoughts
Our synthetic 3DMark scores show impressive scaling: 87% in Fire Strike Extreme and 91% in Fire Strike Ultra. Those are very respectable results, and AMD usually does very well in Futuremark tests.
Unigine Heaven scales at 88% for the Radeon Vega Frontier Edition cards in CrossFire, though that is barely enough of an increase to overtake the single GeForce GTX 1080 Ti.
Closing Thoughts
The only test from the professional results page of our single-GPU review that scales with multiple GPUs is LuxMark. Our score with the pair of Radeon Vega Frontier Edition graphics cards was 10042, a 2.14x improvement. (Note that SPECviewperf and Cinebench do not utilize multiple GPUs in their benchmarks.)
As I stated at the beginning of this story, the performance measurements we see here today should really be taken as the "current state" of CrossFire and multi-GPU for the Vega architecture. There will likely be performance and scaling improvements in CrossFire by the time we see Radeon RX Vega in gamers' hands. To what degree won't be known until launch and the drivers that follow, but AMD has previously touted its dedication to multi-GPU as a major part of the future of graphics in an interview I did with Raja Koduri last year.
The current state of CrossFire for Vega isn't impressive. Scaling at 2560×1440 is abysmal, with only The Witcher 3 gaining more than 20% from the addition of the second card. At 4K we see better results, ranging from 54% to 84% depending on the game. Hitman in DX12 was the outlier, though, with zero scaling, indicative of the continued concerns over the new API and the pressure it puts on developers going forward.
Even where scaling works when looking at average frame rates, the frame time variance of many of our results is higher than we want to see. Take The Witcher 3 as the perfect example: we see great scaling from a single Radeon Vega Frontier Edition to two in CrossFire, but at the same time frame-to-frame variance climbs well above 10ms at the 95th percentile. That is a straight-up bad experience, and though it is front-loaded in our Witcher 3 testing scenario, it was continuous in Dirt Rally and GTA V. As we stated at the outset, getting multi-GPU scaling correct is extremely difficult.
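For readers curious how the two metrics above are derived, here is a minimal sketch of the arithmetic: scaling percentage from average frame rates, and frame-to-frame variance at the 95th percentile from a log of per-frame times. The numbers fed in are purely illustrative, not our measured data.

```python
def scaling_percent(single_gpu_fps, dual_gpu_fps):
    """Extra performance gained by adding the second card, in percent."""
    return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

def p95_frame_variance(frame_times_ms):
    """95th percentile of the absolute frame-to-frame time deltas, in ms.

    Large values here mean stutter even when the average frame rate
    looks healthy, which is exactly the CrossFire issue described above.
    """
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    idx = min(len(deltas) - 1, int(0.95 * len(deltas)))
    return deltas[idx]

# Hypothetical example: 60 FPS single card, 104 FPS in CrossFire.
print(round(scaling_percent(60.0, 104.0), 1))  # prints 73.3
```

A steady 60 FPS log has deltas near zero, while an alternating fast/slow frame pattern pushes the 95th-percentile delta up sharply, which is why average FPS alone can hide a bad experience.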
We are early in the CrossFire process for Vega, but it's worth pointing out the position that NVIDIA has put AMD in. With two Radeon Vega Frontier Edition cards in CrossFire, AMD is struggling to keep pace with a single GeForce GTX 1080 Ti. That presents problems for AMD in terms of pricing and positioning, and it leaves NVIDIA in a dominant spot, able to hold serve and maintain performance leadership for the foreseeable future.
We still have several weeks before we know for certain how Vega will affect and shift the high-end GPU market, but our look at the Vega Frontier Edition gives us a peek into that future.
- Also read: AMD Radeon Vega Frontier Edition Full Review
Wow, optimized for Xfire and gaming or not, TWO of these things still get destroyed by a 1080 Ti! Vega is off to a fugly start. Glad I didn't wait (and wait… and wait…)
While the design is similar, you have to look at the intended use case. Vega FE and RX Vega are going to be two different graphics cards: Vega FE is a pro-level card you can play games on, while RX Vega is going to be a top-performing gaming card. I'm not saying I have any info anyone outside of AMD doesn't, but I'm looking forward to seeing what RX Vega is actually capable of.
I am hoping to buy the Vega FE for my 2009 Mac Pro for editing with FCPX; we all know a Mac running FCPX destroys any Windows PC on render times. I'm currently running a Maxwell Titan X, as they have good OpenCL performance, and I have upgraded my cameras to record 10-bit 4:2:2. This plays perfectly smoothly in my 8.5-year-old Mac Pro at 4K even before it's optimized to ProRes, so I have been hoping to see some real OpenCL tests of this workstation card.
Everyone seems to be testing it for gaming. WTF? It's like buying a van and testing it against hot hatches, and worse still, testing it with the DX11 API, the worst API ever. Anyone coding for DX11 in 2017 needs shooting; DX11 uses a single core, and how many people have dual-core CPUs now?
I currently run 4 networked computers: a 12-core (24 HT) heavily modded 2009 Mac Pro, an 8-core (16 HT) full open-loop water-cooled 5960X OC'd to 4.4GHz on X99, a 4-core (8 HT) 4790K OC'd to 4.6GHz, and a 4-core (8 HT) 2600K OC'd to 4.4GHz. They are all networked and connected to multiple 50″ and 65″ Panasonic 4K THX-certified smart TVs, which cover 99.3% of the sRGB colour gamut at default. I also have an ASUS MG279Q 2K IPS 144Hz gaming monitor for times when I'm not working.
But the AMD Vega Frontier Edition used with Final Cut Pro will be a monster for us professional filmmakers and broadcasters. We know Apple plans to use Vega graphics in the iMac Pro, scheduled for release in December and running the latest High Sierra OS, so it looks like the drivers will be there to run it on my 2009 Mac Pro with High Sierra as well.
Enjoy your gaming. I don't really game that much, but please test Vega on the DX12 and Vulkan games it's meant to work with, not the poorly optimized DX11 API. And that new Tomb Raider title is a disaster; whoever pretended they coded that for DX12 should give the money back and never work again. There is a distinct difference between something working with DX12 and being optimized for it.
You didn't undervolt them. You can expect smoking-hot cards like that to downclock because of heat. I expect you were running 500MHz HBM2 speeds just about all the time.