The Witcher 3 (DirectX 11)
Played in a third-person perspective, players control protagonist Geralt of Rivia, a monster hunter known as a witcher, who sets out on a long journey through the Northern Kingdoms. In the game, players battle against the world's many dangers using swords and magic, while interacting with non-player characters and completing side quests and main missions to progress through the story. The game was met with critical acclaim and was a financial success, selling over 6 million copies in six weeks. The game won multiple Game of the Year awards from various gaming publications, critics, and game award shows, including the Golden Joystick Awards, The Game Awards, Game Developers Choice Awards, and SXSW Gaming Awards. –Wikipedia
Settings used for The Witcher 3
CrossFire scaling in The Witcher 3 is solid – 42% at 2560×1440 and 75% at 4K. At 4K this gives the Vega pair a performance advantage over the GTX 1080 Ti of 4% or so. That being said, frame time variance is…a problem. At 2560×1440 we see 3.5ms of variance at the 95th percentile, which is a noticeable delta when frame times are under 10ms. But at 4K that jumps to 8.5ms at the 95th percentile!
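For readers curious how figures like these are derived, here is a minimal Python sketch. The function names and the interpretation of "variance at the 95th percentile" (read here as the 95th-percentile frame-to-frame delta) are assumptions for illustration; the review's actual frame-rating pipeline may compute it differently.

```python
import math

def crossfire_scaling(single_gpu_fps, dual_gpu_fps):
    # Percent gained by adding the second card:
    # e.g. 100 fps -> 142 fps is +42% scaling.
    return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

def percentile(sorted_vals, p):
    # Nearest-rank percentile of an already-sorted list.
    rank = math.ceil(p / 100.0 * len(sorted_vals))
    return sorted_vals[max(0, rank - 1)]

def frame_time_variance_95th(frame_times_ms):
    # One plausible reading of "variance at the 95th percentile":
    # the 95th-percentile absolute frame-to-frame delta, in ms.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return percentile(sorted(deltas), 95)
```

On a capture where most frames take 10ms but occasional frames spike, `frame_time_variance_95th` surfaces the size of those spikes even when the average FPS looks healthy, which is exactly why the average-FPS table below tells only part of the story.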
Radeon Vega FE CrossFire, Average FPS Comparisons, The Witcher 3

| Resolution | Vega Frontier Edition | GTX 1080 Ti | GTX 1080 | Fury X |
|---|---|---|---|---|
| 2560×1440 | +42% | -13% | +18% | +64% |
| 3840×2160 | +75% | +4% | +46% | |
This table presents the above data in a more basic way, focusing only on the average FPS, so keep that in mind.
Wow, optimized for Xfire and gaming or not, TWO of these things still get destroyed by 1080 Ti! Vega is off to a fugly start. Glad I didn’t wait (and wait…and wait…)
While the design is similar, you have to look at the intended use case. Vega FE and RX Vega are going to be two different graphics cards. Vega FE is a pro-level card you can play games on; RX Vega is going to be a top-performing gaming card. I'm not saying I have any info anyone outside of AMD doesn't, but I'm looking forward to seeing what RX Vega is actually capable of.
I am hoping to buy the Vega FE for my 2009 Mac Pro for editing with FCPX. We all know a Mac running FCPX destroys any Windows PC on render times. I'm currently running a Maxwell Titan X, as they have good OpenCL performance, and I have upgraded my cameras to record 10-bit 4:2:2. This plays perfectly smoothly in my 8 1/2-year-old Mac Pro at 4K even before it's optimised to ProRes, so I have been hoping to see some real OpenCL tests of this workstation card.
Everyone seems to be testing it for gaming, wtf. It's like buying a van and testing it against hot hatches, and worse still testing it with DX11, the worst API ever. Anyone coding for DX11 in 2017 needs shooting; DX11 uses a single core, and how many people have only dual-core CPUs now?
I currently run 4 networked computers: a 12-core (24 HT) heavily modded 2009 Mac Pro; an 8-core (16 HT) full open-loop water-cooled 5960X OC'd to 4.4 GHz on X99; a 4-core (8 HT) 4790K OC'd to 4.6 GHz; and a 4-core (8 HT) 2600K OC'd to 4.4 GHz. They are all networked and connected to multiple 50″ and 65″ Panasonic 4K THX-certified smart TVs, which cover 99.3% of the sRGB colour gamut at default. I also have an ASUS MG279Q 2K IPS 144Hz gaming monitor for times when I'm not working.
But the AMD Vega Frontier Edition used with Final Cut Pro will be a monster for us professional film makers and broadcasters. We know Apple plan to use Vega graphics in the iMac Pro, scheduled for release in December, running the latest High Sierra OS, so it looks like the drivers will be there to run it on my 2009 Mac Pro with High Sierra as well.
Enjoy your gaming. I don't really game that much, but please test Vega on the DX12 and Vulkan games it's meant to work with, not the poorly optimised DX11 API. And that new Tomb Raider title is a disaster; whoever pretended they coded that for DX12 should give the money back and never work again. There is a distinct difference between something working with DX12 and being optimised for it.
You didn't undervolt them. You can't expect smoking-hot cards like that not to downclock because of heat. I expect you were running 500MHz HBM2 speeds just about all the time.