Hitman (2016) (DirectX 12)
Hitman is a third-person stealth video game in which players take control of Agent 47, a genetically enhanced, superhuman assassin, travelling to international locations and eliminating contracted targets. As in other games in the Hitman series, players are given a large amount of room for creativity in approaching their assassinations.[5] For instance, players may use long-range rifles to snipe a target from a distance, or they may assassinate the target at close range with blade weapons or garrote wire. Players can also use explosives, or disguise the assassination by staging a seemingly accidental death. –Wikipedia
Settings used for Hitman
CrossFire doesn’t scale in Hitman under DX12: there is no change in performance, positive or negative.
Radeon Vega FE CrossFire, Average FPS Comparisons, Hitman

| Resolution | vs. Vega Frontier Edition | vs. GTX 1080 Ti | vs. GTX 1080 | vs. Fury X |
|---|---|---|---|---|
| 2560×1440 | -1% | -30% | -15% | +25% |
| 3840×2160 | +0% | -33% | -16% | |
This table presents the above data in a more basic way, focusing only on average FPS, so keep that in mind.
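As a quick illustration of how percentages like these relate to the underlying averages, the sketch below computes the same kind of relative deltas from raw FPS numbers. All of the FPS values in it are made-up placeholders for illustration, not measurements from this review.

```python
# Minimal sketch: derive "vs. card X" percent deltas from average FPS.
# Every FPS value here is a hypothetical placeholder, not measured data.
baseline_fps = {
    "Vega Frontier Edition": 90.0,
    "GTX 1080 Ti": 128.0,
    "GTX 1080": 105.0,
    "Fury X": 72.0,
}
vega_fe_crossfire_fps = 89.0  # hypothetical CrossFire average FPS

for card, fps in baseline_fps.items():
    # Positive means the CrossFire config is faster than the comparison card.
    delta = (vega_fe_crossfire_fps - fps) / fps * 100
    print(f"vs. {card}: {delta:+.0f}%")
```

A near-zero delta against the single Vega Frontier Edition is exactly what "CrossFire doesn't scale" looks like in this arithmetic.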
Wow, optimized for Xfire and gaming or not, TWO of these things still get destroyed by a 1080 Ti! Vega is off to a fugly start. Glad I didn’t wait (and wait… and wait…)
While the design is similar, you have to look at the intended use case. Vega FE and RX Vega are going to be two different graphics cards. Vega FE is a pro-level card you can play games on; RX Vega is going to be a top-performing gaming card. I’m not saying I have any info anyone outside of AMD doesn’t, but I’m looking forward to seeing what RX Vega is actually capable of.
I am hoping to buy the Vega FE for my 2009 Mac Pro for editing with FCPX. We all know a Mac running FCPX destroys any Windows PC on render times. I’m currently running a Maxwell Titan X, as they have good OpenCL performance, and I have upgraded my cameras to record 10-bit 4:2:2, which plays perfectly smoothly on my 8½-year-old Mac Pro at 4K even before it’s optimised to ProRes. So I have been hoping to see some real OpenCL tests of this workstation card.
Everyone seems to be testing it for gaming, wtf. It’s like buying a van and testing it against hot hatches, and worse still, testing it with the DX11 API, the worst API ever. Anyone coding for DX11 in 2017 needs shooting; DX11 is basically single-core, and how many people have dual-core CPUs now?

I currently run 4 networked computers: a 12-core (24 HT) heavily modded 2009 Mac Pro; an 8-core (16 HT) full open-loop water-cooled 5960X OC 4.4 GHz on X99; a 4-core (8 HT) 4790K OC 4.6 GHz; and a 4-core (8 HT) 2600K OC 4.4 GHz. They are all networked and connected to multiple 50″ and 65″ Panasonic 4K THX-certified smart TVs, which cover 99.3% of the sRGB colour gamut at default. I also have an ASUS MG279Q 2K IPS 144 Hz gaming monitor for times when I’m not working.

But the AMD Vega Frontier Edition used with Final Cut Pro will be a monster for us professional film makers and broadcasters. We know Apple plans to use Vega graphics in the iMac Pro, scheduled for release in December, running the latest High Sierra OS, so it looks like the drivers will be there to run it on my 2009 Mac Pro with High Sierra as well.

Enjoy your gaming. I don’t really game that much, but please test Vega on the DX12 and Vulkan games it’s meant to work with, not the poorly optimised DX11 API. And that new Tomb Raider title is a disaster; whoever pretended they coded that for DX12 should give the money back and never work again. There is a distinct difference between something working with DX12 and being optimised for it.
You didn’t undervolt them. You can expect smoking-hot cards like that to downclock because of heat. I expect you were running 500 MHz HBM2 speeds just about all the time.
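For anyone who wants to check this kind of memory-clock throttling on an AMD card under Linux, here is a minimal sketch that reads the amdgpu driver’s DPM state file from sysfs. The "card0" index is an assumption that may differ per system, and this is purely an illustration of how to spot a card stuck at its lowest HBM2 state, not a vetted tool.

```python
from pathlib import Path

# Minimal sketch (assumes Linux with the amdgpu driver; "card0" may differ
# on your system). pp_dpm_mclk lists the available memory-clock DPM states,
# with a trailing '*' on the state the GPU is currently in.
mclk = Path("/sys/class/drm/card0/device/pp_dpm_mclk")

states = mclk.read_text().splitlines()
for state in states:
    print(state)

# If the '*' never leaves the lowest state (e.g. "0: 500Mhz *") under load,
# the HBM2 is throttled exactly as described above.
active = next((s.strip() for s in states if s.rstrip().endswith("*")),
              "none (no '*' marker found)")
print("active memory state:", active)
```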