You are probably wondering what kind of performance you will see when you run the new Deus Ex after you purchase it, since obviously you did not pre-order the game. TechPowerUp has you covered, as they have tested the retail version of the game with a variety of cards to give you an idea of the load your GPU will be under. They started out testing memory usage with a Titan; running Ultra settings at 4K used up to 5.5GB of memory, so mid-range cards will certainly suffer at that point. Since not many of us are sporting Titans in our cases, they also tried out the GTX 1060, 980 Ti, and 1080, along with the RX 480 and Fury X, at a variety of settings. Read through their review to get a rough estimate of your expected performance in Mankind Divided.
"Deus Ex Mankind Divided has just been released today. We bring you a performance analysis using the most popular graphics cards, at four resolutions, including 4K, at both Ultra and High settings. We also took a closer look at VRAM usage."
Here is some more Tech News from around the web:
- Dawn of War 3: The most promising take on Warhammer 40K yet @ Ars Technica
- Total Warhammer: Grim & The Grave DLC Announced @ Rock, Paper, SHOTGUN
- Waaagh! WH40K: Armageddon – Da Orks Released @ Rock, Paper, SHOTGUN
- AMD and Nvidia tempt customers with new game bundles @ HEXUS
- How To Skip Deus Ex: Mankind Divided Intro Videos @ Rock, Paper, SHOTGUN
- Playerkind Divided – How’s Deus Ex Running For You? @ Rock, Paper, SHOTGUN
- Ubisoft showcases Watch Dogs 2, For Honor, The Crew and more @ HEXUS
- Premature Evaluation: Rogue System @ Rock, Paper, SHOTGUN
- Divinity: Original Sin 2 Smartly Reinvents The RPG Party @ Rock, Paper, SHOTGUN
- Dishonored 2 Gamescom Trailer Shows Emily’s Skillz @ Rock, Paper, SHOTGUN
Notice how the 980 Ti is lagging way behind? Seems like the same thing happened to me the last time I went to the green team. All of a sudden your card gets real slow when the new series comes out. My experience with the red team has been the opposite: the card seems to get faster over time as the driver matures.
The 980 Ti lagging probably has nothing whatsoever to do with this title being an AMD Gaming Evolved title. Meaning it's highly optimized for AMD, with little to no optimization for Nvidia. Same thing AMD fanboys accuse Gameworks of doing. We'll see what the benchmarks look like a few weeks from now.
There is one difference between a Gaming Evolved title and a Gameworks title…
AMD just has a promotional agreement with the game's publisher. AMD does not prevent the developers from working and coding with the competition; Gameworks contracts DO prevent developers from working with AMD.
As publicly stated by Witcher 3 developers CD Projekt RED.
Gameworks is a licensing issue. CDPR could not optimize Hairworks for AMD hardware because AMD did not pay a licensing fee to get access to Nvidia Gameworks. But apart from that, Nvidia did not prevent AMD from working directly with the developer. Don't twist what CDPR said about Hairworks to suit your own logic. Even AMD said they had been working with CDPR since the very beginning of the game's development. If AMD were not allowed to work with CDPR, then AMD would not have been able to override tessellation use in The Witcher 3, because doing that requires permission from, and cooperation with, the game developer to make the setting overridable from CCC. Do you think AMD can simply override the tessellation setting in any game without the developer's permission? Case in point: AMD complained about Batman: Arkham Origins regarding tessellation override because the developer did not allow them to change the tessellation setting being used.
You left out the part where Nvidia revised the Gameworks DLL libraries the game uses just before the game's launch, to make sure AMD did not have time to profile the game.
But they did not outright prevent developers from working with AMD like some people claim. And do you think Nvidia is the only one doing this sort of thing? Tomb Raider 2013 says hi: Nvidia did not have access to the final build of the game, causing poor performance with TressFX. And with Dragon Age 2 (another AMD title), Nvidia did not have access to the game build at all until the game officially came out.
How about the recent Total War: Warhammer game? Early benchmarks had both AMD and Nvidia with good performance.
http://www.pcgamer.com/total-war-warhammer-benchmarks-strike-fear-into-cpus/
AMD scoops it up two months before launch, and guess what happened: Nvidia performance suddenly regressed. AMD wouldn't do that, would they?
Compared to what? Looking at the Guru3D bench for the game (which includes the 1070), the GTX 980 Ti performs about the same as the 1070 in this game, so there is definitely no "lagging behind" like you think. The Fury X is expected to have a bit of an advantage here, since the game uses a modified Glacier 2 engine (Hitman), which heavily favors AMD's GCN architecture.
Lagging behind? Ha..
Custom 980 Ti beats Custom 1070:
http://www.pcgameshardware.de/Deus-Ex-Mankind-Divided-Spiel-55470/Specials/Benchmarks-Test-DirectX-12-1204575/
I pre-ordered and I'm really glad I did. This game is awesome!
Benchmarks are all over the place. Why is that?
Most test systems are either Skylake-based or BW-E; it's hard to imagine why there is such huge fluctuation in performance in a canned bench between different review sites.
Also, seeing GPUs performing worse in DX12 compared to DX11 is just sad. Can we just call it quits and run everything on Vulkan, where both IHVs see performance gains?
Some differences can be based on the systems used by different reviewers.
For example, some companies will have a clean image with everything up to date, and no dedicated GPU ever installed (everything done on the onboard video with whatever generics windows provides).
That image is then loaded before each new GPU is tested.
If multiple cards will be tested, the same image can be loaded onto, say, four different SSDs, and they simply swap the SATA cable and load a different card.
Then there are others who simply swap cards, uninstalling the AMD/ATI drivers and installing the Nvidia drivers when switching; or, on a new build, they just load the stock motherboard driver CD (which tends to also come with some massively unoptimized AV trial or other crap that negatively impacts performance).
As for DX12 versus DX11, these companies will have to pick a single API and focus entirely on it in order to improve performance.
I’m definitely waiting for the DX12 version for the full experience.
Interesting to see the RX 480 cream the GTX 1060 here, and this isn't even DX12 (yet).
The game uses a modified version of the Glacier 2 engine from Hitman. It is well known how that engine favors the GCN architecture, so those who were aware of this have been expecting this outcome since long before the game came out.
That's a pointless statement. What happens when they use it in another game, and another? What it really means is that the playing field is changing and Nvidia's architecture is being left in the dust.
Nothing has really changed. It's just another game engine that favors GCN, just like the engine used in Project CARS favors Nvidia's architecture more.