We spent a couple of days with Shadow of Mordor to see how the latest GPUs from NVIDIA and AMD compare.
In what can most definitely be called the best surprise of the fall game release schedule, Middle-earth: Shadow of Mordor, the open-world action game set in the Lord of the Rings universe, has been receiving impressive reviews from gamers and the media. (GiantBomb.com has a great look at it if you are new to the title.) What also might surprise some is that the PC version of the game can be quite demanding on even the latest PC hardware, pulling in frame rates only in the low 60s at 2560×1440 with its top quality presets.
Late last week I spent a couple of days playing around with Shadow of Mordor as well as the integrated benchmark found inside the Options menu. I wanted to get an idea of the performance characteristics of the game to determine whether we might include it in the update to our full-time game testing suite that we are planning for later in the fall. To get some sample information, I decided to run through a couple of quality presets with the top two cards from NVIDIA and AMD and compare them.
Testing Notes
Without a doubt, the visual style of Shadow of Mordor is stunning – with the game settings cranked up high, the world, characters, and fighting scenes look and feel amazing. To be clear, in the build-up to this release we had really not heard anything from the developer or NVIDIA (there is an NVIDIA splash screen at the beginning) about the title, which is out of the ordinary. If you are looking for a game that is both fun to play (I am 4+ hours in myself) and can provide a “wow” factor to show off your PC rig, then this is definitely worth picking up.
Let’s talk about how I tested. All of our benchmark results here are based on the in-game benchmark mode, which runs for about 45 seconds and does a fly-by over a portion of the map with characters moving around, interacting, explosions, rain effects, etc. However, we are NOT using the in-game benchmark’s actual frame rate results reported at the end of each test. Those results were incredibly variable with our AMD test configurations (using both Catalyst 14.9 and 14.7 RC3 drivers), and at first I thought this would force us to eliminate the game from the test suite possibilities, until I looked at our Frame Rating, capture-based performance results: they were nearly perfectly consistent. Clearly something in the game’s benchmark mode that counts frames and frame times doesn’t sit well with AMD’s current driver and hardware, but the actual gameplay experience was just fine. The frame rate graph in the upper right corner of the screen during the benchmark also shows a saw-tooth pattern when running on AMD cards, which would indicate variable frame times; again, that didn’t show up in our capture-based performance testing tools.
Our standard GPU test bed:
- Core i7-3960X
- ASUS Rampage IV Extreme X79
- 16GB DDR3-1600
- GeForce GTX 980 Reference (344.16)
- ASUS R9 290X DirectCU II (14.9, 14.7 RC3 spot checked)
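As a side note for anyone curious how a capture-based analysis can disagree with an in-game frame counter, here is a minimal sketch (a toy Python example of ours, not the actual Frame Rating pipeline) of reducing per-frame timestamps to frame times plus a jitter figure; a saw-tooth run can post the same average FPS as a smooth one while its frame-to-frame jitter gives it away:

```python
# Toy example: reducing capture timestamps to frame times and jitter.
# This is NOT the actual Frame Rating pipeline, just the basic idea.

def frame_times_ms(timestamps_s):
    """Per-frame presentation timestamps (seconds) -> frame times (ms)."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps_s, timestamps_s[1:])]

def summarize(timestamps_s):
    ft = frame_times_ms(timestamps_s)
    avg = sum(ft) / len(ft)
    # Saw-tooth behavior shows up as large frame-to-frame swings even
    # when the average frame rate looks perfectly healthy.
    jitter = sum(abs(b - a) for a, b in zip(ft, ft[1:])) / (len(ft) - 1)
    return {"avg_fps": round(1000.0 / avg, 1),
            "avg_frame_time_ms": round(avg, 2),
            "frame_to_frame_jitter_ms": round(jitter, 2)}

# A steady 60 FPS run vs. a saw-tooth run with the same average frame rate.
steady = [i / 60 for i in range(301)]
sawtooth, t = [0.0], 0.0
for i in range(300):
    t += 1 / 45 if i % 2 == 0 else 1 / 90  # alternating ~22.2 ms / ~11.1 ms
    sawtooth.append(t)

print(summarize(steady))    # ~60 FPS, jitter near zero
print(summarize(sawtooth))  # same ~60 FPS average, high jitter
```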
There is a chance that the variable frame rates we saw with the AMD Radeon R9 290X and R9 290 are an indication of something else funky happening between the game and AMD’s driver. And it’s possible that might mean there are other problems that haven’t been exposed quite yet, but for now I am confident that you will have a solid gaming experience with both GeForce and Radeon cards on Shadow of Mordor.
When looking at our benchmarks you might note that we do not include multi-GPU results. Shadow of Mordor doesn’t work correctly with NVIDIA SLI or AMD CrossFire yet, though I am told that the developer is working on a game update to fix that. I’ll circle back at that point to see what changes. I should also note that some intrepid NVIDIA users have found some success forcing the F.E.A.R. 3 SLI profile onto Shadow of Mordor, but when asked about that option, NVIDIA stated they would not recommend it, as it would very likely lead to graphical artifacts and crashing.
Finally, I spent a lot of time with the game making sure that the in-game benchmark was representative of the performance curves seen throughout the game, and it appears to be. There have definitely been a handful of areas in the game, given its very dynamic nature, where the on-screen count of Orcs and Uruks has been extreme and dragged down performance more than you’ll see in the benchmark. However, repeating those situations is basically impossible due to the dynamic AI at work in the game and its auto-save functionality after each encounter.
Very High Preset
Ultra Preset
The game was tested at both 2560×1440 and 3840×2160 and at both the Very High and Ultra quality presets. The Ultra preset improves on the Very High settings by increasing shadow quality from High to Ultra, increasing texture quality from High to Ultra, and bumping ambient occlusion from Medium to High. If you are planning to test your hardware against our results, note that the settings menu can be finicky, sometimes not saving your settings the first time; you may have to enter the game, set the preset, exit the game, and restart once again. The Ultra preset is enabled through an HD content download pack (at least through Steam) that includes much higher resolution textures. The recommended specifications for that HD content actually ask for a card with a 6GB frame buffer, but at 2560×1440 a single 4GB GTX 980 was able to average more than 60 FPS at the Ultra preset.
Is it safe to assume that the wider memory bus is what helps the AMD chip get better performance in this particular game?
I would say so. This shows up across different benches.
Definitely. The AMD 290X is a better performer than my 780 Ti at higher resolutions. The 256-bit bus is the limiting factor here for the GTX 980; they are having to compress memory traffic to get more out of that 256-bit bus, and at higher resolutions (4K, 8K…) that algorithm can’t fix the hardware’s 256-bit bus limitations.
Your theory does not stand, because the 290 has the same memory bus bandwidth as the 290X but definitely lower FPS.
Nvidia has used this game to demonstrate their cards, but it turns out AMD’s year-old cards have the upper hand.
And to complete the demonstration, the R9 290 has better memory bus bandwidth than the GTX 980 but lower overall performance. So the FPS is limited by the core chip and not the memory bus.
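For reference, here is a quick back-of-the-envelope check of the bandwidth figures being debated, computed from the published reference specs (our arithmetic, not from the article): bandwidth is the bus width in bytes per transfer times the effective memory data rate. It confirms both points above: the 290 and 290X share the same bandwidth, both exceed the GTX 980, and the 780 Ti sits in the same bandwidth class as the Hawaii cards.

```python
# Memory bandwidth = bus width (bytes per transfer) x effective data rate.
# Figures below are the published reference specs for each card.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("GTX 980     (256-bit, 7.0 Gbps)", 256, 7.0),  # 224 GB/s
    ("GTX 780 Ti  (384-bit, 7.0 Gbps)", 384, 7.0),  # 336 GB/s
    ("R9 290/290X (512-bit, 5.0 Gbps)", 512, 5.0),  # 320 GB/s
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
```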
Well, I just revised my post and the benches on SoM today, and I have to admit that the bus seems to be an important limiting factor, and not the compute part as I first thought.
I cannot edit, so here is another post.
Nvidia increased their raster engines and cut back on compute, which is more oriented toward games: triangles and textures (rasters). That means a card like the GTX 980, which now has the same number of ROPs (raster operation pipelines) as the R9 290 and 290X but far fewer compute cores, can clock higher because fewer transistors produce less heat. That is about 1 billion fewer transistors, made possible by the much lower CUDA core count. Fewer transistors and less heat mean higher frequency potential: 1126MHz+ for the 980 with the same number of ROPs. This is where I think the 980 and 970 have the edge in other games.
This can explain the difference in performance, and I just saw something interesting looking through the specs: the 290 and 290X have 160 and 176 TMUs (texture mapping units) respectively, compared to 128 for the GTX 980, which might help explain even more of the discrepancy in a game like SoM.
So the bus, combined with the texture units on the chip, seems like the most important limiting factor.
And looking back at apexitt’s post, the 290X would be a better performer at higher resolutions, but the 780 Ti has the same kind of bus and TMU capacity and is only lacking in ROPs, which are important for processing textures onto triangles.
I was expecting more, graphically speaking, from this game. Oblivion and Skyrim, in my opinion, are more realistic and gorgeous than this.
“realistic”
… I think your fedora is on too tight, cutting off blood flow and stuff
tell me you’re joking
If I’m understanding you correctly, Ryan, G-Sync is really only tangible at 4K currently given the lower FPS. Any other scenario would be like V-Sync on?
no, that’s retarded…
It’s tangible for everything, 0-1000Hz, at all resolutions, VR; everything will be MASSIVELY better with G-Sync (or FreeSync, IF it’s just as good)
It has been stated by Tom that G-Sync will be disabled below 30Hz.
Anything below 30fps or above the maximum refresh of the monitor will be like normal V-Sync (stutter). Currently, the only G-Sync-enabled monitors in the wild are:
4K @ 60Hz max
1440p @ 144Hz max
1080p @ 144Hz max
Once we get DisplayPort 1.3 4K monitors and GPUs, we will see 4K @ 120Hz. That’s the technology to wait for…
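To make the window being debated here concrete, below is a rough Python model of the behavior as these comments describe it (an illustration only, not NVIDIA’s documented implementation; as the reply below notes, what actually happens beneath the window depends on your V-Sync setting):

```python
# Toy model of the variable-refresh window as described in this thread.
# Real driver behavior may differ; this only illustrates the claims above.
def display_mode(fps, max_refresh_hz, min_refresh_hz=30):
    if fps < min_refresh_hz:
        # Below the window: fixed-refresh fallback; stutter or tearing
        # depending on whether V-Sync is enabled (see the replies).
        return "below window: fixed-refresh fallback"
    if fps > max_refresh_hz:
        return "above window: capped at max refresh (V-Sync-like)"
    return "inside window: refresh tracks frame rate (G-Sync active)"

for fps in (24, 45, 60, 100, 150):
    print(f"{fps:>3} FPS on a 144Hz panel -> {display_mode(fps, 144)}")
```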
Assuming you are the same person from earlier, you just restated what I had already said. I’m not entirely sure where your point of disagreement is because it seems as if we agree. How is my original post “retarded”?
No, it’s not like V-Sync when you hit the 30fps threshold, because you have to disable V-Sync to use G-Sync; so it won’t stutter, it will be prone to tearing.
FYI, if you force AFR mode 2 in the NVIDIA control panel you can run Shadow of Mordor with SLI. Probably not perfectly optimized, but still a clear FPS win over one card running.
I can get this game to run with my CrossFireX + Eyefinity set-up, but ONLY in the benchmarking tool. When I actually load the game it crashes every time before completing the load.
Really fun game so far though, and it still manages to run at acceptable frame rates with only one R9 280.
What test system and drivers were used?
Sorry, added it on the first page. Standard GPU test bed for us, drivers were 344.16 for NVIDIA and 14.9 for AMD.
There is a chance that the variable frame rates we saw with the AMD Radeon R9 290X and R9 290 are an indication of something else funky happening between the game and AMD’s driver. And it’s possible that might mean there are other problems that haven’t been exposed quite yet, but for now I am confident that you will have a solid gaming experience with both GeForce and Radeon cards on Shadow of Mordor.
Nvidia GameWorks shenanigans?
To be clear, in the build up to this release we had really not heard anything from the developer or NVIDIA (there is an NVIDIA splash screen at the beginning) about the title which is out of the ordinary.
Keeping a low profile to minimize backlash. lol
They probably have a deal with the PC version since the console versions are all running on AMD hardware.
I think pcper uses reference 290/290X cards. They will clock down if they reach 94C.
The game was showcased at the recent Nvidia Game24 event for the launch of the 900 series.
“However, we are NOT using the in-game benchmarks actually frame rate results reported at the end of each test”
please fix to “actual”
In summary: this is kind of an amazing accidental “win” for AMD. I know people out there upgraded to the new nVidia 980 specifically for this game. If they had waited for these stunning tests from Ryan, they’d be a whole lot of cash better off, with better performance.
Until the game and drivers are updated to compensate…
By which time they’d be done with the game.
I thought all the tests of the 900 series on all the tech sites were showing that the new Nvidia cards were killing AMD’s 200 series.
Should I start thinking about conspiracy theories after these results, or maybe about a few specific “optimizations” in Nvidia’s drivers that don’t work on newer titles?
What’s helping AMD is that SoM uses a LOT of VRAM at higher settings; 4GB is recommended for High textures. That means the game starts being memory bandwidth bound and VRAM size bound rather than bound by shader performance, and that favors AMD cards.
Make no mistake, we’re going to start seeing cards with 8GB VRAM soon. Not this generation, but next gen? I wouldn’t be shocked.
So basically the game seems to be a good port, except for the damn Ultra VRAM requirements. I mean, the graphics are good, but jeez, 4GB on Ultra… that’s BF4 in 4K… a bit much. Clearly this game was developed on a GTX Titan. It seems that VRAM is the limiting factor nowadays, so if you bought a 780 Ti, a 500-quid GPU at the time, you may no longer be able to run maxed out any more… oh well, the joys of PC gaming.
🙁
That’s a matter of perspective. You could look at medium/high as what the developer intended the game to look like, and view Ultra as a bonus for those with excess resources.
Guess I’ll just have to *settle* for the up to 100fps on PC, versus locked 30fps on either console.
2160p on a single card is better than I expected. When I get through my backlog to this game, I may use DSR with it since it doesn’t seem to have any AA options.
Yes, I don’t agree with the 4GB video RAM requirement!!!
I think if you want to sell your game you would at least stick to mainstream system requirements (WoW, for example).
4GB VRAM on High. More on Ultra (6GB recommended). Less on Medium and Low, obviously. You set the settings based on your hardware.
OK, I had to benchmark my rig using the in-game utility.
I selected Ultra settings at 1440p with the HD content installed.
I averaged 64.29 FPS with two GTX 760 4GB cards in SLI, 16GB of RAM, on a 4770K.
SLI isn't working yet, so you are getting a single GPU result essentially.
SLI does work if you force it in the NVIDIA control panel, as noted above. Prior to doing this I was getting an average FPS in the 30s, which seems reasonable for a 760 at 1440p with all the eye candy on.
According to GPU-Z (don’t know how accurate it is), my GTX 980 is only using 3200MB of VRAM to run the benchmark with everything on Ultra, including textures, at 1440p. Of course that’s just the benchmark; I haven’t played the game since I’m still working through Watch Dogs.
BTW, Watch Dogs has also been in the news for VRAM usage, and according to GPU-Z it needs 2800MB for High textures and 3600-3800MB for Ultra at 1440p.
Feel bad for anyone who fell for the 970/980 hype. Two of the newest games out favor AMD (Sniper/Mordor). Taking the recent AMD price cuts and newer drivers into consideration, the 970/980 are simply overpriced now, with mediocre performance increases over the R9 series in a couple of LAST-GEN games to show for it.
I benchmarked my rather mediocre rig, a Phenom II 965 @ 4GHz, 8GB of RAM, and an overclocked 7850 2GB, and got 30 FPS average at 1440p on the Very High preset. GPU-Z reported 2082MB of dynamic VRAM in use, so there is a bit of PCI-E memory swapping going on; I expected more usage considering they suggest 3GB of VRAM, and I think that’s for 1080p.
I get 55 FPS average with the same settings, with the exception of the textures being on Medium and the motion blur being set to camera only.
4K is stutter heaven for me on my R9 290; someone else also mentioned the VRAM is not enough.
http://www.hardwarepal.com/shadow-mordor-benchmark/8/
The AMD cards have much better compute power than the NVIDIA ones. The more geometry and lighting you need, the more compute you need, and Shadow of Mordor is the latest offering, so it could use lots of lighting and geometry.
In one of the frame rate graphs it’s 35 for the 980 and about 40 for AMD, which is in line with 15% more compute power vs. 30% more bandwidth from AMD, so it looks like it’s more related to compute power than texture fetching.
I’m writing this very fast, but that’s about it.
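For anyone who wants to sanity-check those percentages, here is a quick back-of-the-envelope Python calculation from the published reference specs (our arithmetic, not the commenter’s; single-precision throughput is cores × 2 FLOPs × clock, and the compute gap narrows considerably if you use the 980’s boost clock instead of its base clock):

```python
# Rough check of the compute vs. bandwidth gap using reference specs.
def tflops_sp(cores, clock_mhz):
    # Single precision: 2 FLOPs (one fused multiply-add) per core per clock.
    return cores * 2 * clock_mhz / 1e6

gtx980 = tflops_sp(2048, 1126)  # ~4.6 TFLOPS at the 980's base clock
r290x  = tflops_sp(2816, 1000)  # ~5.6 TFLOPS for the 290X
print(f"290X compute advantage:   {r290x / gtx980 - 1:.0%}")  # ~22% at base clocks
print(f"290X bandwidth advantage: {320 / 224 - 1:.0%}")       # ~43% (320 vs 224 GB/s)
```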