Results – 1080p, Ultra
We'll start with the 1920×1080, Ultra settings preset results.
The combined score for the Fable Legends benchmark is simply the average frame rate of the whole test multiplied by 100; the larger number just makes writing and talking about comparisons a bit easier. The GTX 980 Ti comes in with our best score, 8089, a full 5.7% faster than the Fury X from AMD. The GTX 980, however, falls behind the R9 390X by nearly the same margin (5.2%), and the R9 380 is 7.3% faster than the GeForce GTX 960. NVIDIA once again takes the lead among the lowest cost GPUs tested today, with the GTX 950 running 10% faster than the R7 370.
You may notice that the average FPS scores in this graph do not perfectly match the scores above. Rather than simply taking the combined score above and dividing by 100, these averages are computed from the frame times in the CSV output files produced by the benchmark runs themselves. They are all really close though, and the results mirror what we showed you above.
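For reference, here is a minimal sketch of how averages like these can be derived from a per-frame CSV log, and how the combined score follows from them. The file name and column header below are assumptions for illustration, not the benchmark's documented output format.

```python
import csv

def average_fps(csv_path, column="frame_time_ms"):
    """Average FPS from a per-frame time log in milliseconds.

    The file name and column header are assumptions; the actual
    Fable Legends CSV output may label its columns differently.
    """
    with open(csv_path, newline="") as f:
        frame_times_ms = [float(row[column]) for row in csv.DictReader(f)]
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Combined score as described above: average frame rate multiplied by 100,
# e.g. roughly 80.9 FPS works out to a score in the neighborhood of 8089.
score = round(average_fps("gtx980ti_1080p_ultra.csv") * 100)
```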
Since we have access to the frame times, it seemed pertinent to provide the 95th percentile frame times (not frame rates) to see if anything changes in the GPU-to-GPU comparisons. The AMD Radeon R9 Fury X looks like it closed the gap a little bit here, indicating that the GTX 980 Ti has a bit more variance in its highest 5% of frame times. The GTX 960 / R9 380 comparison also leans a bit more in favor of the Radeon hardware when looking at frame times in this manner.
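For clarity on what that metric represents, below is a small sketch of a 95th percentile frame time calculation. The nearest-rank method shown here is an assumption, since the exact method used for the charts isn't specified.

```python
import math

def percentile_frame_time(frame_times_ms, pct=95):
    """Frame time (ms) at the given percentile, nearest-rank method."""
    ordered = sorted(frame_times_ms)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))  # 1-based rank
    return ordered[rank - 1]

# 95% of frames in this toy run completed in 24.3 ms or less
p95 = percentile_frame_time([16.2, 16.5, 17.1, 18.0, 24.3])
```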
There is a lot of data here, but at least a handful of results piqued my interest. First, the performance of the Global Illumination pass is consistently faster on the GeForce GTX GPUs than on the comparably positioned Radeons. The GTX 980 Ti is more than 2x faster than the Fury X in that particular step, for example, though AMD's Fiji GPU has the edge when we look at standard shadow mapping and direct lighting.
The Radeon R9 380 had an interesting goose egg (score of 0) in the "other" results, which might make that data a bit misleading; that's also the only instance where the AMD GPU is not faster than the NVIDIA GPU in dynamic lighting.
And now for something a little different. I have provided below a collection of four frame time comparison graphs (GTX 980 Ti vs. R9 Fury X, etc.) of the kind we like to include in these performance analysis articles. However, we are working on something brand new here at PC Perspective – dynamic and interactive graphs – and I thought today's story was as good a place as any to test it.
So, if you want the standard, static frame time graphs, simply scroll down to view the AMD vs. NVIDIA comparisons as you are used to seeing them. However, if you want to try out something new, click here:
Tips:
- Click names of GPUs in top legend to add/remove them from the graph
- Highlight a portion (or pinch/zoom) of the graph to zoom in
- Hold shift and click/drag (or touch drag) to pan in zoomed view
Play around with that and let me know what your thoughts are on the presentation in the comments below!
At 1080p, the top four performing cards show very consistent frame times with very little variance or peaking. As you move to lower performing hardware, though, you start to see instances of longer frame times and more variance from frame to frame in succession – a sort of back and forth oscillation. With average frame rates for the GTX 950 and R7 370 hovering in the 30-35 FPS range, we are clearly seeing the GPU workload of Fable Legends at 1080p/Ultra pushing these cards to their limits.
I see this game being more of a PR thing for Microsoft and DX12, with just a few token attempts at using DX12 features that are largely inconsequential. In essence it's just another UE4 game, and if Lionhead has put in any significant optimization effort, it's gone to the Xbox One version. The lack of a DX11 option should be enough to tell you that there are probably no major benefits, since the PC port is not bottlenecked by draw calls or utilizing things like asynchronous shaders. According to UE4 documentation, AS support was added by Lionhead Studios to the Xbox version, with no support for PC.
Still, it's a pretty looking game and probably representative of a lot of UE4 titles in the near future. A lot of games like these weren't really being bottlenecked by DX11 in the first place. I'm guessing most of the early DX12 titles will have these sorts of minor tweaks while the devs are just testing things out, and it'll take a year or two before we start seeing engines make real use of DX12, especially on PC.
Luddites!
Yep 100% agree
Async Compute is NOT utilized in the current UE4 engine, except for the Xbox One.
Ref: https://docs.unrealengine.com/latest/INT/Programming/Rendering/ShaderDevelopment/AsyncCompute/index.html
This makes the R9 290X look like a monster value if more DX12 titles turn out this way.
Less than half the price of a GTX 980, and faster at 1080p and 4K
(and better frame times)
Other sites did test the GTX 970, and it's not looking good, kids…
Many people really overpaid for much less capable HW than Hawaii.
(but at least it rocks in DX11 titles)
Ryan, I suppose the benchmark does not support SLI or CrossFire, or you would have tested them?
LOVE the interactive graphs
This is definitely a better benchmark for judging future DX12 performance. Fable Legends uses Unreal Engine 4. Hundreds of last-gen games were made with Unreal Engine 3: BioShock Infinite, Gears of War, Hawken, Borderlands, XCOM, and more. Probably hundreds more games will be made with Unreal Engine 4, which has DX12 support built in. The Ashes of the Singularity engine isn't likely to be widely used beyond a few games, so this seems to be a much better indicator of future DX12 performance in most games. Also, Unreal Engine 4 is very indie-friendly, so this should apply to more than just AAA games.
UE4 is FREE!
Free!*
*You pay a 5 percent royalty on gross revenue after the first $3,000 per product, per quarter.
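For anyone wondering how that adds up, here is a tiny worked example based only on the terms quoted above; Epic's actual license terms may differ or have changed since, so treat this as an illustration.

```python
def ue4_royalty(gross_revenue):
    """5% royalty on gross revenue above the first $3,000 per product, per quarter.

    Based on the terms quoted in the comment above; check Epic's current
    EULA for the authoritative rules.
    """
    return 0.05 * max(0.0, gross_revenue - 3000.0)

# A product grossing $10,000 in a quarter: 5% of $7,000 = $350 owed
print(ue4_royalty(10_000))  # 350.0
```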
“It turns out that the game will have a fall-back DX11 mode that will be enabled if the game detects a GPU incapable of running DX12.”
Give me DX11 VS DX12, now!!!
Any update on whether the Catalyst 15.9.1 driver actually improves the performance of the AMD cards?