SLI Testing
NVIDIA’s 337.50 driver has caused quite the stir. Is it worth it though?
Let's see if I can start this story without sounding too much like a broken record compared to the news post I wrote late last week on NVIDIA's new 337.50 driver. In March, while attending the Game Developers Conference to learn about the upcoming DirectX 12 API, I sat down with NVIDIA to talk about changes coming to its graphics driver that would affect current users of shipping DX9, DX10 and DX11 games.
As I wrote then:
What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11. When NVIDIA is questioned as to why it didn't create its own Mantle-like API if Microsoft was dragging its feet, it points to the vast improvements possible, and already made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated into a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.
NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or on very low-end hardware, similar to how Mantle works today.
In truth, this is something that both NVIDIA and AMD have likely been doing all along, but NVIDIA has renewed purpose thanks to the pressure that AMD's Mantle has placed on it, at least from a marketing and PR point of view. It turns out that the driver that starts to implement all of these efficiency changes is the recent 337.50 release, and on Friday I wrote up a short story that tested a particularly good example of the performance changes, Total War: Rome II, with a promise to follow up this week with additional hardware and games. (As it turns out, the results from Rome II are…an interesting story. More on that on the next page.)
Today I will be looking at a seemingly random collection of gaming titles, running on a reconfigured test bed we had in the office, in an attempt to get some idea of the overall robustness of the 337.50 driver and its advantages over the 335.23 release that came before it. Does NVIDIA have solid ground to stand on when it comes to the capabilities of current APIs over what AMD is offering today?
In an ideal world, we would have a dozen editors able to run an assortment of hardware and software tests on these two drivers (335.23 and 337.50) to give you 100% of the available data. I do not have those editors, so instead I spent my weekend gathering the data you see here. I have included graphics cards that range from the extreme enthusiast level (GTX 780 Ti) to the high end (GTX 770) and even the mainstream (GTX 750 Ti). Most tests were run on a Core i7-3960X Sandy Bridge-E platform, though I did run a handful on the Core i7-4770K + Z87 platform with the GTX 780 Ti cards in SLI as a sanity check.
| Test System Setup | |
| --- | --- |
| CPU | Intel Core i7-3960X Sandy Bridge-E, Intel Core i7-4770K Haswell |
| Motherboard | ASUS P9X79 Deluxe, ASUS Z87 Pro |
| Memory | Corsair Dominator DDR3-1600 16GB |
| Hard Drive | OCZ Agility 4 256GB SSD |
| Sound Card | On-board |
| Graphics Card | NVIDIA GeForce GTX 780 Ti 3GB, NVIDIA GeForce GTX 770 2GB, NVIDIA GeForce GTX 750 Ti 2GB |
| Graphics Drivers | NVIDIA 335.23 WHQL, 337.50 Beta |
| Power Supply | Corsair AX1200i |
| Operating System | Windows 8 Pro x64 |
The games used are also somewhat varied, as I attempted to find cases where the NVIDIA 337.50 driver excelled while also giving the other side of the story with games that see little to no benefit from the changes. Tested titles include Batman: Arkham Origins, Battlefield 4, Crysis 3, Hitman: Absolution, Bioshock Infinite, Borderlands 2, Sniper Elite V2, Call of Duty: Ghosts, Sleeping Dogs, Metro: Last Light and GRID 2. Not every game was tested with every hardware configuration, but we'll explain as we go.
Our first page of results is going to focus on multi-GPU performance between the two drivers, starting with the GeForce GTX 780 Ti on the Core i7-3960X + X79 platform.
With the high-end processor and graphics card combination, the biggest performance gains were seen with Hitman: Absolution and Batman: Arkham Origins.
Our Hitman testing scaled a healthy 30% at 1080p and Ultra settings, while at 2560×1440 we saw scaling of 36%! The Batman benchmarks actually showed a better gain at 2560×1440 than at 1920×1080, hitting 5.2% in the better result. Battlefield 4 and Crysis 3 saw less than a 2% frame rate increase.
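For reference, the scaling figures quoted throughout this article are just the relative frame rate change between the two drivers. Here is a minimal sketch, assuming the percentages are computed as (new - old) / old from the average FPS of each run; the example numbers are hypothetical, not measured results:

```python
# How the scaling percentages in this article can be reproduced, assuming
# "scaling" means the relative average-FPS change from 335.23 to 337.50.
def scaling_percent(fps_old: float, fps_new: float) -> float:
    """Percent frame rate change going from fps_old to fps_new."""
    return (fps_new - fps_old) / fps_old * 100.0

# Hypothetical numbers, purely for illustration (not measured results):
# a run that rises from 50.0 to 65.0 average FPS scales by 30%.
print(f"{scaling_percent(50.0, 65.0):.1f}%")  # -> 30.0%
```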
Next, we took the same GPU configuration and plugged it into a different platform, this time based on the Core i7-4770K. Call of Duty: Ghosts scores increased by a significant amount, hitting 21% at 1920×1080 on the Max preset and with 4xMSAA enabled.
Bioshock Infinite at 2560×1440 improved by more than 6% with the 337.50 driver, and Sniper Elite V2 improved by 4.39% at 1920×1080 on its highest quality settings. Hitman: Absolution again sees an impressive increase in frame rate: a jump of 35% at 1920×1080 on the Ultra preset.
Even Thief, a game that AMD has been using to help promote Mantle, sees a sizeable performance gain in SLI with a pair of GTX 780 Ti cards at 1920×1080 on the Very High preset. A scaling rate of 31% is close to the performance gains we have seen with AMD's Mantle API on its Radeon graphics cards, but NVIDIA was able to do this inside the existing DX11 API.
Our third set of SLI results moves back to the Core i7-3960X platform but replaces the GeForce GTX 780 Ti cards with GeForce GTX 770 models.
These cards retail for close to $330, compared to the $699 of the flagship offerings, which should give us a better look at configurations that more of our readers likely have in their PCs.
This time the biggest scaling is seen with Sleeping Dogs, an older DX11 title that still impresses visually. Improvements of over 22% at 1920×1080 and 14% at 2560×1440 are definitely worthwhile performance advantages for a single driver release and indicate there is some magic under the hood of 337.50. Batman: Arkham Origins scales by about 3.5% at both 1920×1080 and 2560×1440; not nearly as big an improvement, but it is something.
On the next page we'll take a look at a couple of single GPU configurations and compare 337.50 and 335.23 driver results.
I don't think Nvidia's explanation is correct, or that the new driver really reduces CPU load.
Other sites have run tests directly targeting Nvidia's claims (that the new driver reduces draw calls, or at least the load they create) by testing mainly in CPU-bottlenecked scenarios. One example tested modern games at 1280×720 with a Core 2 Quad and a GTX 780 Ti, and the results were interesting: framerates increased by less than 3%, and in some cases they even decreased.
Using the same testing conditions but an i7-3770K instead of the Core 2 Quad, the fps increase became more noticeable, but it was still nowhere near Nvidia's claims, and most games still gained no frames from the new driver.
Seems like their tweaks work best when there is a moderate CPU bottleneck. If you're using an ancient CPU and are severely bottlenecked, even zombie baby Jesus himself can't magically turn that into a $1k Core i7.
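To make that intuition concrete, here is a toy model (entirely illustrative numbers of my own, not benchmark data) that treats frame time as the slower of the CPU and GPU paths, which is roughly why a fixed CPU-side saving matters most under a moderate CPU bottleneck:

```python
# Toy model of why a CPU-side driver saving matters most under a moderate
# CPU bottleneck: each frame takes roughly max(cpu_ms, gpu_ms) to produce.
# All numbers here are illustrative assumptions, not measurements.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when a frame takes max(cpu_ms, gpu_ms) milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

DRIVER_SAVING_MS = 2.0  # assumed CPU-side overhead removed by the new driver

scenarios = [
    ("GPU-bound (fast CPU)",     8.0, 16.0),  # saving is invisible
    ("moderate CPU bottleneck", 18.0, 16.0),  # saving shows up directly
    ("severe CPU bottleneck",   40.0, 16.0),  # saving barely moves the needle
]

for label, cpu_ms, gpu_ms in scenarios:
    before = fps(cpu_ms, gpu_ms)
    after = fps(cpu_ms - DRIVER_SAVING_MS, gpu_ms)
    print(f"{label}: {before:.1f} -> {after:.1f} fps "
          f"(+{(after - before) / before * 100:.1f}%)")
```

In this model the severe case gains the least in relative terms, which lines up with the Core 2 Quad results described in the comment above.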
Greetings Ryan,
On Batman: AO (19×10 – Max) with two GTX 780 Ti cards in SLI, swapping the 4770K for a 3960X produces 72 more fps!?!?
Are you sure about this?…
So this is just a perfectly normal driver update, doing what perfectly normal driver updates do (improving a few games' hotspots here and there), plus a slightly higher temp to get a slight 1-2% across the board, because hell, you never know.
All this surrounded by bullshit slides and open lies to draw attention away from what is actually a real thing.
All this with the help of websites blinded by their "nVidia can't lie to us because they're such great ppl" mantra.
Well, they lied, and disrespected ppl even more than with the GTX 680, 780 and Titan rip-offs / jokes.
Again,
Update the https://pcper.com/news/Graphics-Cards/NVIDIA-GeForce-Driver-33750-Early-Results-are-Impressive article with the newfound info, Ryan. Just add a line and a link for this article.
I added it this morning!
I've done a simple Far Cry 2 benchmark on my PC – Core 2 Duo E4300, GF 8600GT, 4GB DDR2, Win7 SP1 64-bit.
Settings: Demo(Ranch Small), 1280×800 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(None), VSync(No), Overall Quality(Custom), Vegetation(High), Shading(High), Terrain(High), Geometry(High), Post FX(High), Texture(Medium), Shadow(High), Ambient(High), Hdr(Yes), Bloom(No), Fire(Medium), Physics(Medium), RealTrees(Medium)
Results:
Average Framerate:
335.23 – 28.93
337.50 beta – 29.24
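Plugging those two averages into the percent-change formula used earlier shows how small the gain is on this machine; a quick sanity check, nothing more:

```python
# Quick check on the Far Cry 2 numbers above: the 337.50 beta gains only
# about 1% over 335.23 on this system.
fps_335_23 = 28.93
fps_337_50 = 29.24
gain = (fps_337_50 - fps_335_23) / fps_335_23 * 100.0
print(f"Gain: {gain:.2f}%")  # -> Gain: 1.07%
```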
Reading forums like this makes me wonder if some people just need to get a life.
PC Per already explained the mistake about the DX11 improvements versus the SLI profile in Total War: Rome II: they retracted their previous story and posted an update explaining the circumstances. This is what I would expect from a professional journalist. Even large news organizations occasionally make a mistake, and they have MUCH larger research staffs than any hardware news site will ever be able to afford.
And yet you see crap like this… “still sad though for pcper sale out to nvidia’s propaganda”.
Seriously?
How on earth does anyone reach this conclusion from this situation? And yet I see this kind of outrage about tiny, basically irrelevant details all the time.
Keep up the good work PC Per, despite these people who apparently need to find new hobbies.