UPDATE: We have put together a much more comprehensive story based on the NVIDIA 337.50 driver that includes more cards and more games while also disputing the Total War: Rome II results seen here. Be sure to read it!!
When I spoke with NVIDIA after the announcement of DirectX 12 at GDC this past March, a lot of the discussion centered around a pending driver release that promised impressive performance advances with current DX11 hardware and DX11 games.
What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11. When NVIDIA is questioned as to why they didn't create their own Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible, and already made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated into a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.
NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or on very low end hardware, similar to how Mantle works today.
Lofty goals to be sure. This driver was released last week and I immediately wanted to test and verify many of these claims. However, a certain other graphics project kept me occupied most of the week and then a short jaunt to Dallas kept me from the task until yesterday.
To be clear, I am planning to look at several more games and card configurations next week, but I thought it was worth sharing our first set of results. The test bed in use is the same as our standard GPU reviews.
| Test System Setup | |
| --- | --- |
| CPU | Intel Core i7-3960X Sandy Bridge-E |
| Motherboard | ASUS P9X79 Deluxe |
| Memory | Corsair Dominator DDR3-1600 16GB |
| Hard Drive | OCZ Agility 4 256GB SSD |
| Graphics Cards | NVIDIA GeForce GTX 780 Ti 3GB<br>NVIDIA GeForce GTX 770 2GB |
| Graphics Drivers | NVIDIA: 335.23 WHQL, 337.50 Beta |
| Power Supply | Corsair AX1200i |
| Operating System | Windows 8 Pro x64 |
The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there.
First up, let's take a look at the SLI results for the GTX 780 Ti, NVIDIA's flagship gaming card.
With this title, running at the Extreme preset, the average frame rate jumps from 59 FPS to 88 FPS, an increase of 48%! Frame rate variance does increase a bit with the faster average frame rate, but it stays within the limits of smoothness, if only barely.
Next up, the GeForce GTX 770 SLI results.
Results here are even more impressive: the pair of GeForce GTX 770 cards running in SLI jumps from an average of 29.5 FPS to 51 FPS, an increase of 72%! Even better, this occurs without any increase in frame rate variance; in fact, the blue line of the 337.50 driver actually performs better in that respect.
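For readers checking the math, the percentage figures here are simple relative increases in average frame rate. A minimal sketch using the rounded averages from the charts (small differences from the quoted numbers come from rounding in the reported averages):

```python
def pct_gain(before_fps: float, after_fps: float) -> float:
    """Relative increase in average frame rate, as a percentage."""
    return (after_fps / before_fps - 1.0) * 100.0

# Averages reported above for Total War: Rome II (Extreme preset).
gtx780ti_sli = pct_gain(59.0, 88.0)   # roughly 49%; the quoted 48% reflects rounded averages
gtx770_sli = pct_gain(29.5, 51.0)     # roughly 73%, in line with the quoted 72%
print(f"GTX 780 Ti SLI: +{gtx780ti_sli:.0f}%")
print(f"GTX 770 SLI:    +{gtx770_sli:.0f}%")
```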
All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not – this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push.
Of course, not all games are going to see performance improvements like this, or even improvements that are measurable at all. Just as we have seen with other driver enhancements over the years, different hardware configurations, image quality settings and even scenes used to test each game will shift the deltas considerably. I can tell you already that based on some results I have (but am holding for my story next week) performance improvements in other games are ranging from <5% up to 35%+. While those aren't reaching the 72% level we saw in Total War: Rome II above, these kinds of experience changes with driver updates are impressive to see.
Even though we are likely looking at the "best case" for NVIDIA's 337.50 driver changes with the Rome II results here, clearly there is merit behind what the company is pushing. We'll have more results next week!
It cracks me up when a former AMD fan site (do you know what the original name of PCPer was?) is called an Nvidia fan site.
Guys get a grip and wait for the full review.
StarSwarm looks good on the 780 Ti also, isn't this a Mantle game? 🙂 Note it's ONE card, and look at the change; same with the others. The 780 Ti went from 55 to 70 fps, which allows it to beat MANTLE as shown in the graph. Kind of defeats all the whining over SLI crap here. Gaining close to 30% from a driver in a game AMD should dominate, and coming out on top, is a pretty clear victory for DX11 IMHO. Note you can see Mantle's effects too, as AMD's regular driver sucks as shown. Mantle gave it a HUGE boost (from 32 fps to 57 fps). But apparently that won't be enough to stop NV's new drivers.
I’d rather have better drivers affecting all cards and all games than one api for a few cards from one vendor who happens to have 1/3 of the market vs the other guy who has 2/3. The quicker that API dies the better. It is merely taking resources away from what they SHOULD have been spending on.
Those are single GPU scores, and all are better than you can get from raising clocks from 1006 to 1019. Having said that, I'm not really impressed they can get this, when it's because of AMD basically taking a year and a half to catch them that they held back the FULL GK110, with no driver improvements for an entire year until AMD released the NEVER SETTLE drivers in Nov 2012. As hardocp showed when reviewing the drivers (around 3/2013, I think), NV hadn't done ANYTHING until Never Settle came. Why would they? Wait for competition to catch you before revealing your answer (even if you've had that answer the entire time, I'd keep it until you caught me). This is good business, and how you maximize profits (which is a business goal, correct?).
CPU with 12 threads.
Faster and much more expensive GPU used anyway.
Aggressive driver optimizations specifically for that title.
A little (too much) marketing in it.
Successful with loyal fanboys. They think that they are already running DX12.
You are the fanatic and loyal fanboy (a rabid AMD fanboy, of course), making his day with this shit from your mouth (speaking about things that you don't understand or see for yourself).
You don't know anything about this driver; the better performance is in ALL DX11 games (great or little improvements, depending on how much each game is limited by the DX11 API's CPU usage). These aren't title-specific optimizations:
It's a very obvious set of optimizations to the CPU usage of the driver, because it does its best work with the weaker CPU configurations, and it's a general optimization because it spreads across many DX11 games that aren't in the 337.50 driver changelog.
So the addition of an SLI profile is now being sold as DX11 overhead reduction?
Wow, just wow!
I got a GTX 750 Ti and I updated to the 337.50 Beta at the same time. I keep getting total computer crashes while gaming, a couple of times here and there. Probably because it's beta; we'll see. I am VERY pleased with the performance I do get, though. I get a very smooth 38-40 fps in Skyrim with ENB and no DOF. Great performance in all my games.
This driver wouldn't really have anything to do with that. Skyrim is a DX9 game and the 337.50 driver "improves" DX11. Unless there are undocumented improvements for the Maxwell architecture, both drivers would perform almost identically.