COD: Advanced Warfare and Closing Thoughts
Just as with the single GTX 970 last week, the GTX 970 cards running in SLI did not extend past the 3.65GB usage mark. Meanwhile, the GTX 980s shot up to 3.9GB and even 4.0GB of usage without much work.
As users expected, once you add in a second card to each of these platforms the "Ultra" settings frame rates pull into the playable realm, hovering around the 50 FPS mark. But how do the two configurations compare in user experience?
Looking at just the average frame rates, estimated by the 50th percentile on the left side of this graph, the pair of GTX 980s in SLI is only 1-2 FPS faster than the two GTX 970s!
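For readers curious how a "50th percentile" frame rate relates to the raw capture data: the article doesn't publish its exact formula, but a minimal sketch, assuming a simple nearest-rank percentile over per-frame render times in milliseconds, looks like this (the sample frame times are hypothetical, not from the review's data):

```python
def percentile_fps(frame_times_ms, pct):
    """Return the FPS equivalent of the given frame-time percentile.

    Assumes a nearest-rank percentile over per-frame render times;
    the 50th percentile approximates the 'average' frame rate."""
    ordered = sorted(frame_times_ms)
    # index of the pct-th percentile frame time (truncating rank method)
    idx = min(len(ordered) - 1, int(pct / 100 * len(ordered)))
    # convert a frame time in ms to its instantaneous FPS equivalent
    return 1000.0 / ordered[idx]

# hypothetical capture: four steady ~20 ms frames and one 40 ms spike
times = [20.0, 21.0, 19.5, 40.0, 20.5]
print(round(percentile_fps(times, 50), 1))  # → 48.8
```

Note how the single 40 ms spike barely moves the 50th percentile; this is exactly why average-style numbers can hide the hitching described below.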
But there is a lot more going on than just the average frame rate, as this frame times graph demonstrates. The dark green and blue bars, representing the longer frame times of the "Ultra" preset, both show a lot of spikes and variance, though the GTX 970 SLI results show more of it – anywhere the dark green is not "hiding" the blue behind it.
The graph measuring frame variance actually doesn't show as much of a gap between the GTX 980 SLI and GTX 970 SLI configurations as I expected. I can assure you, though, that the gaming experience on the 970s was substantially worse than on the GTX 980 cards. There was noticeable hitching, and a smooth pan around a central location was nearly impossible on the lower cost cards. The GTX 980 SLI setup produced a much better overall result, more in line with the average frame rates being reported.
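The review doesn't spell out the variance formula behind that graph; a minimal sketch, assuming variance is measured as each frame's excess render time over a short running average (one common approach to quantifying hitching), might look like this:

```python
def frame_variance(frame_times_ms, window=20):
    """For each frame, how far its render time exceeds the running
    average of the previous `window` frames (0.0 if at or below it).

    A hypothetical metric: large values flag hitches that an overall
    average frame rate would smooth away."""
    out = []
    for i, t in enumerate(frame_times_ms):
        prev = frame_times_ms[max(0, i - window):i]
        if not prev:
            out.append(0.0)  # no history yet for the first frame
            continue
        avg = sum(prev) / len(prev)
        out.append(max(0.0, t - avg))
    return out

# three smooth 20 ms frames followed by a 40 ms hitch
print(frame_variance([20.0, 20.0, 20.0, 40.0], window=3))
# → [0.0, 0.0, 0.0, 20.0]
```

Under a metric like this, two captures with identical averages can still produce very different variance traces, which matches the gap between what the graphs showed and what the hands-on experience felt like.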
Compared to last week's article that looked at the GeForce GTX 970 in a single GPU configuration, the SLI results show more variance differences between NVIDIA's top two flagship products. In a vacuum, this wouldn't mean much – we expected there to be more complications and frame to frame variance when looking at multi-GPU configurations. That has always been the case regardless of how well tuned NVIDIA or AMD drivers get.
But under the light of the recent memory issues surrounding the GeForce GTX 970 we have to look at things a bit differently. Yes the GTX 970 has fewer shader cores and runs at slightly lower clocks than the GTX 980, but would we expect to see similar levels of frame rate consistency between it and the GTX 980 in these games, at these settings when running paired up in SLI? My initial inclination would be yes.
So the question sits before us: does the ROP count / L2 cache size difference that was revealed last week by NVIDIA account for the frame variance differences between the GTX 970 and the GTX 980 cards in SLI? More than likely: yes. And that may end the discussion for many of you but consider this last point. The largest difference in variance on BF4, the primary example of this showing up for me in data, was run at 3840×2160 and 150% pixel scaling. That is essentially running BF4 at 6K!
The Call of Duty: Advanced Warfare results occurred at 2560×1440, a much more reasonable resolution. In that case the data didn't show a gap in experience between the GTX 970 SLI and GTX 980 SLI configurations, but my time playing it certainly did. The GTX 970 SLI setup resulted in more hitches and frame rate drops than the GTX 980 cards, but again, we were pushing every setting to its absolute maximum, even going as far as enabling 2x supersampling.
My takeaway from today's testing is that users running, or considering, an SLI setup of GeForce GTX 970 cards appear more likely to run into cases where the split memory pools of 3.5GB and 0.5GB will matter. Because two GTX 970s reach higher playable frame rates than a single card, stretching image quality settings upward is more common. If you are that type of gamer, looking to push the boundaries of what settings are playable, then the differences between the GTX 970 and the GTX 980 deserve a warning.
For the others out there, the GeForce GTX 970 remains in the same performance window it occupied prior to this memory issue and specification revelation. For $329 it offers a tremendous level of performance and some amazing capabilities courtesy of the Maxwell GPU, and it runs incredibly efficiently at the same time.
AMD is aware of this debate as well and it should surprise no one to see the Radeon R9 290X sitting at $299 after rebates from several add-in card partners. Let the GPU wars continue!