COD: Advanced Warfare and Closing Thoughts

Our results from Call of Duty: Advanced Warfare are interesting in a different way. We used the opening level of the game for our testing, and I found that just about any reasonable combination of settings would push memory usage above 3.5GB, even on the GTX 970. However, the GTX 980 consistently used 300-400MB more than the GTX 970, even at the same settings. Take a look:

We are running this game at 2560×1440, but keep in mind that Advanced Warfare has no built-in "Very High" or "Ultra" presets; those labels refer to our own collections of settings. Let's explain what we see. Our "Very High" settings set texture resolution, normal map resolution and specular map resolution to High, with ambient occlusion and post-process anti-aliasing both turned off. Even with those settings, the GTX 970 is pushing 3.6GB of graphics memory consumption, crossing the 3.5GB barrier and dipping into the 0.5GB segment. The problem is that the GTX 980 is using the full 3.97GB of its memory for the same combination of settings.

At our "Ultra" settings I was SURE we would get the GTX 970 into the 3.9GB range: everything is set to its maximum, anti-aliasing is moved to 2x MSAA, and supersampling is enabled at 2x and 4x (we tried both). But the GTX 970 still reported using only 3.6GB of memory, while the GTX 980 actually stretched just beyond the 4.0GB level.

What does this mean? I'm still not sure, but it would seem that in this case the GeForce GTX 970 was having difficulty going beyond the 3.5GB limit. After we talked with NVIDIA, technical PR said they weren't able to reproduce the behavior, but we ran our tests 4-5 times to double check that this was indeed the consumption being reported. The GTX 980 had no problem filling its 4GB of memory – the GTX 970 did.
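For readers who want to watch this kind of memory behavior on their own hardware: we won't detail our exact capture tooling here, but the per-GPU memory usage that NVIDIA's nvidia-smi utility reports can be sampled periodically and logged. The sketch below shows one illustrative way to do that; the function name and CSV format are our own, not part of any tool referenced in this article.

```python
import csv
import subprocess
import time

# Sample the dedicated GPU memory usage nvidia-smi reports, once per
# interval, and write it to a CSV. The query fields used here are part
# of nvidia-smi's documented --query-gpu interface.
def log_gpu_memory(duration_s=60, interval_s=1.0, out_path="vram_log.csv"):
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "memory_used_mib"])
        start = time.time()
        while time.time() - start < duration_s:
            result = subprocess.run(
                ["nvidia-smi",
                 "--query-gpu=memory.used",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True)
            # nvidia-smi prints one line per GPU; log the first card's
            # usage in MiB.
            used_mib = int(result.stdout.strip().splitlines()[0])
            writer.writerow([round(time.time() - start, 1), used_mib])
            time.sleep(interval_s)

if __name__ == "__main__":
    log_gpu_memory()
```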

Our two quality settings produce dramatically different performance, with the "Very High" settings never dropping below 100 FPS and the "Ultra" settings hovering around the 20 FPS mark.

Looking at our graph of frame rate by percentile, you'll see that the performance deltas between the two settings and between the two cards remain consistent across the percentile range. With an average of 155 FPS on the GTX 980 and 140 FPS on the GTX 970 at our "Very High" settings, we see a 10-11% performance advantage for the NVIDIA flagship. Under "Ultra" the averages sit at 22 FPS and 19.5 FPS respectively, a 12% difference.
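For anyone who wants to run the same arithmetic against their own captures, the percentage advantage is just the ratio of average frame rates. Here's a minimal sketch; the frame times below are made-up values chosen to land near our measured averages, not our raw data.

```python
# Average FPS from per-frame render times (in milliseconds), and the
# percentage advantage of one card over another. Sample frame times
# are illustrative only.
def average_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

gtx980_ms = [6.4, 6.5, 6.3, 6.6]   # averages to ~155 FPS
gtx970_ms = [7.1, 7.2, 7.1, 7.2]   # averages to ~140 FPS

fps_980 = average_fps(gtx980_ms)
fps_970 = average_fps(gtx970_ms)
advantage_pct = (fps_980 / fps_970 - 1.0) * 100.0
print(f"GTX 980: {fps_980:.0f} FPS, GTX 970: {fps_970:.0f} FPS, "
      f"advantage: {advantage_pct:.1f}%")  # ~10.9%
```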

The frame time graph is the inverse of the FPS graphs above – the lower two lines represent the HIGH frame rate results of the "Very High" settings and the upper two lines represent the "Ultra" settings. Those top two results are the most important – you can see that there are indeed some additional spikes in the frame times of Advanced Warfare when running on the GTX 970 at these very intense image quality settings. Those spikes are nearly non-existent on the GTX 980, the card that was using 300-400MB more memory in that scenario.

That data is represented in our frame variance results as well – the GTX 970 at "Ultra" settings is pushing variances of 5ms or higher for the last 10% of frames. The GTX 980 runs at less than half of that – 2.5ms or so.
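Our variance numbers come out of our Frame Rating capture pipeline, which we won't reproduce here. But as a rough sketch of the idea, if you treat "variance" as the frame-to-frame swing in frame time, the worst tail of that distribution can be pulled out like this; the sample data and the 10% cutoff are illustrative, and this only approximates our actual methodology.

```python
# Rough sketch of a frame-variance-style metric: take the absolute
# frame-to-frame change in frame time, sort it, and inspect the worst
# tail. This approximates, but does not replicate, our Frame Rating
# variance calculation.
def variance_tail(frame_times_ms, tail_fraction=0.10):
    deltas = sorted(abs(b - a)
                    for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    cutoff = int(len(deltas) * (1.0 - tail_fraction))
    return deltas[cutoff:]  # the worst tail_fraction of swings

# Illustrative data: a mostly steady ~50ms cadence (about 20 FPS)
# with a few spikes, like the GTX 970 "Ultra" runs.
sample = [50.0, 51.0, 50.5, 58.0, 50.0, 49.5, 56.5, 50.0, 50.5, 51.0]
print(variance_tail(sample))  # prints the largest frame-to-frame swing(s)
```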

Closing Thoughts

I spent nearly the entirety of two days testing the GeForce GTX 970 and trying to replicate some of the consumer complaints centered around the memory issue we have discussed all week. I would say my results are more open-ended than I expected. In both BF4 and CoD: Advanced Warfare I was able to find performance settings that indicated the GTX 970 was more apt to stutter than the GTX 980. In both cases, though, the in-game settings were exceptionally high, pushing frame rates into the sub-25 FPS range, and those just aren't realistic. A PC gamer isn't going to run at those frame rates on purpose, and thus I can't quite convince myself to get upset about it.

Interestingly, Battlefield 4 and Call of Duty reacted very differently in my quest to utilize more memory on the GTX 970. BF4 had no issue scaling up memory usage on both cards, beyond the 3.5GB pool and into the secondary 0.5GB pool. Advanced Warfare…not so much. Despite all my attempts at changing settings and selecting different combinations, using more than 3.6GB on the GTX 970 seemed impossible, while the GTX 980 ramped up to 4.0GB without issue. It is possible, and seemingly likely, that SOME combination of settings would do it, but based on the heuristics of Windows and the NVIDIA driver, that secondary pool of memory was being used only minimally. And though performance didn't seem to be adversely affected at the lower image quality settings, it did suffer at the more intense ones.

GM204 won't look the same for us anymore…

I know that several readers have asked about SLI testing with the GTX 970 – that is a fair question and something I would like to do. The problem is that it introduces another set of variables to the equation: we already know that SLI and CrossFire can introduce more stutter and variance in gaming at high resolutions. Determining which factor, the 3.5GB memory pool or multi-GPU complications, is causing a particular stutter or spike in variance will be MUCH more difficult. Multi-GPU results already tend to vary more from run to run, making that direction much tougher. Not impossible, just expect a much larger time investment to come to any kind of reasonably accurate conclusion.

And speaking of conclusions…what am I prepared to declare after our testing? First, it appears that the division of memory on the GeForce GTX 970 is definitely causing some games to behave differently than they do with the full 4GB of the GeForce GTX 980. Call of Duty: Advanced Warfare is evidence enough of that, and I have no doubt that other games exhibit the same behavior. Secondly, while I am not confident enough in my results to state that the memory division is the cause of the higher variance seen on the GTX 970 at extremely high image quality settings, I think the results in both BF4 and COD leave the door open for it as a probable cause. Essentially, while I don't believe the performance results and experiences I saw are proof that this is occurring, I cannot disprove it either.

Benchmark result courtesy HardwareCanucks

That may not be the answer anyone wants – consumers, gamers, NVIDIA, etc. – but it actually aligns with where I thought this whole process would land. Others in the media that I know and trust, including HardwareCanucks and Guru3D, have shown similar benchmarks and come to similar conclusions. Is it possible that the 3.5GB/0.5GB memory pools are causing issues with games today at very specific settings and resolutions? Yes. Is it possible that they might do so for more games in the future? Yes. Do I think it is likely that most gamers will come across those cases? I honestly do not.

If you are an owner of a GTX 970, I totally understand the feelings of betrayal, but realistically I don't see many people with access to a wide array of GPU hardware changing their opinion of the product itself. NVIDIA definitely screwed up with the release of the GeForce GTX 970 – good communication is important for any relationship, including the one between producer and consumer. However, they just might have built a product that can withstand a PR disaster like this.
