The GTX 970 memory issue led our readers to request testing of the cards paired up in SLI. Here you go.
At the end of my first Frame Rating evaluation of the GTX 970 after the discovery of the memory architecture issue, I proposed the idea that SLI testing would need to be done to come to a more concrete conclusion on the entire debate. It seems that our readers and the community at large agreed with us in this instance, repeatedly asking for those results in the comments of the story. After spending the better part of a full day running and re-running SLI results on pairs of GeForce GTX 970 and GTX 980 cards, we have the answers you're looking for.
Today's story is going to be short on details and long on data, so if you want the full back story on what is going on and why we are taking a specific look at the GTX 970 in this capacity, read here:
- Part 1: NVIDIA issues initial statement
- Part 2: Full GTX 970 memory architecture disclosed
- Part 3: Frame Rating: GTX 970 vs GTX 980
- Part 4: Frame Rating: GTX 970 SLI vs GTX 980 SLI (what you are reading now)
Okay, are we good now? Let's dive into the first set of results in Battlefield 4.
Battlefield 4 Results
Just as I did with the first GTX 970 performance testing article, I tested Battlefield 4 at 3840×2160 (4K) and utilized the game's ability to linearly scale resolution to help me increase GPU memory allocation. In the game settings you can change that scaling option by a percentage: I went from 110% to 150% in 10% increments, increasing the load on the GPU with each step.
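As a rough illustration (our own sketch, not PCPer's numbers), the resolution scale slider is applied per axis, so the render target area grows with the square of it; the memory figure assumes a plain 32-bit RGBA colour target and ignores depth, MSAA, and other buffers:

```python
# Sketch: effective render resolution at each scaling step used in testing.
# Assumes BF4's resolution scale multiplies each axis linearly.
BASE_W, BASE_H = 3840, 2160  # 4K base resolution
BYTES_PER_PIXEL = 4          # assumption: a single 32-bit RGBA colour target

for pct in range(110, 151, 10):
    scale = pct / 100
    w, h = round(BASE_W * scale), round(BASE_H * scale)
    mb = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{pct}%: {w}x{h} ({w * h / 1e6:.1f} MPix, ~{mb:.0f} MB per colour target)")
```

At 150% the game is actually rendering 5760×3240, two and a quarter times the pixels of native 4K, which is why memory allocation climbs so quickly with each step.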
Memory allocation between the two SLI configurations was similar, but not as perfectly aligned with each other as we saw with our single GPU testing.
In a couple of cases, at 120% and 130% scaling, the GTX 970 cards in SLI are actually each using more memory than the GTX 980 cards. That difference is only ~100MB but that delta was not present at all in the single GPU testing.
Our performance data is broken up into two sets: the GTX 980s in SLI running at all five of our scaling settings and, separately, the GTX 970s in SLI running at the same five scaling settings. Plotting 10 sets of data on a single graph proved to be a bit too crowded, so we'll show the graphs successively to help you compare them more easily.
Unlike our first sets of results, the SLI numbers are ALMOST in a playable state, making them much more real-world than before. The first thing I noticed when compiling this data was that the GTX 980 cards in SLI actually had a couple more downward spikes in frame rate at 150% scaling than the GTX 970s did. I confirmed this was a regular pattern by re-running tests on both sets of hardware about six times, and the bright green line you see in the first graph above is actually one of the better results for the 980s.
It appears though that moving from 110% to 150% scaling results in the expected frame rate decreases in both configurations.
Average frame rates are where we expect them to be: the GTX 980 SLI is faster than the GTX 970 SLI by fairly regular margins.
| | GTX 980 | GTX 970 | % Difference |
|---|---|---|---|
| 1.10x Scaling | 47.3 FPS | 41.0 FPS | -15% |
| 1.20x Scaling | 41.1 FPS | 35.8 FPS | -15% |
| 1.30x Scaling | 35.4 FPS | 31.2 FPS | -13% |
| 1.40x Scaling | 31.0 FPS | 27.7 FPS | -12% |
| 1.50x Scaling | 27.7 FPS | 24.6 FPS | -13% |
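As a sanity check on the table, the % Difference column works out to the GTX 970 pair's deficit measured against its own average; a quick sketch reproduces it from the averages above:

```python
# Reproduce the % Difference column: (970 - 980) / 970, i.e. how much slower
# the GTX 970 SLI average is, expressed against its own result.
results = {  # scaling factor: (GTX 980 SLI avg FPS, GTX 970 SLI avg FPS)
    1.10: (47.3, 41.0),
    1.20: (41.1, 35.8),
    1.30: (35.4, 31.2),
    1.40: (31.0, 27.7),
    1.50: (27.7, 24.6),
}
for scale, (fps980, fps970) in results.items():
    diff = (fps970 - fps980) / fps970 * 100
    print(f"{scale:.2f}x scaling: {diff:.0f}%")
```

Rounded to whole percentages this gives -15, -15, -13, -12, -13, matching the table.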
The percentage differences here are actually more consistent than the single GPU results, which is a bit of a surprise to us. The GTX 970s in SLI range from 12% to 15% slower than the GTX 980s in SLI, but as we know from our years of GPU evaluation, that isn't the whole story.
Click to Enlarge
It should be painfully obvious that at 150% scaling the GTX 970s in SLI exhibit a significant amount of frame time variance that is largely absent with the GTX 980s in SLI. Even at 140% scaling, looking at the thinner gray line, you can see differences in the behavior of the frame time lines in the two graphs (you can click to enlarge them for a closer view).
Before you start eyeballing these graphs, make sure you take note of the slightly different y-axis on the left-hand side of each - that's important. Focusing on the highest 10% of frame times in our testing, the GTX 970s clearly exhibit more issues than the GTX 980s. The flagship cards only see about 5ms of variance at the 90th percentile and cross the 20ms mark somewhere around the 96-97th percentile. The GTX 970s in SLI, though, reach those marks much sooner - at the 90th percentile we see as much as 18ms of frame variance, and at the 96th percentile that reaches as high as ~40ms.
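To make the percentile language concrete, here is a rough sketch of how frame-time variance at a given percentile can be computed from a frame-time log. The data below is synthetic (not our captures), and the nearest-rank percentile function is our own simplification, not the exact FCAT tooling:

```python
# Sketch only: percentile analysis of a frame-time log, in the spirit of the
# FCAT-derived plots above. frame_times holds per-frame render times in ms.
import random

random.seed(0)
# Synthetic trace: mostly ~35 ms frames with a tail of long-frame spikes.
frame_times = [random.gauss(35, 2) for _ in range(1000)]
frame_times += [random.gauss(70, 10) for _ in range(50)]

def percentile(values, pct):
    """Nearest-rank percentile: the value below which pct% of frames fall."""
    ordered = sorted(values)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# "Variance" in the sense used above: how far the worst frames sit above the
# typical (median) frame time at a given percentile.
median = percentile(frame_times, 50)
for pct in (90, 96, 99):
    delta = percentile(frame_times, pct) - median
    print(f"{pct}th percentile: {delta:.1f} ms over the median")
```

A card with smooth delivery shows small deltas even deep into the tail; a stuttering configuration like the 150%-scaling 970 SLI result shows the deltas blowing up well before the 99th percentile.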
Really nice articles Ryan (all 4 parts), thanks.
I think that it could be really interesting to test the GTX 970 without(!) the 0.5GB RAM portion: to test the effective contribution of this "near line" cache between the video memory and the system (PCIe) memory, and to verify whether the constant struggle to keep memory under the 3.5GB limit and the heuristic shuffling of data between the two segments have an impact on frame rate consistency.
Just a thought.
I find it hilarious that people think this was some grand conspiracy on nvidia's part. Not that the advertising dept didn't screw the pooch, but given how the hardware and bridges are set up, it is possible for it to use that extra .5GB everyone is throwing a tantrum over. Knowing nvidia, they will push out a driver update that better manages that .5GB.
Okay, it doesn't like using the extra .5GB; it might 'feel' that using less and keeping the higher bus speed returns a better result. They've even stated that the card will attempt to avoid the .5GB sector.
The other thing I have seen is the 'oh AMD has 8GB' argument. Well, why do they need so much memory? Even running a title like Skyrim all textured out, you're only pushing around 4GB. So that makes me think they actually have problems handling frame buffers. Having an extra 4GB of memory isn't going to radically increase the power needs of a card; that's just poor engineering.
And for the other AMD argument, that their cards not only come with twice the memory but cost less: well, they cost less to purchase but cost you significantly more at the outlet and in cooling your apartment or house.
For those who are going to accuse me of just being team green, I have used ATI cards as recently as HD7850 which was a XFX Black edition and another one made by Gigabyte which might be a HD4850.
There are cases already where more than 4GB is beneficial: http://pclab.pl/zdjecia/artykuly/chaostheory/2015/01/970_bug/charts/mearth_2560_1_ms_b.png
It’s not poor engineering. It is marketing and an attempt to move an older GPU before new products are available. There is a benefit to 8GB in current games, but only at 8K resolution. Even at 4K the extra RAM is not necessary.
Advanced warfare with my dual 970’s on my 1440p is a laggy stuttery mess and this is the reason?
The tests show that it isn't an issue. There could be driver issues. What CPU are you using? Your CPU needs to send 2X the commands, and high numbers of drawcalls can kill you. Testing is often done on very high end rigs, so issues on a more normal gamer setup are missed.
I’d run DPC latency checker in windowed mode or run LatencyMon when it is happening. See if you have any latency spikes in windows. This will cause the game to stutter.
I think way too much time is being absorbed into this subject. Any new articles outside of the ring being worked on? I’m bored.
There is a reason so much time needs to be poured into this, ignorant fools. There are people out there who bought 2 GTX 970s, including me, and ever since have issues with games that use a lot of VRAM: 3.5GB or more at 2560x1440p with max settings and some AA.
Shadow of Mordor, Far Cry 4, Lords of the Fallen, Dying Light to name a few!!!
I'm forced to run a single GTX 970 to be able to play them smoothly FFS.
For a whole month I thought it was the games or SLI profiles that were shit, tweaking the game configs and using tools made for those specific games to make them run smoother, SLI bits tweaks, NOTHING helped. Then this shit came up that the memory was a split partition with a slower portion! Yeah, that explained a shit ton, and now tests have emerged proving this, not only on this site but on other European sites (Nordic ones, German ones). We in Europe have far better tech sites digging deeper into this shit and had all this already last week for GTX 970 SLI configs and massive frametime variances.
Thanks … That’s what I wanted …
So, THERE is a REAL problem in SLI when the game requests more than 3.5GB!
So … Nvidia REALLY should start the refund process, because the card IS NOT EVEN ABLE to perform what it states it can … SLI.
Since you can play with a single one, it means that with two, we have a problem.
Sad … I was looking to grab another one … Now I won't … I can't afford a 980 …
Nowhere is there any mention of ‘not able to perform SLI’. I think you should put things more into perspective. If anything, the issue is much less significant than many are portraying it to be.
No disrespect PCPER. I still like what you do on here and like the streams and podcasts. Keep up the good work!
I have the same doubt as another user stated.
We all know that the GTX 970 has a second memory module, but what we DON'T know, and what NO ONE has stated for us till now, is:
How much "performance" do we lose when the card uses this second module?
That's the point no one is seeing.
We all know that the 512MB memory allocation module is slower than the first one, but how much "FPS" do we lose because of it???
THAT'S the concern we should worry about (just talking about card performance, not the nVidia lies).
So, DO we or DON'T we get stutter when the card uses more than 3.5GB of VRAM in SLI?
In SLI at 3.5GB and over you get a bit of stutter and hitches, and it doesn't feel smooth, from my experiences with those games mentioned above at 2560x1440p maxed out. Framerates aren't a concern for me, and most I believe, but the overall experience of smoothness isn't there!
I saw it immediately going back to one GTX 970, and made sure not to get close to 3.5GB VRAM usage in those games, and the experience is smooth all around.
970 sli @ 3440×1440 far cry 4 remains unplayable. Even with fps at 60+ most of the time, the stutter is unbearable.
970 sli @ 2560×1440 with gsync advanced warfare stutters but is playable.
The above mentioned games are installed on an 840 evo lol. Coincidence??
I have no problems with Shadow of Mordor however, and Dying Light is fine without Nvidia DoF on. Both maxed at 3440×1440.
Search “840 evo stutter”.
So, the GTX 970 SLi performs worse than GTX 980 SLi and stutters more. Brilliant conclusion, Ryan.
What you failed to do was give the GTX 970 SLi owners (and the people who already have one GTX 970 and are considering getting a 2nd card) a comparison against the other card(s) at the same price point and with similar performance.
I already knew a GTX 980 SLi would run my games better than a GTX 970 SLi. However, what I still don’t know is if returning my GTX 970s to get a R9 290(X) Crossfire setup will be better for my gaming experience as far as frame-pacing is concerned.
I learnt nothing useful from these tests. I still don't know if the stutter is due to a weaker chip or due to the memory system. I need to see more "weaker" chips tested in the exact same environment conditions to make a more informed decision.
I have to say I am very disappointed. I really thought you (PCPer) were taking that extra step to inform your readers about their options.
Thanks, for nothing.
Man, if you have zero analytical skills and can't draw your own conclusions from this data, don't blame Ryan.
How can the stutter be from a weaker chip when the stutter starts right after passing 3.5GB of VRAM usage? If it was the chip, it would be way more gradual.
Use your head man!
The stutter is above 4GB and the 980 has it too. This shows that the frame variance is not overly linked to memory use, but rather load.
Nvidia guidelines for testing SLI state that the test should be done in isolation? Looks like it. Considering the y axis is in 20ms steps, that is some ugly ass performance. XDMA Xfire would put that to shame and completely own it.
So 5 months on, and you finally get around to testing SLI with FCAT. Do you wonder why everybody considers you an NV PR mouthpiece and shill? What took so long? We all know that there is no hesitation to get AMD cards’ frame latency tested ASAP. What was the motivation to omit SLI frame latency tests for the latest NV products? What were NV orders, and why should we consider you an unbiased and neutral computer hardware review site with consumer interest in mind?
Why do you bother coming here if you hate it so much?
To piss you off.
Now you are what we would call ‘azijnzeiker’. I won’t even bother translating that for you.
Did you even notice there's absolutely NOT a single AMD Crossfire frame-pacing benchmark of the same games (BF4 and CoD:AW), conducted under the same conditions (resolutions and settings) as the nVIDIA cards were tested, submitted in this article – or anywhere on PCPer's website?
I wouldn't mind seeing AMD Crossfire's frame-pacing results validating the current assumption that the GTX 970's stuttering results more from the partitioned memory system than from having fewer CUDA cores than the GTX 980.
Or do I put on my "conspiracy hat" and conclude Ryan already knows that isn't the case; that Ryan already knows AMD CF stutters as much as the GTX 970 SLi stutters under the same exotic conditions, and is only protecting AMD (and AMD's cards) from being humiliated?
See how I could conclude Ryan’s an AMD “PR mouthpiece and shill”, just as easily as you?
I wish I knew what results an AMD R9 290(X) Crossfire produces under the same conditions but I think I'll never know – at least not from PCPer 🙁
Cause THIS IS NOT A STORY ABOUT AMD. That is why there are no AMD results in this. There are plenty of tests done using 970 SLI vs 290X CF on pcper and all over the damn net, if you learned how to use a damn search engine.
AMD has worse frame pacing and that is why the AMD “fans” who are pushing this don’t want that information out. They are demanding slanted and deceptive reporting.
Fair and balanced is offensive to them. They demand that the press reflect their unreality. Their UFO tinfoil hat conspiracy thinking.
The results show them to be wrong over and over. They demand more tests until one shows something that fits their story. Since none do, they claim conspiracy and make up their own false evidence, like the UFO people photoshopping UFOs into pictures.
AMD fans are certifiably crazy, as is AMD PR. They literally put out videos with Nvidia breaking into people's houses and trying to smash their AMD cards. They make false claims against devs who work with Nvidia and point to fake screenshots as evidence for their unfounded claims. They attempt to force the devs out of business unless they join their Gaming Evolved program. They then try to put malicious code into games to harm Nvidia performance and then claim Nvidia is the one doing that when you don't see bias in the benchmarks.
You do however see bias in AMD games. I've preordered games where 7850s were running much faster than GTX 680s.
Hey, at least then we would know if it was time to go pick up an AMD card or not. I’d also like to see GTX 960 data up here, for rounding out with other architectures.
Those people who have bought 2 970s should have one refunded, those who have bought 3 970s should have two refunded, and those like me who have bought ONE should keep it. It is a very good card, and the hardline gamers are never satisfied, never!
We're often treated to fraps fps vs observed fps when multi GPU is tested. I don't see it in this mini review. Maxwell SLI is BORKED and has been since release, dropping frames and runting by the bucketload. This is why some ppl come into the chat screaming shill and the like, because it looks to me like pcper is tip-toeing around the issue, and yet they went full bore on the AMD frame pacing issues. I repeat again, Maxwell SLI is broken, possibly a hardware issue, and that is why FCAT testing has fallen out of favour.
Where is FCAT? This site promoted and rammed FCAT down everyone's throat for the past year or so, when Nvidia was slightly ahead of AMD in frame times. But now that has changed, PCPER refuses to use FCAT because it is forbidden by Nvidia.
Any site that doesn't show FCAT is clearly under Nvidia PR funding or just biased, and should not be taken seriously.
PS. If I remember correctly PCPER and TechReport worked alongside Nvidia to create FCAT to test AMD cards. So why are things different now for Nvidia?
You are either a troll or not too bright.
They have done the testing you asked for, you are just too stupid to realise it.
lol PCPer just lost the last little bit of credibility it had left by not testing the AMD R9 290X/290/295X, and by refusing to use FCAT on Nvidia as it did for AMD.
Funny that you posted this comment on the review that is full of FCAT on Nvidia data.
PcPer = nVidia damage control squad.
If you guys would get off of your fanboy horse for more than 5 seconds, you might realize that Ryan *agrees* that there is a VRAM related issue on the 970's.
Allyn that would require them to use their brains, which they clearly don’t have.
It would require them to take their AMD blinders off as well.
To go along with the pizza analogy. I would like to say that it doesn’t matter if 7 slices is enough. The point is, I paid for 8, so give me 8, and whether or not I need the 8th is my business.
What a bunch of patented trolls!
"FCAT", "frame-pacing" benchmarks, "frametime variance" benchmarks ALL are the same and do the exact same damn thing: identify dropped frames, runt frames, micro-stuttering, and other problems that reduce the visible smoothness of the action on-screen. They analyse the perceived gaming experience the user gets on-screen; what the player's actually seeing, as far as frametimes are concerned.
Refer to: http://www.geforce.com/hardware/technology/fcat/technology
Refer to: http://www.anandtech.com/show/7195/amd-frame-pacing-explorer-cat138/2
Refer to: http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking
AND STFU!!!
Litigate, litigate, litigate:
http://bursor.com/investigations/nvidia/
It's a class action lawsuit; enjoy the $10 you are going to win if you are lucky, while the lawyer takes the rest. Suckers.