DiRT 3 and Closing Thoughts
We see a very similar pattern of results with DiRT 3 – the GTX 760M system exhibits frame issues that cause a difference between the FRAPS-reported performance and the actual, observed performance. Please keep in mind that this is exactly why we developed and have promoted the Frame Rating performance testing methodology: to find issues like this that might not otherwise have been discovered.
This phenomenon occurred in our Battlefield 3 testing as well, but did NOT occur in Left 4 Dead 2, so I am not yet certain what is causing the problem as it is application dependent. After I talked with NVIDIA about it briefly, they were able to replicate the issue but couldn’t yet say whether the bug lies with Intel’s graphics driver, NVIDIA’s driver, or perhaps something to do with hardware bandwidth limitations.
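For readers who want to run a similar sanity check on their own data, here is a minimal sketch (in Python) of comparing the FRAPS-reported average against the observed average derived from a capture. The file names and one-column CSV layout are hypothetical placeholders, not part of our actual Frame Rating tools.

```python
# Minimal sketch: compare FRAPS-reported frame times against the observed
# frame times extracted from a capture. File names and the one-column CSV
# layout are assumptions for illustration only.
import csv

def load_frame_times(path):
    """Read one frame time (in milliseconds) per row from a CSV file."""
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def average_fps(frame_times_ms):
    """Average FPS implied by a list of per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

fraps = load_frame_times("fraps_frametimes.csv")        # what the game engine reports
observed = load_frame_times("observed_frametimes.csv")  # what actually reached the display

print(f"FRAPS-reported average: {average_fps(fraps):.1f} FPS")
print(f"Observed average:       {average_fps(observed):.1f} FPS")
```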
Screenshots and Evidence
So what exactly is going on? After diving into the recorded video from our Frame Rating captures, we found these interesting results.
What you are seeing in the first image, and then zoomed in on in the second image, is a kind of frame transfer bug that causes a frame to be only partially sent to the frame buffer. That transfer gets stalled or halted momentarily and thus the previous frame (in purple) is sent to the display again until the actual next frame (olive-ish) is fully transferred. While you might miss a single instance of this while playing the game, if ENOUGH of it happens you will definitely notice the tearing. If you look back at the Frame Times graph for DiRT 3 above, any time you see the blue line jut down suddenly, we are seeing one of these "tears".
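As a rough illustration of how those sudden dips can be flagged automatically, the sketch below (Python) marks any observed frame time that is dramatically shorter than the frames immediately before it. The 0.3x threshold and the sample data are assumptions for illustration, not values from our analysis tools.

```python
# Rough sketch: flag suspiciously short observed frame times, which is how a
# partially transferred ("torn") frame shows up as a sudden dip in the graph.
# The 0.3x threshold is an arbitrary assumption for illustration.
def find_tears(frame_times_ms, ratio=0.3):
    """Return indices of frames much shorter than the recent local average."""
    tears = []
    for i, ft in enumerate(frame_times_ms):
        window = frame_times_ms[max(0, i - 5):i] or frame_times_ms[:1]
        local_avg = sum(window) / len(window)
        if ft < ratio * local_avg:
            tears.append(i)
    return tears

# Made-up example: the 2.3 ms frame at index 3 would be flagged as a likely tear.
print(find_tears([16.7, 16.9, 17.1, 2.3, 16.8, 16.6]))  # -> [3]
```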
Here is another example of the problem we saw with the HD 4600 + GTX 760M combination. This time, the shift happens between the yellow and orange overlay segments.
Closing Thoughts
The tearing of these frames is obviously an issue that NVIDIA needs to address with its Optimus technology. NVIDIA has been very proud of its frame pacing technology on the desktop side, but my guess is that this is an unintended consequence of Optimus. In my gameplay testing, the performance of the GTX 760M is so much higher than that of the integrated graphics that going discrete is still a much better solution for a mobile user with a focus on gaming.
It seems to be an Optimus-specific issue. When I did my testing with the MSI GT60 that came with GTX 780M graphics, it did not exhibit any of these problems. It did NOT utilize Optimus technology though – something I thought was odd when first receiving the sample; now it kind of makes more sense. Also, depending on the exact configuration of the notebook, some external outputs can bypass the Optimus data path and connect directly to the discrete GPU’s frame buffer, relieving the need to copy data from discrete to integrated graphics memory. The notebook display, though, cannot be bypassed in that way.
We’ll be doing more testing with additional notebooks (both AMD and NVIDIA discrete based) in the next couple of weeks so you can expect some more analysis of this issue. Also, once NVIDIA has some kind of official response on the topic, we’ll update our story as necessary.
Does it matter what the game settings were for these tests?
How come the frame buffer has to be copied? Why can’t the internal display have two inputs (one from IGP and one from Nvidia) and when you launch a game you switch the display to use the Nvidia output?
Just not how it works…
As in it’s not technically feasible or they just didn’t implement it that way?
“Then, by sorting these frame times and plotting them in a percentile form we can get an interesting look at potential stutter.”
Ryan,
No doubt I am an idiot, but I just do not get what the percentiles represent. I have read all the frame rating articles, so it is not for a lack of trying. Please explain so an idiot can understand. Thanks much.
My only guess is that it is the absolute number of results that fall in that percentile. Why is the most variance always in the largest percentile?
Percentiles tell you what portion of the data points occur at or above that result.
For example, on the Bioshock page, look at the FPS by Percentile graph. The blue line representing the GE40 with the GTX 760M enabled is a measurement of frame rates / frame times by percentile. So – look at the 80th percentile and look up – the blue line is hitting about 95 FPS. That means the GTX 760M is running at LESS THAN 95 FPS (to the right) for 20% of the time. And thus, 80% of the time, the average frame rate is higher than 95 FPS (to the left).
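For anyone who prefers to see it in code, here is a minimal sketch (Python, with made-up frame times) of how the FPS-by-percentile curve is built by sorting the per-frame results:

```python
# Minimal sketch of an FPS-by-percentile curve: convert each frame time to an
# instantaneous FPS, sort from fastest to slowest, and read off percentiles.
# The frame time data below is made up for illustration.
def fps_by_percentile(frame_times_ms):
    """Return (percentile, FPS) pairs with the fastest frames first."""
    fps_values = sorted((1000.0 / ft for ft in frame_times_ms), reverse=True)
    n = len(fps_values)
    return [(100.0 * (i + 1) / n, fps) for i, fps in enumerate(fps_values)]

# 100 made-up frame times (ms): mostly fast frames with a few slow ones.
times = [10.0] * 80 + [12.5] * 15 + [25.0] * 5
for pct, fps in fps_by_percentile(times):
    if pct in (50.0, 80.0, 95.0, 100.0):
        print(f"{pct:5.1f}th percentile: {fps:5.1f} FPS")
```

Reading the output the same way: at the 95th percentile the made-up data sits at 80 FPS, meaning 95% of the frames rendered at 80 FPS or better.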
That help?
Got it. Just before I read your explanation, I thought of it in terms of test scores. With those, of course, if you are in the 90th percentile, your score is in the top 10%; alternatively, your score is higher than 90% of the rest of the scores. Your explanation confirms this of course. I should have thought of this first. Thanks much and keep up the awesome work.
so if i am reading this report correctly- the software choice to run sli between the laptop graphics and the nvidia discrete graphics card has now introduced frame rating issues like AMD has.
Most probably because the Intel CPU does not contain the hardware segment that prevents the issues when you sli parts- though i am probably talking out of my butt- since no one’s really thought out the issues of xfire x yet …
but the scenario is the same as you would probably see if this was an amd platform with xfire x running – though that setup would be worse than the nvidia issue.
SO after all that talking- am i right or wrong- i really hate being wrong all the time…..
It's not quite the same problem that exists with AMD on the desktop side…not nearly as dramatic.
As for if AMD sees the same problem in notebooks…we are working on getting a platform to test!
Memory bottleneck during the transfer between the discrete GPU’s VRAM and main memory perhaps? Possibly paging related as well (I note the HDD isn’t listed; was an SSD used, or a traditional HDD?)
Possibly on the memory transfer speed, but definitely not paging related.
NVIDIA has told me a fix is in the works.
So NVIDIA acknowledged the issue and told you a fix was in the works back in July. Has this problem been resolved? I recently purchased a Sager NP8235 that runs a GTX 770M in Optimus configuration, and I believe I am seeing the same issues. I am running the latest drivers, so I do not believe a fix has been implemented. Could you talk to the people you know at NVIDIA again to get an update on this issue?
This issue is pretty huge since pretty much 100% of notebook computers sold with NVIDIA GPUs today run Optimus configurations. This essentially means that IF this issue is not actually fixable through a driver update, then 100% of the NVIDIA gaming laptops being sold today (and the majority that have been sold over the past couple of years) are useless!
I hope the MSI has an option in the BIOS to just “turn off” the IGP so Windows won’t detect it and run entirely off the nvidia gpu, but that defeats the purpose of long battery life for portability…
Optimus still has its issues but I’m sure it’s more stable than what it is with my laptop, which has a GT 540M, and I had to reinstall the OS a few times already because of Optimus not kicking on the GPU…
I think that testing 1080P gaming on a notebook is a mistake. I know that PCPer caters to gamers and DIY and gearhead types, but if you go to Newegg or Tiger Direct, 75% of the notebooks being sold (under $1200) have a resolution of 1366×768 or less. Anybody can attach an external monitor, but how many people actually do?
If the only difference between an A8 notebook and an i3 notebook was the framerate of BF3, I’d want to know what the framerate is like when I’m sitting in the commons at school, not plugged in at my lab at home.
I just picked up a Razer blade with a similar setup (haswell/gtx765) and noticed an abnormal amount of tearing immediately. I hope they fix this, it’s quite obvious.
This is the same problem my EeePC 1215N has (Atom™ D525 + ION2 w/ Optimus). Many people reported tearing and stuttering on the laptop’s LCD but no problems when connected to an external LCD over HDMI. The top theory was that the PCIe x1 interface between the ION and Atom was the culprit, but that doesn’t make sense if HDMI out is good. BTW neither ASUS nor NVIDIA could really fix the problem, which left many disappointed owners. I typically game on my GE40 connected to an external LCD so I haven’t had a problem yet (other than Steam’s app forcing the NVIDIA GPU on all the time). But as of this post neither MSI nor NVIDIA have a BIOS over 311 that supports the GTX 7XX series.
What settings were used for all of these tests?
I just bought a GE40, and in the system properties it’s shown that the GTX760M is installed. However, in the display properties it’s not shown and all that’s available is the shared system memory.
Is this something normal for these new VGAs?
Anyone please advise, thanks