Another Wrench – GeForce GTX 760M Results
We found some interesting issues with the GeForce GTX 760M in our Frame Rating testing.
Just recently, I evaluated some of the current processor-integrated graphics options using our new Frame Rating performance metric. The results were very interesting, proving Intel has done some great work with its new HD 5000 graphics option for Ultrabooks. You might have noticed that the MSI GE40 didn’t just come with integrated HD 4600 graphics but also included a discrete NVIDIA GeForce GTX 760M on board. While that previous article focused on the integrated graphics of Haswell, Trinity, and Richland, I did find some noteworthy results with the GTX 760M that I wanted to investigate and present.
The MSI GE40 is a new Haswell-based notebook that includes the Core i7-4702MQ quad-core processor and Intel HD 4600 graphics. Alongside it, MSI has included the Kepler-architecture GeForce GTX 760M discrete GPU.
This GPU offers 768 CUDA cores running at a 657 MHz base clock but can stretch higher with GPU Boost technology. It is configured with 2GB of GDDR5 memory running at 2.0 GHz.
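To put those memory numbers in perspective, here is a quick back-of-envelope calculation. Note that the 128-bit memory bus is an assumption based on commonly published GTX 760M specifications, not something MSI lists for this machine:

```python
# Rough theoretical memory bandwidth for the GTX 760M.
# Bus width is an assumption from commonly published specs.
mem_clock_ghz = 2.0                          # GDDR5 clock as quoted above
effective_gbps_per_pin = mem_clock_ghz * 2   # data on both clock edges -> ~4.0 Gbps/pin
bus_width_bits = 128                         # assumed bus width
bandwidth_gb_s = effective_gbps_per_pin * bus_width_bits / 8
print(f"~{bandwidth_gb_s:.0f} GB/s theoretical peak")  # ~64 GB/s
```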
If you didn’t read the previous integrated graphics article, linked above, some of the data presented there will be spoiled here, so you might want to get a baseline of information by reading through it first. Also, remember that we are using our Frame Rating performance evaluation system for this testing – a key differentiator from most other mobile GPU testing. In fact, it is that difference that allowed us to spot an interesting issue with the configuration we are showing you today.
If you are not familiar with the Frame Rating methodology, and how we had to change some things for mobile GPU testing, I would really encourage you to read this page of the previous mobility Frame Rating article for the scoop. The data presented below depends on that background knowledge!
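As a very rough illustration of the core idea (a simplified sketch, not our actual extraction tool): each rendered frame carries a colored overlay, the display output is captured, and frame times fall out of how many scanlines each color occupies in the capture. All names and numbers below are illustrative:

```python
# Toy sketch of the Frame Rating idea: frame times derived from how long
# each per-frame overlay color stays on screen in a captured video.
CAPTURE_RATE_HZ = 60.0
SCANLINES_PER_CAPTURE = 1080  # assuming a 1080p capture

def frame_times_ms(scanline_counts):
    """scanline_counts: scanlines each unique overlay color occupied,
    in display order, across the captured video."""
    ms_per_scanline = (1000.0 / CAPTURE_RATE_HZ) / SCANLINES_PER_CAPTURE
    return [count * ms_per_scanline for count in scanline_counts]

# Example: three frames occupying 540, 1080, and 1620 scanlines
print(frame_times_ms([540, 1080, 1620]))  # ~[8.3, 16.7, 25.0] ms
```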
Okay, you’ve been warned – on to the results.
3DMark (2013) Cloud Gate and Skyrim
With a graphics score advantage of about 2.15x over the next closest competitor, the NVIDIA GeForce GTX 760M is clearly in a different class when it comes to raw GPU performance. The integrated solutions from AMD and Intel aren’t able to keep up at all, proving there is still a lot of room for discrete graphics solutions in mobile platforms.
The first game we tested was Skyrim and things look like you would expect – the GTX 760M is able to bring in a much higher average frame rate (115 FPS) than the integrated solutions and does so with a smooth and tight frame time graph as well. Frame time variance never goes over the 2 ms mark and all is well. But something odd cropped up in other games…
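For readers curious how a frame time variance figure like that can be derived, here is a minimal sketch. The moving-average definition and window size are assumptions for illustration, not the exact formula behind our graphs:

```python
# One plausible definition of frame time variance: how far each frame time
# deviates from a short running average of recent frames. Window size and
# exact definition are assumptions, not the precise formula used here.
def frame_time_variance(frame_times_ms, window=20):
    variance = []
    for i, ft in enumerate(frame_times_ms):
        recent = frame_times_ms[max(0, i - window):i] or [ft]
        avg = sum(recent) / len(recent)
        variance.append(abs(ft - avg))
    return variance

# A steady 8.7 ms cadence (~115 FPS) keeps variance near zero;
# a single 14 ms hitch shows up as a ~5 ms spike.
print(frame_time_variance([8.7] * 10 + [14.0] + [8.7] * 10))
```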
Does it matter what the game settings were for these tests?
How come the frame buffer has to be copied? Why can’t the internal display have two inputs (one from IGP and one from Nvidia) and when you launch a game you switch the display to use the Nvidia output?
Just not how it works…
As in it’s not technically feasible or they just didn’t implement it that way?
“Then, by sorting these frame times and plotting them in a percentile form we can get an interesting look at potential stutter.”
Ryan,
No doubt I am an idiot, but I just do not get what the percentiles represent. I have read all the frame rating articles, so it is not for a lack of trying. Please explain so an idiot can understand. Thanks much.
My only guess is that it is the absolute number of results that fall in that percentile. Why is the most variance always in the largest percentile?
Percentiles tell you what fraction of the data points occur at or above that result.
For example, on the Bioshock page, look at the FPS by Percentile graph. The blue line representing the GE40 with the GTX 760M enabled is a measurement of frame rates / frame times by percentile. So – look at the 80th percentile and look up – the blue line is hitting about 95 FPS. That means the GTX 760M is running at less than 95 FPS (to the right) for 20% of the time. And thus, 80% of the time, the average frame rate is higher than 95 FPS (to the left).
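If a concrete example helps, here is a minimal sketch of building an FPS-by-percentile curve from frame times (illustrative code, not the actual analysis tool):

```python
# Build an FPS-by-percentile curve from frame times: at percentile p,
# p% of frames ran at or above the reported frame rate.
def fps_by_percentile(frame_times_ms):
    fps = sorted((1000.0 / ft for ft in frame_times_ms), reverse=True)
    n = len(fps)
    # index i covers percentile (i+1)/n * 100
    return [(100.0 * (i + 1) / n, rate) for i, rate in enumerate(fps)]

# Example: mostly ~10 ms frames (~100 FPS) with a couple of slow 20 ms frames
times = [10.0] * 8 + [20.0] * 2
for pct, rate in fps_by_percentile(times):
    print(f"{pct:5.1f}th percentile: {rate:.0f} FPS")
# 80% of frames run at 100 FPS or better; the slowest 20% drop to 50 FPS.
```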
That help?
Got it. Just before I read your explanation, I thought of it in terms of test scores. With those, of course, if you are in the 90th percentile, your score is in the top 10%; alternatively, your score is higher than 90% of the rest of the scores. Your explanation confirms this of course. I should have thought of this first. Thanks much and keep up the awesome work.
so if I am reading this report correctly – the software choice to run SLI between the laptop graphics and the NVIDIA discrete graphics card has now introduced frame rating issues like AMD has.
Most probably because the Intel CPU does not contain the hardware segment that prevents the issues when you SLI parts – though I am probably talking out of my butt, since no one’s really thought out the issues of CrossFireX yet…
but the scenario is the same as you would probably see if this was an AMD platform running CrossFireX – though that setup would be worse than the NVIDIA issue.
So after all that talking – am I right or wrong? I really hate being wrong all the time…
It's not quite the same problem that exists with AMD on the desktop side…not nearly as dramatic.
As for if AMD sees the same problem in notebooks…we are working on getting a platform to test!
Memory bottleneck during the transfer from the discrete GPU’s VRAM to main memory perhaps? Possibly paging related as well (I note the HDD isn’t listed; was an SSD used, or a traditional HDD?)
Possibly on the memory transfer speed, but definitely not paging related.
NVIDIA has told me a fix is in the works.
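For what it’s worth, a back-of-envelope estimate (my assumptions, not NVIDIA’s numbers) suggests raw copy bandwidth alone shouldn’t saturate the link – each 1080p frame is only about 8 MB:

```python
# Back-of-envelope Optimus copy traffic: each rendered frame must be copied
# from the discrete GPU's VRAM to the IGP's framebuffer in system memory.
# Link width and usable bandwidth are assumptions for illustration.
width, height, bytes_per_pixel = 1920, 1080, 4
frame_mb = width * height * bytes_per_pixel / 1e6   # ~8.3 MB per frame
fps = 115                                           # Skyrim average above
traffic_gb_s = frame_mb * fps / 1e3                 # ~0.95 GB/s
pcie3_x8_gb_s = 7.9                                 # approx usable bandwidth
print(f"{frame_mb:.1f} MB/frame -> {traffic_gb_s:.2f} GB/s "
      f"({100 * traffic_gb_s / pcie3_x8_gb_s:.0f}% of PCIe 3.0 x8)")
```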
So NVIDIA acknowledged the issue and told you a fix was in the works back in July. Has this problem been resolved? I recently purchased a Sager NP8235 that runs a GTX 770M in Optimus configuration, and I believe I am seeing the same issues. I am running the latest drivers, so I do not believe a fix has been implemented. Could you talk to the people you know at NVIDIA again to get an update on this issue?
This issue is pretty huge since pretty much 100% of notebook computers sold with NVIDIA GPUs today run Optimus configurations. This essentially means that IF this issue is not actually fixable through a driver update, then 100% of the NVIDIA gaming laptops being sold today (and the majority that have been sold over the past couple years) are useless!
I hope the MSI has an option in the BIOS to just “turn off” the IGP so Windows won’t detect it and just run entirely off the nvidia gpu, but that defeats the purpose of long battery life for portability….
Optimus still has its issues but I’m sure it’s more stable than what it is with my laptop, which has a GT 540M; I had to reinstall the OS a few times already because of Optimus not kicking on the GPU…
I think that testing 1080P gaming on a notebook is a mistake. I know that PCPer caters to gamers and DIY and gearhead types, but if you go to Newegg or Tiger Direct, 75% of the notebooks being sold (under $1200) have a resolution of 1366×768 or less. Anybody can attach an external monitor, but how many people actually do?
If the only difference between an A8 notebook and an i3 notebook was the framerate of BF3, I’d want to know what the framerate is like when I’m sitting in the commons at school, not plugged in at my lab at home.
I just picked up a Razer blade with a similar setup (haswell/gtx765) and noticed an abnormal amount of tearing immediately. I hope they fix this, it’s quite obvious.
This is the same problem my EeePC 1215N has (Atom™ D525 + ION2 w/ Optimus). Many people reported tearing and stuttering on the laptop’s LCD but no problems when connected to an external LCD over HDMI. The top theory was that the PCIe x1 interface between the ION and Atom was the culprit, but that doesn’t make sense if HDMI out is good. BTW, neither ASUS nor NVIDIA could really fix the problem, which left many disappointed owners. I typically game on my GE40 connected to an external LCD so I haven’t had a problem yet (other than Steam’s app forcing the NVIDIA GPU on all the time). But as of this post neither MSI nor NVIDIA have a BIOS over 311 that supports the GTX 7XX series.
What settings were used for all of these tests?
I just bought a GE40; in the system properties it’s shown that the GTX 760M is installed. However, in the display properties it’s not shown and only the shared system memory is listed.
Is this normal for these new GPUs?
Anyone please advise, thanks