Mobile GPU Testing with Frame Rating and FCAT
But I am here today to talk to you about performance, particularly that of the new GeForce GTX 780M flagship GPU. Testing notebook GPUs is often more difficult than desktop testing because of the typical lack of removable components, and this article is no different. Our GTX 780M performance test system is the MSI GT60, based on a Haswell platform. The competing Radeon HD 7970M is tested in an Alienware M17x with an Ivy Bridge processor.
Keep in mind that AMD recently announced the new Radeon HD 8970M – we are waiting on our review sample of that GPU to reach our offices and we’ll do another story on that in the next week or two. I don’t expect performance to increase more than 5-10% from the HD 7970M though.
| GeForce GTX 780M Test System Setup | |
| --- | --- |
| CPU | Intel Core i7-4770MQ (Haswell) |
| Motherboard | Mobile H87 Platform |
| Memory | 16GB DDR3-1600 |
| Hard Drive | 120GB SSD |
| Sound Card | On-board |
| Graphics Card | NVIDIA GeForce GTX 780M 4GB |
| Graphics Drivers | 320.21 |
| Power Supply | Internal |
| Operating System | Windows 8 Pro x64 |
| Radeon HD 7970M Test System Setup | |
| --- | --- |
| CPU | Intel Core i7-3720QM (Ivy Bridge) |
| Motherboard | Mobile H77 Platform |
| Memory | 8GB DDR3-1600 |
| Hard Drive | 750GB HDD 7200 RPM |
| Sound Card | On-board |
| Graphics Card | AMD Radeon HD 7970M 3GB |
| Graphics Drivers | 13.6 Beta |
| Power Supply | Internal |
| Operating System | Windows 7 x64 |
Even though the processors are very different, both provide an ample base system for running our gaming tests. We already tested Haswell in a desktop configuration and found it to be faster than Ivy Bridge – but not by much, and especially not in gaming workloads. The only area where we will make note of the difference is idle power consumption.
To test the GTX 780M against the Radeon HD 7970M we used our Frame Rating capture-based method of GPU performance evaluation. This method uses hardware-based external capture and an on-system overlay to evaluate real-world performance in a way that software like FRAPS cannot. For this to work, of course, we had to connect each test system to an external display, both over mini-DisplayPort. Testing was done at 1920×1080, the native resolution of both notebooks.
Full details on our Frame Rating metrics, how the testing is done, and how to read the graphs it generates are below! If you already know all about our testing process and data, you can skip right ahead to the next page for our results!
The PCPER FRAPS File
While the graphs above are produced by the default version of the scripts from NVIDIA, I have modified and added to them in a few ways to produce additional data for our readers. The first file shows a subset of the data from the RUN file described below: the average frame rate over time as defined by FRAPS, though we are combining all of the GPUs we are comparing into a single graph. This basically emulates the data we have been showing you for the past several years.
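To make the idea concrete, here is a rough Python sketch – not the actual NVIDIA/FCAT scripts, and using an invented list of frame times – of how per-second average FPS can be bucketed from FRAPS-style frame time data:

```python
# Illustrative sketch only: bucket FRAPS-style frame times (in milliseconds)
# into one-second intervals and report the average FPS for each second.
def fps_per_second(frame_times_ms):
    """Return a list with one average-FPS value per elapsed second."""
    buckets = []
    elapsed_ms = 0.0
    for frame_time in frame_times_ms:
        second = int(elapsed_ms // 1000)   # which second this frame starts in
        while len(buckets) <= second:
            buckets.append(0)
        buckets[second] += 1               # frames counted in a second ~= FPS
        elapsed_ms += frame_time
    return buckets

# Example: a steady ~60 FPS second followed by a second containing a 100 ms hitch
print(fps_per_second([16.7] * 60 + [16.7] * 54 + [100.0]))
```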
The PCPER Observed FPS File
This graph takes a different subset of data points and plots them similarly to the FRAPS file above, but this time we are looking at the “observed” average frame rates, shown as the blue bars in the RUN file. This takes out the dropped and runt frames, giving you the performance metric that actually matters – how many frames are being shown to the gamer to produce the animation sequences.
As you’ll see in our full results on the coming pages, a big difference between the FRAPS FPS graph and the Observed FPS graph indicates cases where the gamer is likely not getting the full benefit of the hardware investment in their PC.
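Continuing the sketch above – again just an illustration, with an invented (frame_time_ms, status) record format rather than FCAT’s actual output – the observed FPS simply excludes frames flagged as runts or drops before counting:

```python
# Same one-second bucketing as before, but frames flagged as runts or drops by
# the capture analysis do not count toward the displayed frame rate.
def observed_fps_per_second(frames):
    """frames: list of (frame_time_ms, status), status in {'ok', 'runt', 'drop'}."""
    buckets = []
    elapsed_ms = 0.0
    for frame_time, status in frames:
        second = int(elapsed_ms // 1000)
        while len(buckets) <= second:
            buckets.append(0)
        if status == "ok":            # runts and drops add nothing to the animation
            buckets[second] += 1
        elapsed_ms += frame_time      # time still advances for every rendered frame
    return buckets
```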
The PLOT File
The primary file generated from the extracted data is a plot of calculated frame times, including runts. The numbers here represent the amount of time each frame appears on the screen for the user. A “thinner” line across the time span represents frame times that are consistent and thus should produce the smoothest animation for the gamer. A “wider” line, or one with a lot of peaks and valleys, indicates much more variance and is likely caused by a lot of runts being displayed.
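A minimal example of what this plot boils down to – assuming matplotlib and two invented frame time traces, not our real capture data – looks something like this:

```python
# Rough visualization sketch: frame times over the run, runts included.
import matplotlib.pyplot as plt

def plot_frame_times(frame_times_ms, label):
    elapsed_s, t = [], 0.0
    for frame_time in frame_times_ms:
        t += frame_time
        elapsed_s.append(t / 1000.0)
    plt.plot(elapsed_s, frame_times_ms, linewidth=0.5, label=label)

plot_frame_times([16.7] * 600, "consistent pacing")        # thin, flat line
plot_frame_times([8.0, 25.0] * 300, "runt-heavy pacing")   # wide, noisy band
plt.xlabel("Time (s)")
plt.ylabel("Frame time (ms)")
plt.legend()
plt.show()
```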
The RUN File
While the two graphs above show combined results for a set of cards being compared, the RUN file shows the results from a single card on a particular test run. It is in this graph that you can see interesting data about runts, drops, average frame rate, and the actual frame rate of your gaming experience.
For tests that show no runts or drops, the data is pretty clean. This is the familiar frame-rate-over-time graph that has become the standard for performance evaluation on graphics cards.
A test that does have runts and drops will look much different. The black bar labeled FRAPS indicates the average frame rate over time that traditional testing would show if you counted the drops and runts in the equation – as the FRAPS FPS measurement does. Any area in red is a dropped frame – the wider the red area, the more colored bars from our overlay were missing in the captured video file, indicating the gamer never saw those frames in any form.
The wide yellow area represents runts – the thin bands of color in our captured video that we have determined do not add to the animation of the image on the screen. The larger the yellow area, the more often those runts appear.
Finally, the blue line is the measured FPS over each second after removing the runts and drops. We are going to be calling this metric the “observed frame rate” as it measures the actual speed of the animation that the gamer experiences.
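For readers who want a mental model of how the red, yellow, and blue regions come about, here is a hypothetical classification pass. The scanline-count record format and the ~20-line runt cutoff are assumptions for illustration, not the exact values our extractor uses:

```python
# Assumed input: for each rendered frame, how many scanlines its overlay color
# occupied in the captured video. Zero scanlines means the frame was dropped.
RUNT_THRESHOLD_SCANLINES = 20   # illustrative cutoff, not the extractor's exact value

def classify_frames(scanline_counts):
    """Label each rendered frame as 'ok', 'runt', or 'drop'."""
    labels = []
    for scanlines in scanline_counts:
        if scanlines == 0:
            labels.append("drop")   # red: the frame's color never reached the display
        elif scanlines < RUNT_THRESHOLD_SCANLINES:
            labels.append("runt")   # yellow: too thin to contribute to the animation
        else:
            labels.append("ok")     # counts toward the blue observed FPS line
    return labels

print(classify_frames([540, 3, 0, 537]))   # ['ok', 'runt', 'drop', 'ok']
```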
The PERcentile File
Scott introduced the idea of frame time percentiles months ago, but now that we have different data gathered through direct capture as opposed to FRAPS, the results might be even more telling. In this case, FCAT shows percentiles not by frame time but by instantaneous FPS. This tells you the minimum frame rate that will appear on the screen for any given percentage of time during our benchmark run. The 50th percentile should be very close to the average frame rate of the benchmark, but as we creep closer to 100% we see how the frame rate is affected.
The closer this line is to perfectly flat, the better, as that would mean we are running at a constant frame rate the entire time. A steep decline on the right-hand side tells us that frame times are varying more and more frequently, which might indicate potential stutter in the animation.
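In code form, the idea reduces to something like the following sketch (the frame times are invented and the percentile math is the general approach, not FCAT’s exact script):

```python
# Convert each frame time to an instantaneous FPS value, then report the frame
# rate met or exceeded for a given percentage of the run.
import numpy as np

def min_fps_at_percentile(frame_times_ms, percent):
    fps = 1000.0 / np.asarray(frame_times_ms, dtype=float)
    # The frame rate exceeded `percent` of the time is the (100 - percent)th
    # percentile of the instantaneous FPS distribution.
    return float(np.percentile(fps, 100 - percent))

times = [16.7] * 950 + [40.0] * 50          # mostly ~60 FPS with a few slow frames
for p in (50, 90, 95, 99):
    print(p, round(min_fps_at_percentile(times, p), 1))
```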
The PCPER Frame Time Variance File
Of all the data we are presenting, this is probably the one that needs the most discussion. In an attempt to create a new metric for gaming and graphics performance, I wanted to find a way to define stutter based on the data sets we had collected. As I mentioned earlier, we can define a single stutter as a variance level between t_game and t_display. This variance can be introduced in t_game, in t_display, or at both levels. Since we can currently only reliably test the t_display rate, how can we create a definition of stutter that makes sense and that can be applied across multiple games and platforms?
We define a single frame variance as the difference between the current frame time and the previous frame time – a measure of how consistent the two frames presented to the gamer are. However, as I found in my testing, plotting this frame variance produces nearly a perfect match to the data presented by the minimum FPS (PER) file created by FCAT. To be more specific, stutter is only perceived when there is a break from the previous animation frame rates.
Our current running theory for stutter evaluation is this: find the current frame time variance by comparing the current frame time to the running average of the frame times of the previous 20 frames. Then, by sorting these variance values and plotting them in percentile form, we get an interesting look at potential stutter. Comparing the frame times to a running average rather than just to the previous frame should prevent false positives from legitimate performance peaks or valleys found when moving from a highly compute-intensive scene to a less demanding one.
While we are still trying to figure out if this is the best way to visualize stutter in a game, we have seen enough evidence in our game play testing, and in comparing the above graph to other data generated through our Frame Rating system, to be reasonably confident in our assertions. So much so, in fact, that I am going to call this data the PCPER ISU – an acronym, International Stutter Units, that beer fans will appreciate.
To compare these results you want to see a line that is as close to the 0ms mark as possible, indicating very little frame time variance when compared to a running average of previous frames. There will be some inevitable incline as we reach the 90+ percentile, but that is expected with any game play sequence that varies from scene to scene. What we do not want to see is a sharper upward curve, which would indicate higher frame variance (ISU) and could be a sign that the game suffers from microstuttering and hitching problems.
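As a rough sketch of the calculation described above – the 20-frame window comes from our description, while the record format and example data are assumed for illustration – the percentile curve could be built like this:

```python
# Compare each frame time to the running average of the previous 20 frame
# times, then sort the deviations into a percentile curve (the "ISU" plot).
import numpy as np

def frame_variance_percentiles(frame_times_ms, window=20):
    times = np.asarray(frame_times_ms, dtype=float)
    deviations = []
    for i in range(window, len(times)):
        running_avg = times[i - window:i].mean()
        deviations.append(abs(times[i] - running_avg))  # ms away from the recent pace
    deviations.sort()
    percentiles = np.linspace(0, 100, num=len(deviations))
    return percentiles, np.asarray(deviations)          # x and y for the ISU curve

# Example: steady pacing with a single 20 ms hitch in the middle of the run
x, y = frame_variance_percentiles([16.7] * 200 + [36.7] + [16.7] * 200)
print(round(float(y[-1]), 1))   # the worst-case deviation sits at the far right
```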
-dusts hands-
Alright, gentleman, it’s been done. Desktop GPU in a notebook. Good game everyone, good game.
Nothing more can be done; laptop gaming has reached the point where it has the same performance as a desktop. (No, not the same performance, but daaamn it’s getting there.)
Are you out of your mind lol?
You can't begin to compare the two. Yeah, maybe a recent mobility graphics card matches the performance of a PC card from, say, 2-3 years ago.
You do realise that the 780m is a desktop 680; to be able to fit that in a laptop nowadays is incredible.
E3 AND a gpu review, clearly there is nothing Ryan cannot do.
I would like to see the other mGPU’s also put to the test, but I guess that’ll probably happen by proxy with upcoming notebook reviews.
Also, the link under Metro is for Dirt3.
Thanks for the note on the Metro page; updated!
This has become my favorite GPU review site due to the great accuracy and detail of latencies and actual frames displayed.
PC PERSPECTIVE has really raised the bar on GPU gaming testing. Let's hope the tech world takes notice.
Thank you!!
I’m concerned for AMD. I hope they get well soon.
Oh, and nice review.
Radeon 8xxx series soon no? Only time will tell. 😉
Hmm, interesting. For almost 2K dollars it sounds good to me, but how about the temps on this one?
I've got to add, I love your reviews; the frame latency data adds so much depth. Average-FPS-only reviews don't measure up to yours.
Keep the great reviews rolling.
And yet when I play Skyrim and look down from Dragonsreach in Whiterun I dip down to 15 fps on my reference GTX 780… I have no idea.
Can you guys include or add in a release date for it, because I’ve been trying to find it all over the “interwebs” and I just can’t seem to find it anywhere at all.
I am running a 780m in an MSI GT70. Metro: Last Light drops to around 28 fps with 4x MSAA on; I need to drop to 2x for a smoother ride. Metro 2033 runs worse than Last Light, so clearly the coding has improved.