Gaming Performance on Battery
All of the testing on the previous page was done with the MSI GT72 notebook plugged into the wall, drawing its maximum power and allowing both the GPU and CPU in the system to run at their maximum speeds within the thermal limits of the chassis. However, something interesting happens when you unplug a gaming notebook from the wall: available power shrinks dramatically.
Pardon the crappy font here…not sure what happened to NVIDIA's slides…
Even before considering how long the battery will last when gaming, the maximum power available to the system components is cut by more than half when a modern notebook runs without AC power. Based on our testing while playing games on the system, the GTX 980M-based GT72 notebook would draw just over 200 watts while plugged in (measured with a Watts Up device), while that same hardware drew only 80-95 watts while running on battery (discharge rate measured with the BatteryMon application).
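For readers who want to replicate the discharge-rate measurement without a Windows-only tool like BatteryMon, the same number can usually be read from the operating system. Below is a minimal Python sketch for Linux, assuming the battery is exposed through the standard sysfs interface (the `BAT0` name and which attribute files exist vary by vendor):

```python
def battery_discharge_watts(base="/sys/class/power_supply/BAT0"):
    """Estimate system power draw from the battery discharge rate,
    similar in spirit to what BatteryMon reports on Windows."""
    try:
        # Some firmwares expose power directly (in microwatts)...
        with open(f"{base}/power_now") as f:
            return int(f.read()) / 1_000_000
    except FileNotFoundError:
        # ...others expose current (microamps) and voltage (microvolts),
        # so multiply to get watts.
        with open(f"{base}/current_now") as f:
            amps = int(f.read()) / 1_000_000
        with open(f"{base}/voltage_now") as f:
            volts = int(f.read()) / 1_000_000
        return amps * volts
```

Polled once per second during a gaming session, a reading like this is how you would see the 80-95 watt on-battery figure quoted above.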
That is a significant drop and it causes a lot of changes in the hardware. You'll see much lower clocks on the GTX 980M, lower speeds on the Intel Haswell processor and as a result, lower frame rates when gaming.
To address this, NVIDIA has updated Battery Boost to not only attempt to extend battery life while gaming, but to improve the overall gaming experience when running on a battery. It does so by implementing a secondary set of gaming quality presets in GeForce Experience; now in addition to having a set for maximum power you'll have one that sets the game to play best at the performance levels you'll get when running on the battery.
The result will be different image quality settings when you are running on AC versus when you are running on battery, which might seem like a bad idea at first, but the truth is that it is necessary to guarantee a high quality mobile gaming experience. Let's demonstrate why.
In the following sets of graphs you'll see a couple of games run on the same hardware in four different modes.
- On AC power, plugged in at the wall, with the Ultra/Very High quality presets we used in our previous performance testing (GeForce GTX 980M on AC, black line)
- The same Ultra/Very High quality settings but running on battery (GeForce GTX 980M on Battery, orange line)
- Running the game at the GeForce Experience determined quality settings, using the default frame rate cap of 30 FPS (GeForce GTX 980M Battery Boost, pink line)
- Running the game at the same GFE quality settings as above, but moving the frame rate cap up to 60 FPS (GeForce GTX 980M Battery Boost 60 FPS, green line)
It's very important to note that the quality settings of #1 and #2 above are the same, while the quality settings of #3 and #4 are the same, but the two sets are not equal. We are plotting all of this on the same graph so please keep that in mind!
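Two of the four modes above rely on a frame rate cap (30 or 60 FPS). Mechanically, a cap is just frame pacing: finish the frame, then wait out whatever is left of the frame's time budget. The sketch below is a hypothetical application-level limiter in Python; the real Battery Boost cap is enforced at the driver level, but the principle is the same:

```python
import time

def run_capped(render_frame, fps_cap=30, frames=120):
    """Pace calls to render_frame so they never exceed fps_cap.

    A minimal sketch of a frame rate limiter: each frame gets a
    fixed time budget, and any budget left after rendering is
    slept away instead of starting the next frame early.
    """
    budget = 1.0 / fps_cap              # seconds allotted per frame
    times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                  # the (hypothetical) game's work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # burn the leftover budget
        times.append(time.perf_counter() - start)
    return times                        # per-frame times, each >= budget
```

The side effect that matters for battery life: every millisecond slept is a millisecond the GPU is not drawing full power.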
There is a lot of data here, so let's dissect it. First, look at the Observed FPS graph and compare the 980M on AC and on battery at the same image quality settings (black and orange lines). While on AC the game averaged about 65 FPS on the Ultra preset; on battery, with no image quality changes or GFE presets enabled, the game's average FPS dropped to 37 FPS, a decrease of roughly 43% (put another way, the AC result is about 76% faster). You can also see that while running at the slower frame rate, the game had some instances of very high frame time variance, making a low frame rate even less enjoyable.
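The two ways of expressing that gap (percent drop from the AC number versus percent the AC number is faster) are easy to mix up, so here is the arithmetic spelled out:

```python
def perf_drop(ac_fps, battery_fps):
    """Express the AC-vs-battery gap both ways."""
    drop = (ac_fps - battery_fps) / ac_fps * 100   # % lost going to battery
    speedup = (ac_fps / battery_fps - 1) * 100     # % AC is faster
    return drop, speedup

# BF4 numbers from above: 65 FPS on AC, 37 FPS on battery
drop, speedup = perf_drop(65, 37)   # roughly a 43% drop; AC is ~76% faster
```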
Now look at the pink line, representing NVIDIA's GeForce Experience defaults for gamers running BF4 on the battery. The frame rate holds at 30 FPS thanks to a frame rate cap and slightly lower image quality settings. Yes, that is lower than the average frame rate of the 980M at the Ultra preset shown with the orange line, but the experience of the game is much better with the GFE profile enabled. In my play time there were no stutters and no hitches, though if you are an experienced gamer you will likely be able to tell you are running at 30 FPS.
Finally, let's evaluate the green line which represents me changing the frame rate limit from 30 FPS to 60 FPS in GeForce Experience. The result is a higher frame rate, but again, more stutter and more frame variance. Clearly the settings that NVIDIA has selected for this game on battery are tuned to the GTX 980M very specifically and hitting 30 FPS was the goal.
I also tested Crysis 3, one of the most demanding games for modern GPUs. Let's see how it compares to the experience with BF4 above.
Again, please note the image quality settings are different between these lines, so be sure you are comparing them in the right mindset. Looking at the black line, the optimal performance at the Very High quality settings in-game with AC connected, the GTX 980M is able to pull in an average frame rate of about 35 FPS. Simply unplugging the machine and restarting the game, without making any changes to the game settings, moves that average frame rate (the orange line) down to 19 FPS or so; that's a drop of roughly 46% (the AC result is about 84% faster). Also notice the wide swings in frame times; the on-battery experience is quite bad.
But, utilizing the NVIDIA GFE software once again and enabling a battery-specific preset for image quality and frame rate limit, we are met with a very smooth and consistent 30 FPS gaming experience in Crysis 3. Finally, I pushed the frame rate limit from 30 FPS to 60 FPS at those same settings, and though the average frame rate jumps to about 60, the frame variance goes up considerably. When 10% of your frames are showing more than 4ms of frame time difference, that's a bad result.
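That "10% of frames" style of metric can be reproduced from a frame time log. Here is a simple Python sketch; the exact definition of frame variance used in our charts may differ, so treat this as an assumed, illustrative version that counts frame-to-frame swings larger than a threshold:

```python
def frame_variance_share(frame_times_ms, threshold_ms=4.0):
    """Fraction of frames whose frame time differs from the previous
    frame's by more than threshold_ms (an assumed, simplified
    stand-in for the frame variance metric discussed above)."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    if not deltas:
        return 0.0
    return sum(d > threshold_ms for d in deltas) / len(deltas)
```

A perfectly paced 30 FPS run (every frame at 33.3 ms) scores 0.0 here; the stuttery 60 FPS-capped run is the kind of log that pushes the share past 10%.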
Yeah, OK, but “just” and “2 grand” appearing in the same phrase? I appreciate that the performance is really good, but a 970 is $330. A great laptop is $1000 and would have high-end processing, an SSD, a good chunk of RAM, a great screen, etc. This is a premium product at a more-than-premium price point. I would expect and demand top-tier performance.
That said, holy cow it’s a nice part.
Yeah, you don’t buy too many laptops do you?
“2500 MHz 7.0 (GT/s)”
In the table on the first page, why would you put one in MHz and the other in GT/s?
only a little slower than a desktop 970!?! wow I think I found my new laptop
Well, there are your specs for the upcoming GTX 960: 1280 cores, 192-bit bus, 3GB of GDDR5.
I am wondering why review sites are acknowledging DX12 (or an even more advanced DX version), whereas when the actual DX12 gets released by Microsoft, this hardware will have an incomplete DX12 implementation.
Good performer, finally a laptop that can really game at its native resolution, but at $2300+ it's just not worth it. And you have to admit that a machine like this is really a transportable desktop rather than a laptop; the thing will always be plugged into a wall….
I want to see a Gigabyte Brix or something like an Intel NUC with one of these new maxwell mobiles squeezed in….heh
The problem has been that the Brix are usually quite loud. I'd love to see that fixed!
That’s because they shoved a desktop GTX 660Ti in there… A GTX 970M/980M would do MUCH better.
I was wondering…
Some computers with a 980m will come with a power supply of 180w.
Does that mean that the card will not be able to run at its maximum power?
Any idea about the 970M?
Thanks
Not really sure on that. If the power supply is only 180 watts, then I would assume the GPU would run at lower clocks or Boost less frequently. Remember that NVIDIA doesn't set TDPs or even typical Boost clocks on mobile parts for this very reason.
Thanks for your answer.
I guess each laptop has to be tested separately.
Is it possible to upgrade the power supply? Would that solve the problem?
Depends on the manufacturer.
That’s a lot of heat they have to push out of the small space, especially if you have a single fan setup like the MSI GT series…
MSI GT72 has 2 fans, one for the CPU and one for the GPU.
http://www.xoticpc.com/gigabyte-p35xv3cf2-eta-early-january-p-7620.html
Tell me what y’all think