Gaming Performance on Battery

All of the testing on the previous page was done with the MSI GT72 notebook plugged into the wall, drawing its maximum power and allowing both the GPU and the CPU to run at full speed within the thermal limits of the chassis. However, something interesting happens when you unplug a gaming notebook from the wall: the available power shrinks dramatically.


Setting aside how long the battery will last while gaming, the maximum power available to the system components is cut by more than half when a modern notebook runs without AC power. In our testing while playing games on the system, the GTX 980M-based GT72 drew just over 200 watts while plugged in (measured with a Watts Up meter), while the same hardware drew only 80-95 watts on battery (measured with the BatteryMon application tracking the discharge rate).

That is a significant drop, and it changes the hardware's behavior considerably. You'll see much lower clocks on the GTX 980M, lower speeds on the Intel Haswell processor, and, as a result, lower frame rates when gaming.

To address this, NVIDIA has updated Battery Boost to not only attempt to extend battery life while gaming, but to improve the overall gaming experience when running on a battery. It does so by implementing a secondary set of game quality presets in GeForce Experience; in addition to a preset tuned for maximum power, you'll now have one that configures the game to play best at the performance levels you'll see on battery.

The result will be different image quality settings when you are running on AC versus when you are running on battery, which might seem like a bad idea at first, but the truth is that it is necessary to guarantee a high quality mobile gaming experience. Let's demonstrate why.

In the following sets of graphs you'll see a couple of games run on the same hardware in four different modes.

  1. On AC power, plugged in at the wall, with the Ultra/Very High quality presets we used in our previous performance testing (GeForce GTX 980M on AC, black line)
  2. The same Ultra/Very High quality settings but running on battery (GeForce GTX 980M on Battery, orange line)
  3. Running the game at the GeForce Experience determined quality settings, using the default frame rate cap of 30 FPS (GeForce GTX 980M Battery Boost, pink line)
  4. Running the game at the same GFE quality settings as above, but moving the frame rate cap up to 60 FPS (GeForce GTX 980M Battery Boost 60 FPS, green line)

It's very important to note that the quality settings of #1 and #2 above are the same, while the quality settings of #3 and #4 are the same, but the two sets are not equal. We are plotting all of this on the same graph so please keep that in mind!
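The frame rate cap used in modes #3 and #4 works by idling once a frame finishes ahead of schedule, rather than immediately rendering the next one. NVIDIA's actual limiter lives inside the driver, so the following is only a hypothetical application-level sketch of the idea, with a stand-in `render` callback:

```python
import time

def run_capped(frame_count, fps_cap, render=lambda: None):
    """Render frame_count frames, sleeping so the effective rate
    never exceeds fps_cap. A minimal sketch of what a frame rate
    cap does; Battery Boost implements this inside the driver,
    not in application code like this."""
    target = 1.0 / fps_cap               # seconds per frame at the cap
    start = time.perf_counter()
    for _ in range(frame_count):
        frame_start = time.perf_counter()
        render()                         # the game's real work would go here
        elapsed = time.perf_counter() - frame_start
        if elapsed < target:             # finished early: idle instead of drawing more
            time.sleep(target - elapsed)
    total = time.perf_counter() - start
    return frame_count / total           # effective FPS actually delivered
```

The idle time is the power saving: every millisecond the GPU spends sleeping instead of rendering frames beyond the cap is energy the battery doesn't have to supply.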

There is a lot of data here, so let's dissect it. First, look at the Observed FPS graph and compare the 980M on AC and on battery at the same image quality settings (black and orange lines). On AC the game averaged about 65 FPS on the Ultra preset, but on battery, with no image quality changes and no GFE preset enabled, the average dropped to 37 FPS: a decrease of about 43% (put another way, the AC result is roughly 75% faster). You can also see that while running at the lower frame rate, the game had some instances of very high frame time variance, making a low frame rate even less enjoyable.
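The gap between those two averages can be expressed two ways, which are easy to conflate: how much slower battery is relative to the AC baseline, and how much faster AC is relative to the battery result. Using the averages quoted above:

```python
def percent_change(ac_fps, battery_fps):
    """Express the AC-vs-battery gap both ways.
    decrease: how far battery falls below the AC baseline.
    advantage: how far AC sits above the battery result."""
    decrease = (ac_fps - battery_fps) / ac_fps * 100
    advantage = (ac_fps - battery_fps) / battery_fps * 100
    return decrease, advantage

# Averages quoted in the text: ~65 FPS on AC, 37 FPS on battery.
drop, advantage = percent_change(65, 37)
# drop is about 43 (battery is ~43% slower);
# advantage is about 76 (AC is ~76% faster)
```

The same arithmetic applies to the Crysis 3 numbers on this page.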

Now look at the pink line, representing NVIDIA's GeForce Experience defaults for gamers running BF4 on the battery. The frame rate holds at 30 FPS thanks to a frame rate cap and slightly lower image quality settings. Yes, that is lower than the average frame rate of the 980M at the Ultra preset shown with the orange line, but the experience of the game is much better with the GFE profile enabled. In my play time there were no stutters and no hitches, though an experienced gamer will likely be able to tell the game is running at 30 FPS.

Finally, let's evaluate the green line, which represents me raising the frame rate limit from 30 FPS to 60 FPS in GeForce Experience. The result is a higher frame rate, but again, more stutter and more frame time variance. Clearly the settings NVIDIA has selected for this game on battery are tuned very specifically to the GTX 980M, and hitting 30 FPS was the goal.

I also tested Crysis 3, one of the most demanding games for modern GPUs. Let's see how it compares to the experience with BF4 above.

Again, please note that the image quality settings differ between these lines, so compare them with that in mind. Looking at the black line, the optimal performance at the Very High quality settings in-game with AC connected, the GTX 980M is able to pull in an average frame rate of about 35 FPS. Simply unplugging the machine and restarting the game, without making any changes to the game settings, moves that average frame rate (the orange line) down to 19 FPS or so: a decrease of about 46% (AC is roughly 84% faster). Also notice the wide swings in frame times; the on-battery experience is quite bad.

But utilizing the NVIDIA GFE software once again and enabling the battery-specific preset for image quality and frame rate limit, we are met with a very smooth and consistent 30 FPS gaming experience in Crysis 3. Finally, I pushed the frame rate limit from 30 FPS to 60 FPS at those same settings, and though the average frame rate jumps to about 60, the frame time variance goes up considerably. When 10% of your frames show more than 4 ms of frame-to-frame variance, that's a bad result.
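A frame-time-variance figure like the one above can be approximated from a frame time log by counting how many consecutive frames differ by more than a threshold. This is a rough stand-in for the metric in our graphs, using hypothetical frame times, not the actual captured data:

```python
def share_of_variant_frames(frame_times_ms, threshold_ms=4.0):
    """Fraction of frame-to-frame transitions where the frame time
    changes by more than threshold_ms. A rough approximation of
    the frame time variance metric discussed in the text."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    if not deltas:
        return 0.0
    return sum(d > threshold_ms for d in deltas) / len(deltas)

# Hypothetical capture: a steady ~16.7 ms (60 FPS) cadence with spikes.
sample = [16.7, 16.7, 25.0, 16.7, 16.7, 16.7, 30.0, 16.7, 16.7, 16.7, 16.7]
# share_of_variant_frames(sample) returns 0.4:
# each spike produces two large deltas (into and out of it),
# so 4 of the 10 transitions exceed the 4 ms threshold
```

A steady 30 FPS trace would score near zero here, which is why the capped Battery Boost preset feels smoother despite its lower average frame rate.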
