GPU Boost 2.0

When the GTX 680 launched with the first version of GPU Boost, most people saw it as a great feature that let a GPU's clock speed climb when power headroom allowed.  It was elegant in action but very difficult to explain without seeing it demonstrated.  For that reason, we had NVIDIA's Tom Petersen come out to our offices to do a live stream and explain it.  We repeated this for the GTX 690 and a few other launches as well.  (Note: we are doing this same live event on the launch day of the GTX TITAN – Thursday, February 21st – so be sure to join us!)

With TITAN, NVIDIA will be releasing an updated version of GPU Boost that it claims will allow for even higher clocks on the GPU than the original technology would have permitted.  This time the key measurement isn't power but temperature.

GPU Boost 1.0

In the first version of GPU Boost there was (and still is) a maximum clock set in the firmware of the card that no amount of tweaking will let you breach.  This number isn't advertised or easily found on a spec sheet; it is the maximum level NVIDIA will let the chip run at to avoid possible damage to the silicon from the combination of high voltage and high temperature.

GPU Boost 2.0

This updated version of GPU Boost can increase the maximum clock rate because the voltage level is now governed by a known, easily measured data point: temperature.  By preventing the combination of high voltage and high temperature that might break a chip, NVIDIA can raise the voltage on a chip-by-chip basis to improve the overall performance of the card in most cases.
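
NVIDIA hasn't published the actual control algorithm, but conceptually it behaves something like the sketch below: step the clock (and voltage) up while the die is below the temperature target, and step it back down once it reaches it.  All of the numbers and the control logic here are illustrative guesses on my part, not NVIDIA's implementation.

```python
# Rough sketch of the idea behind a temperature-targeted boost algorithm.
# The bin size, ceiling, and control logic are illustrative guesses, not
# NVIDIA's actual (unpublished) implementation.

BASE_CLOCK_MHZ = 837      # TITAN's advertised base clock
MAX_BOOST_MHZ = 992       # hypothetical firmware ceiling
STEP_MHZ = 13             # hypothetical boost bin size
TARGET_TEMP_C = 80        # default GPU Boost 2.0 temperature target

def next_boost_clock(current_clock_mhz: float, gpu_temp_c: float) -> float:
    """Pick the next clock bin based on where we sit relative to the temp target."""
    if gpu_temp_c < TARGET_TEMP_C:
        # Thermal headroom available: step the clock (and voltage) up,
        # but never past the firmware-defined maximum.
        return min(current_clock_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    # At or above the target: step back down toward the base clock.
    return max(current_clock_mhz - STEP_MHZ, BASE_CLOCK_MHZ)
```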

Even better: NVIDIA is once again going to allow the voltage to be pushed beyond the base level, up to another unpublished maximum.  This marks the return of overvolting (at least on TITAN), but when asked, NVIDIA said the feature would not be re-enabled on the GTX 600-series.

For those that are curious, NVIDIA gave us some more information on how overvolting will work.  It must be disabled by default, and the user has to acknowledge a warning message before enabling it in software.  Once enabled it can persist across reboots of the PC, and the maximum voltage of the GPU can then be adjusted by the end user.  The feature is completely optional, though, and it will be up to the card vendors to decide whether to enable it on their products.  I would think that finding a GTX TITAN without support for overvolting will be nearly impossible, but as future cards ship with GPU Boost 2.0 technology it is something we'll keep an eye on.
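
To make those rules concrete, here is a hypothetical sketch of the gating NVIDIA described: overvolting is off by default, the board vendor has to allow it, the user has to acknowledge a warning, and persistence across reboots is optional.  The names and structure here are mine, not any real NVIDIA or vendor API.

```python
# Hypothetical model of the overvolting rules NVIDIA described: off by
# default, gated by the board vendor, and only usable after the user
# acknowledges a warning.  Not a real API.
from dataclasses import dataclass

@dataclass
class OvervoltSettings:
    vendor_allows_overvolt: bool = False    # board partner must opt in
    user_acknowledged_warning: bool = False  # warning dialog in software
    persist_across_reboots: bool = False     # optional, per NVIDIA
    voltage_offset_mv: int = 0               # capped at an unpublished maximum

def request_overvolt(settings: OvervoltSettings, offset_mv: int, max_offset_mv: int) -> bool:
    """Apply a voltage offset only if every gate described above is satisfied."""
    if not settings.vendor_allows_overvolt:
        return False
    if not settings.user_acknowledged_warning:
        return False
    settings.voltage_offset_mv = min(offset_mv, max_offset_mv)
    return True
```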

With these changes NVIDIA has shifted the bell curve of the frequency distribution toward higher clocks.  By overvolting the chip you can push the maximum clock even a bit further to the right.

The shift in focus to GPU temperature with version 2.0 of this feature means the GPU is far more likely to sit right at its target temperature than it was before.  The above graph shows GPU Boost 2.0 temperatures in yellow and GPU Boost 1.0 temperatures in grey.  You can clearly see that the curve has been tightened around the 80C mark (the default setting) and that the GPU spends much less time both BELOW and ABOVE 80C.

Just like you could adjust the power target to overclock your GTX 600-series card, you can now adjust the temperature target on the GTX TITAN to allow it to increase clocks and voltages.

What does a 10C bump in your target temperature mean to your clock speeds?  That. 

NVIDIA is very conscious of TITAN's noise levels and wants to make sure they don't get out of hand like they can on some HD 7970 cards.  TITAN ran nearly silently in our testing until it approached the 80C mark, at which point the fan speed began to ramp up.

Adjusting the temperature offset doesn't automatically change the fan schedules but…

…you can manually offset the fan curve as well, to match any other setting changes.
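
If you're wondering what "offsetting" a fan curve actually means in practice, the sketch below shifts the same temperature-to-fan-speed mapping so the ramp starts earlier or later.  The curve points are made up for illustration and don't come from TITAN's actual fan table.

```python
# Minimal sketch of a fan curve offset: the same temperature-to-fan-speed
# mapping, shifted so the ramp begins earlier or later.  Curve points are
# invented for illustration.
FAN_CURVE = [(30, 30), (60, 40), (80, 55), (90, 85)]  # (temp in C, fan %)

def fan_speed(temp_c: float, temp_offset_c: float = 0.0) -> float:
    """Linearly interpolate the fan curve, shifted right by temp_offset_c."""
    shifted = temp_c - temp_offset_c
    if shifted <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if shifted <= t1:
            return f0 + (f1 - f0) * (shifted - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]
```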

This new version of GPU Boost definitely seems more in line with the original goals of the technology, but there are some interesting caveats.  First, you'll quickly find that TITAN's clock speeds start out higher on a "cold" GPU and then ramp down as the temperature of the die increases.  This means that quick performance checks using 3DMark, or even quick game launches, will produce performance numbers higher than they would be after 5-10 minutes of gaming.  As a result, our testing of TITAN required us to "warm up" the GPU for a few minutes before every benchmark run.
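
For anyone building their own test process, the idea can be expressed with NVML's Python bindings (pynvml): start the workload, watch temperature and clocks until they settle, and only then record results.  The workload launcher and the five-minute figure below are placeholders based on our process, not a prescribed method.

```python
# Sketch of a GPU warm-up step before benchmarking, using the pynvml
# bindings to NVML.  The workload functions are placeholders, and the
# 300-second warm-up mirrors our "few minutes" of gaming, not a rule.
import time
import pynvml

def wait_for_steady_state(warmup_seconds: int = 300, poll_seconds: int = 5) -> None:
    """Poll GPU temperature and graphics clock while the workload heats the card."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    try:
        end = time.time() + warmup_seconds
        while time.time() < end:
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            print(f"temp={temp}C clock={clock}MHz")
            time.sleep(poll_seconds)
    finally:
        pynvml.nvmlShutdown()

# Usage outline:
# launch_workload()            # placeholder: start the game or benchmark loop
# wait_for_steady_state()      # warm up until clocks settle at steady state
# record_benchmark_results()   # placeholder: capture the numbers you report
```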

Also, defaulting the GPU to target 80C seems more than a bit conservative for our tastes, and buyers of a $999 graphics card are likely more interested in performance than in slightly less noise from the cooler's fan.  It seems like NVIDIA is straddling the line between wanting a higher performing GPU and being able to say it has the quietest GPU by X percent.  If NVIDIA were willing to ship a slightly louder configuration out of the box with GK110, how much higher would the average clocks be?

As I spend more time with this hardware and the new GPU Boost 2.0, I am sure we'll find out more.
