GPU Boost 3.0, Overclocking, HDR and Display Support
Overclocking on Pascal and the GTX 1080 gets a nice upgrade thanks to GPU Boost 3.0. Out of the box, GPU Boost 3.0 performs nearly identically to previous GPU Boost technologies, increasing the Boost clock until the GPU hits a temperature or power limit. The base clock on the GTX 1080 is 1607 MHz with a rated Boost clock of 1733 MHz, and in my testing we saw actual clocks range from roughly 1630 MHz to 1865 MHz.
But GPU Boost 3.0 does offer a new feature for more granular overclocking. Both users and AIB partners can now set frequency offsets for individual voltage points rather than a single, global offset applied across every voltage point a GPU might pass through. The result is more efficient use of the theoretical maximum clock speed of the silicon.
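To make the difference concrete, here is a minimal sketch in Python, using a made-up voltage/frequency table rather than GP104's actual one, contrasting a single global offset with per-voltage-point offsets:

```python
# Hypothetical V/F curve: voltage (V) -> stock boost frequency (MHz).
# Values are illustrative only, not GP104's real table.
vf_curve = {0.80: 1500, 0.90: 1600, 1.00: 1700, 1.05: 1800}

# GPU Boost 2.0 style: one global offset shifts every point equally.
global_offset = 100  # MHz
boost2 = {v: f + global_offset for v, f in vf_curve.items()}

# GPU Boost 3.0 style: each voltage point gets its own offset, so points
# with extra headroom can be pushed further than the weakest point allows.
per_point = {0.80: 150, 0.90: 120, 1.00: 90, 1.05: 60}
boost3 = {v: f + per_point[v] for v, f in vf_curve.items()}

print(boost2)  # every point raised by the same 100 MHz
print(boost3)  # each point raised by its own headroom
```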
To take advantage of that capability, tools like Precision X are going to be updated to allow users to manually set those offsets by voltage point AND will include a tool that attempts to auto-overclock your graphics card at each voltage point based on parameters you set! Three modes will be available in EVGA's Precision XOC (a rough sketch of all three follows the list below).
- Basic
- This will overclock the GTX 1080 like GPU Boost 2.0 did – applying a set offset to all voltage points.
- Linear
- This will allow you to set a linear increase in the offset from the lowest to the highest voltage point. You can adjust the grade of this slope, so you can have higher offsets at lower voltages that taper off as the voltage increases, for example.
- Manual
- As you might guess, this allows users to manually input the offset for each voltage point / column.
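As an illustration, the three modes can be modeled as three different ways of generating an offset for each voltage point. Everything below is a hypothetical sketch, not Precision XOC's actual internals:

```python
def basic_offsets(voltage_points, offset_mhz):
    """Basic mode: the same offset at every voltage point, like GPU Boost 2.0."""
    return {v: offset_mhz for v in voltage_points}

def linear_offsets(voltage_points, low_mhz, high_mhz):
    """Linear mode: the offset ramps from low_mhz at the lowest voltage
    point to high_mhz at the highest, with an adjustable slope."""
    lo, hi = min(voltage_points), max(voltage_points)
    return {v: low_mhz + (high_mhz - low_mhz) * (v - lo) / (hi - lo)
            for v in voltage_points}

def manual_offsets(user_table):
    """Manual mode: the user supplies an explicit offset per voltage point."""
    return dict(user_table)

points = [0.80, 0.90, 1.00, 1.05]
print(basic_offsets(points, 100))
print(linear_offsets(points, 150, 50))  # higher offsets at lower voltages
```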
The OC Scanner is the tool I am most excited about. If you have seen or used software-based overclocking tools like the ones ASUS includes in its AI Suite, then the process for the GTX 1080 will seem familiar. Precision XOC will incrementally test clock offsets at each voltage point until it detects instability or corruption in the image, at which point it will set the highest previously stable point as the offset for that specific voltage. It repeats the process for each voltage point on the curve, creating a custom overclocking map for your graphics card.
Options for running the OC Scanner include how long it should test each V/F point, what offsets (in MHz) the scanner should start and end at, and how granular the voltage steps should be. The more granular the voltage steps, the more accurate your curve will be and the more stable the overclock, but the scan will definitely take more time.
NVIDIA does want to warn users that this overclocking/scanning process may pause during a timeout or driver crash. However, the OC Scanner is smart enough to wait for the driver to recover and then resume the overclocking process.
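In outline, the scanner's per-point search might look something like the sketch below. The stability test and driver calls are stand-in stubs, not Precision XOC's real API:

```python
import random

def apply_offset(voltage, offset_mhz):
    """Stand-in for the driver call that programs one V/F point."""
    pass

def run_stability_test(seconds):
    """Stand-in for the render test: returns True when no artifacts or
    driver timeout were detected. Randomized here purely for illustration."""
    return random.random() > 0.2

def scan_point(voltage, start_mhz, end_mhz, step_mhz, test_seconds):
    """Step the offset up until instability, keeping the last good value.
    The real tool waits for the driver to recover after a crash before
    moving on to the next voltage point."""
    best = 0
    for offset in range(start_mhz, end_mhz + 1, step_mhz):
        apply_offset(voltage, offset)
        if run_stability_test(test_seconds):
            best = offset   # highest previously stable offset so far
        else:
            break           # back off; this point is done
    return best

# One scan per voltage point yields the custom overclocking map.
offsets = {v: scan_point(v, 0, 300, 25, 30) for v in (0.80, 0.90, 1.00, 1.05)}
print(offsets)
```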
Early Overclocking Results
For our overclocking testing, the process was very similar to previous GeForce GTX overclocking sessions using GPU Boost. I used the new version of Precision X, now called Precision XOC, in the basic mode to raise power targets and set a clock speed offset.
First I pulled the power target up to its maximum of 120%, giving us, in theory, a peak power draw of 216 watts (120% of the card's 180 watt TDP). The temperature target reads 92%, but I am assuming that's a typo and it's meant to be 92C. Even with that raised, our card never went over 85C during overclocking.
It quickly became obvious to me that the GP104 GPU will scale if given the ability. Without so much as a hiccup, our Founders Edition card was able to cross the 2.0 GHz mark with a +200 MHz offset!
In our stock testing, GPU clocks in Unigine Heaven hovered around 1700 MHz. Combining the 200 MHz offset with the power target increase resulted in a usable clock rate increase of more than 300 MHz! Seeing a GPU run at higher than 2000 MHz on air, without much effort, is still a stunning sight.
Of course I wanted to try out the other overclocking modes listed above, linear and the auto scanning.
As described above, by clicking on the left hand and right hand sides of the graph, you can adjust the slope of the offset per voltage point. In this example, I created a nearly flat offset that raises clock speeds more at the beginning of the voltage curve and minimizes the gain as you use more power. I'm not telling you this is the ideal curve, but it is interesting that GPU Boost 3.0 gives us this kind of capability.
And if you want the ultimate in granularity, manual mode will give you that, offering an easy click point at each voltage setting to enable different offsets. I randomly picked some as you can see here, including lowering the clock at a couple of voltage points just off center (lower blue bars) to see if that was possible. This mode is really meant for the most hardcore GPU enthusiasts, as it will require a lot of guesswork and testing time.
A solution to that is supposed to be EVGA's Precision XOC OC Scanner mode.
The goal here is to automate the manual option, having the software increase clock speeds and run stability tests in the background. Once it notices instability or artifacts, it picks the last known good setting and moves on to the next voltage point. Unfortunately, in the time I had with it, the process was incredibly unstable, crashing the driver and not auto-resuming as expected. I attempted the process a half dozen times or so, including between reboots, but never had more than one or two voltage points complete before the system refused to keep going.
Hopefully NVIDIA and EVGA work on this some more to bring it to a point where we can actually utilize it – it might be the coolest addition to GPU overclocking in years!
HDR and Display Technologies
The next big shift in display technology is known as HDR, high dynamic range, and if you haven’t seen it in person, then the advantages in image quality it offers are hard to describe. The BT.2020 color space covers 75% of the visible color spectrum while the current sRGB standard most monitors are judged by only covers 33%. Contrast ratios and brightness will also see a big increase with HDR displays, with many LCD options going over 1000 nits!
Maxwell GPUs already support quite a few HDR display capabilities, including 12-bit color, the BT.2020 wide color gamut, and the SMPTE 2084 (Dolby PQ) transfer function used to map HDR video to the BT.2020 color space. HDMI 2.0b support, including 10/12-bit output for 4K displays, is also present in the Maxwell-based GTX 980.
Pascal extends that list with 4K 60 Hz 10-bit and 12-bit HEVC decode acceleration, 4K 60 Hz 10-bit HEVC encoding, and DisplayPort 1.4-ready HDR metadata transport.
NVIDIA is enabling HDR through GameStream as well, meaning you can stream your PC games in HDR through your local network to an NVIDIA SHIELD device connected to your HDR display in another room.
Getting HDR in games isn’t going to be a terribly difficult task, but it isn’t a simple switch either. NVIDIA is working with developers to help them bring HDR options to current and upcoming games including The Witness, Lawbreakers, Rise of the Tomb Raider, Paragon, The Talos Principle and Shadow Warrior 2.
Just like the GeForce GTX 980, the new GTX 1080 can support up to four simultaneous display outputs with a maximum of six output connections per card. Maximum resolution support jumps from 4K to 7K through a pair of DisplayPort 1.3 connections. GP104 is DisplayPort 1.2 certified and DP 1.3/1.4 ready for when the standard is finally ratified. It's a bit risky to claim support for DP 1.3/1.4 without knowing 100% how the standard will turn out, but NVIDIA is confident it has the hardware in place for any last minute changes.
And for those of you keeping track at home of the supported video features of the GTX 1080, a table is supplied above.