Detailed Power Consumption Testing

When we started dissecting the power consumption concerns around the Radeon RX 480 (since mostly addressed by AMD with a driver fix and a new control panel option), I knew it meant a much more strenuous power testing process going forward.

How do we do it? It is simple in theory but surprisingly difficult in practice: we intercept the power being sent through the PCI Express bus as well as the ATX power connectors before it reaches the graphics card, and we measure power draw directly with a 10 kHz DAQ (data acquisition) device. A huge thanks goes to Allyn for getting the setup up and running. We built a PCI Express bridge that is tapped to measure both 12V and 3.3V power, and we modified some Corsair power cables to measure the 12V coming through them as well.
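To make the math behind those measurements concrete, here is a minimal sketch of how per-rail voltage and current samples become the power figures shown below. The CSV file, rail names, and column names are hypothetical illustrations, not our actual tooling:

```python
import csv

# Hypothetical log format: one row per DAQ sample, with voltage and current
# columns for each tapped rail. All names here are illustrative.
RAILS = ["slot_12v", "slot_3v3", "pcie_plug_12v"]

def rail_power(row, rail):
    """Instantaneous power on one rail: P = V * I."""
    return float(row[rail + "_volts"]) * float(row[rail + "_amps"])

with open("gpu_power_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        per_rail = {r: rail_power(row, r) for r in RAILS}
        total = sum(per_rail.values())  # the "total" line in the graphs below
        print(f"t={row['time_s']}s total={total:.1f}W " +
              " ".join(f"{r}={w:.1f}W" for r, w in per_rail.items()))
```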

The result is data that looks like this.

What you are looking at here is the power measured from the GTX 1080. From time 0 to roughly 8 seconds the system is idle; from 8 seconds to about 18 seconds Steam is starting up the title; from 18-26 seconds the game sits at the menus; we load the level from 26-39 seconds; and we play through our benchmark run after that.
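Putting a number on each of those phases is just a matter of slicing the trace by timestamp and averaging. A quick sketch, assuming `samples` is a list of `(time_s, total_watts)` pairs built from a log like the one above:

```python
# Phase boundaries (in seconds) called out above for this particular run.
PHASES = [("idle", 0, 8), ("Steam launch", 8, 18),
          ("menus", 18, 26), ("level load", 26, 39)]

def phase_averages(samples, phases=PHASES):
    """Print the average total power inside each labeled time window."""
    for name, start, end in phases:
        watts = [w for t, w in samples if start <= t < end]
        if watts:
            print(f"{name:12s} {sum(watts) / len(watts):6.1f} W average")
```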

There are four lines drawn in the graph: the 12V and 3.3V results come from the PCI Express bus interface, while the line labeled PCIE comes from the PCIE power connection running from the power supply to the card. We can measure two power inputs there, but because the GTX 1080 uses only a single 8-pin connector, just one is shown here. Finally, the blue line, labeled total, is simply that: the sum of the other measurements, giving the combined power draw of the graphics card in question.

From this we can see a couple of interesting data points. First, the idle power of the GTX 1080 Founders Edition is only about 7.5 watts. Second, under a gaming load of Rise of the Tomb Raider, the card pulls about 165-170 watts on average, though there are plenty of intermittent spikes. Keep in mind we are sampling the power at 1000 samples per second, so this kind of behavior is more or less expected.
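At that sample rate, separating the sustained average from momentary transients is straightforward. A sketch with an arbitrary example threshold (the 200 W cutoff here is illustrative, not a spec limit):

```python
def summarize_load(watts, spike_threshold=200.0):
    """Report the sustained average and count brief excursions above a
    chosen threshold (200 W is an arbitrary example, not a spec value)."""
    avg = sum(watts) / len(watts)
    spikes = sum(1 for w in watts if w > spike_threshold)
    print(f"average: {avg:.1f} W, samples above {spike_threshold:.0f} W: "
          f"{spikes} ({spikes / len(watts):.2%} of the run)")
```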

Different games and applications impose different loads on the GPU and can cause it to draw drastically different amounts of power. Even if a game runs slowly, it may not be drawing maximum power from the card if one subsystem of the GPU (memory, shaders, ROPs) is bottlenecking the others.

First, let’s look at our total power draw numbers.

(Graph: total power draw, Rise of the Tomb Raider at 2560×1440)

In Rise of the Tomb Raider at 2560×1440, the GeForce GTX 1080 Ti runs closer to its 250 watt TDP rating than the Titan X or even the GTX 980 Ti before it. The variance takes it above and below that line consistently and repeatedly, with the likely culprits being higher clock speeds and faster G5X memory (11 Gbps vs 10 Gbps). The ~35% performance advantage of the GTX 1080 Ti over the GTX 1080 comes at the cost of 38% additional power draw.
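Those two percentages make the efficiency math easy to check; a quick back-of-the-envelope:

```python
# Relative performance per watt, GTX 1080 Ti vs. GTX 1080, using the
# ~35% performance and ~38% power figures above.
perf_ratio = 1.35
power_ratio = 1.38
print(f"relative perf/watt: {perf_ratio / power_ratio:.3f}")  # ~0.978
# Efficiency is essentially flat: about 2% worse than the GTX 1080.
```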

(Graph: total power draw, The Witcher 3)

The result is very similar in The Witcher 3, with the GTX 1080 Ti drawing more power than the Titan X before it despite an identical TDP rating.

As for the potential for overdraw from any single source of power, how does the power distribution break down between the motherboard slot and the 8-pin/6-pin power connections with the GTX 1080 Ti?

(Graph: per-rail power draw, Metro: Last Light at 4K)

One of the worst-case scenarios for power draw we saw with the Radeon RX 480 was in Metro: Last Light running at 4K. With the GeForce GTX 1080 Ti we see a couple of interesting things: first, total power spikes over 250 watts quite a bit in our testing but seems centered just below that point. The 8-pin power connection stays under 150 watts, and the 6-pin connection pulls just over 70 watts (but consistently under 80 watts). From the motherboard slot, the card never goes over 55 watts.


For those of you who dive into overclocking, how does it change power draw? I ran the same Metro: Last Light test at 4K with the GTX 1080 Ti running at a +150 MHz offset.

(Graph: per-rail power draw, Metro: Last Light at 4K, +150 MHz offset)

Power draw goes from an average of 250 watts up to 290-300 watts, an increase of 20% or so. The added juice appears to come from the 6-pin and motherboard PCI Express connections, as those increase to ~85 watts and ~60 watts respectively. The 85 watt draw from the 6-pin connection is slightly over spec, but the 60 watts from the motherboard PCIe connection stays under the rated 66 watts, which is where the most danger would lie.
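If you want to sanity-check rail loading against the PCI Express limits yourself, the comparison is simple. A sketch using the commonly cited nominal values (75 W per 6-pin connector, 150 W per 8-pin connector, 66 W on the slot's 12V rail), fed with the slot and 6-pin figures from the overclocked run above:

```python
# Nominal PCI Express power limits in watts. 66 W is the 12V portion of
# the 75 W slot budget; the connector limits are the usual spec values.
LIMITS = {"slot_12v": 66.0, "6-pin": 75.0, "8-pin": 150.0}

def check_rails(measured):
    """Flag any rail whose measured draw exceeds its nominal limit."""
    for rail, watts in measured.items():
        status = "OVER SPEC" if watts > LIMITS[rail] else "ok"
        print(f"{rail:8s} {watts:6.1f} W / {LIMITS[rail]:5.1f} W  {status}")

# Approximate overclocked figures from the Metro: Last Light run above.
check_rails({"slot_12v": 60.0, "6-pin": 85.0})
```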
