Detailed Power Consumption Testing
When we started dissecting the power consumption concerns around the Radeon RX 480 (since mostly addressed by AMD with a driver fix and a new control panel option), I knew it meant a much more strenuous power testing process going forward.
How do we do it? It is simple in theory but surprisingly difficult in practice: we intercept the power being sent through the PCI Express bus as well as the ATX power connectors before it reaches the graphics card, and we measure power draw directly with a 10 kHz DAQ (data acquisition) device. A huge thanks goes to Allyn for getting the setup up and running. We built a PCI Express bridge that is tapped to measure both 12V and 3.3V power, and built custom Corsair power cables that measure the 12V coming through those connectors as well.
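The arithmetic behind the captures is straightforward: instantaneous power on each tapped rail is just voltage times current, and the rails are summed per sample. Here is a minimal sketch of that math (not our actual capture software; the rail names and data layout are assumptions for illustration):

```python
# Minimal sketch: per-rail and total power from DAQ samples.
# Assumes each tapped rail yields paired voltage (V) and current (A)
# readings captured at a fixed sample rate. Rail names are hypothetical.

def rail_power(voltages, currents):
    """Instantaneous power per sample: P = V * I, in watts."""
    return [v * i for v, i in zip(voltages, currents)]

def total_power(rails):
    """rails: dict mapping rail name -> (voltages, currents) lists.
    Returns per-rail power traces plus the summed 'total' trace."""
    per_rail = {name: rail_power(v, i) for name, (v, i) in rails.items()}
    traces = list(per_rail.values())
    total = [sum(sample) for sample in zip(*traces)]
    return per_rail, total
```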
The result is data that looks like this.
What you are looking at here is the power measured from the GTX 1080. From 0 to roughly 8 seconds the system is idle; from 8 to about 18 seconds Steam is starting up the title; from 18 to 26 seconds the game sits at its menus; from 26 to 39 seconds we load the game; and after that we play through our benchmark run.
There are four lines drawn in the graph. The 12V and 3.3V results come from the PCI Express bus interface, while the line labeled PCIE comes from the PCIe power connection running from the power supply to the card. We have the ability to measure two power inputs there, but because the GTX 1080 uses only a single 8-pin connector, just one is shown here. Finally, the blue line, labeled total, is exactly that: the sum of the other measurements, giving the combined power draw of the graphics card in question.
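For anyone who wants to reproduce a chart like this from their own logged data, a short matplotlib sketch along these lines draws the same four traces; the function and data layout here are hypothetical:

```python
import matplotlib.pyplot as plt

def plot_power(time_s, rails, total):
    """rails: dict of label -> per-sample watts; draws the four traces
    described above (12V, 3.3V, PCIE, and the summed total)."""
    for label, watts in rails.items():
        plt.plot(time_s, watts, label=label)
    plt.plot(time_s, total, label="total")
    plt.xlabel("Time (s)")
    plt.ylabel("Power (W)")
    plt.legend()
    plt.show()
```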
From this we can see a couple of interesting data points. First, the idle power of the GTX 1080 Founders Edition is only about 7.5 watts. Second, under a gaming load of Rise of the Tomb Raider, the card pulls about 165-170 watts on average, though there are plenty of intermittent spikes. Keep in mind we are sampling the power 1,000 times per second, so this kind of behavior is more or less expected.
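Pulling summary numbers like these out of a logged trace takes nothing more than a couple of windowed averages. The sketch below is a minimal, hypothetical example assuming a list of total-power samples at 1,000 samples per second, with window boundaries mirroring the timeline described above (the exact boundaries are illustrative):

```python
def summarize(total_watts, rate_hz=1000, idle_end_s=8.0, load_start_s=39.0):
    """Average idle power, average load power, and peak spike for a trace.
    Window times follow the run described above and are illustrative."""
    idle = total_watts[: int(idle_end_s * rate_hz)]
    load = total_watts[int(load_start_s * rate_hz):]
    return (sum(idle) / len(idle),   # idle average (~7.5 W here)
            sum(load) / len(load),   # gaming average (~165-170 W here)
            max(load))               # highest intermittent spike
```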
Different games and applications impose different loads on the GPU and can cause it to draw drastically different amounts of power. Even if a game runs slowly, it may not be drawing maximum power from the card if one subsystem on the GPU (memory, shaders, ROPs) is bottlenecking the others.
First, let’s look at our total power draw numbers.
With both cards rated at a 250 watt TDP, it makes sense that the GeForce GTX 980 Ti and the new Pascal-based Titan X would use essentially the same power in our game testing. In this run of Rise of the Tomb Raider at 2560×1440, the power draw shows some interesting "waves" that do not exist on the GTX 980 Ti or even the GTX 1080. Total power draw doesn't quite reach 250 watts.
In The Witcher 3, the Titan X uses as much power as the GTX 980 Ti in the early portion of our test run, and as much as the AMD Fury X in the latter half. The waves still exist, but they don't seem to indicate any specific power issues. There are a couple of brief instances where the power draw is just over 250 watts.
How does the power distribution break down between the motherboard slot and the 8-pin/6-pin power connections on the new Titan X?
One of the worst case scenarios for power draw we saw with the Radeon RX 480 was in Metro: Last Light running at 4K. With the new Titan X we see a couple of interesting things: first, power spikes over 250 watts exactly once in our testing. The 8-pin power connection stays under 150 watts and the 6-pin connection under 70 watts, while the Titan X never pulls more than 50 watts from the motherboard slot.
This graph is busy for sure, but it shows voltage and current draw for all three power sources on the Titan X. The important data point is that the current through the motherboard slot stays below 4.5A.
For those of you who dive into overclocking, how does that change power draw? I ran the same Metro: Last Light test at 4K with the Titan X running at a +150 MHz offset.
Power draw increases as we expected with our GPU power target raised by 20%. The new settings push the card over 320 watts a few times, but it seems to stabilize around 300 watts or so. Power through the motherboard slot exceeds 66 watts, with current draw reaching as much as 6.0A (roughly 72 watts at 12V). That is definitely over the PCI Express spec, but since we are in an overclocked state, it isn't alarming. Still, it's good to know where things stand with this new flagship graphics card.
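For context, the PCI Express CEM specification (as widely cited during the RX 480 story) allows a x16 graphics slot roughly 5.5A on the 12V rail, or about 66 watts. A quick, hypothetical check like the one below is all it takes to flag out-of-spec samples in a logged slot-current trace:

```python
SLOT_12V_LIMIT_A = 5.5  # PCIe CEM x16 graphics slot, 12V rail (~66 W)

def over_spec(slot_currents_amps, limit=SLOT_12V_LIMIT_A):
    """Count and fraction of samples exceeding the slot current limit."""
    count = sum(1 for amps in slot_currents_amps if amps > limit)
    return count, count / len(slot_currents_amps)
```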
Ryan, too bad there is no benchmark of other games such as Rainbow Six Siege, Overwatch, etc.
I think many of these games are crucial to people who are seriously considering buying this GPU.
I know you can’t benchmark every game, but you should at least go for the most popular ones.
Ryan, first off – THANK YOU! You and your staff did an outstanding job putting together this review. I can easily see there was a lot of work done here. I also understand that not every review is perfect, so I may be a bit more forgiving with regards to any mistakes made – though I really didn’t see any and you approached this review with a ‘just the facts’ mentality. Some things I would like to make note of based upon the information and data provided within your Titan X review, other reviews I have read thus far, and my current PC hardware configuration (two GTX980Ti cards in SLI):
1) Titan X performance is near that of two GTX980Ti cards in SLI, let alone two GTX980 cards, which, by the way, can’t even achieve properly playable frame rates at 4K resolutions, due largely to their limited VRAM (only 4GB per card).
2) Almost 50% of games today do not scale well with 2-way SLI, and 3-way and higher is even worse. This alone is a valid argument for those seeking the best performance without all the technical issues that SLI induces to buy a Titan X. Using a single card means no micro stutter, no frame rendering lag, and no need for an SLI HB bridge, not to mention the fact that double the performance is not achieved in 99% of games currently on the market.
3) A single GTX1080 cannot play the vast majority of games at 4K resolutions without having to turn down some settings, and buying two GTX1080 cards to do so will cost you as much as a single Titan X AND you will still have the issues induced by SLI, even more so with DX12 games.
Based upon these observations, one would conclude that if you are an avid enthusiast PC gamer and play games at higher resolutions, the Titan X is the best buy for the level of performance returned and the fewest technical issues and limitations associated with running two or more cards in an SLI configuration. One could also argue that if there were a need to ‘grow’ in performance capability, then worst case you could always add a second Titan X card 😉
FP16 performance in GP102 is just 1.5% of FP32 performance.
This year GPUs will be up to twice as efficient!