Evaluating the ASUS GTX 960 Strix
One of the first responses from the AMD community when we posted our story about the power draw concerns on the new Radeon RX 480 was to point out that apparently the ASUS GeForce GTX 960 Strix card had a similar issue. Keep in mind that the GTX 960 reference specifications put the TDP of this product at 120 watts, 30 watts lower than the RX 480, though it still utilizes a single 6-pin power connection in addition to the motherboard supplied power.
Seriously…
I have an ASUS GTX 960 Strix so I wanted to see if the claims made were accurate. I fired up our advanced power testing hardware this morning and ran through the worst-case tests we had come up with over the last 48 hours, including running Metro: Last Light at 4K in both stock and overclocked settings.
Metro: Last Light (4K) power draw, GTX 960 Strix
At stock clock speeds under Metro: Last Light at 4K, the total power draw on the GTX 960 Strix card never exceeds 110 watts, motherboard supplied power never exceeds 30 watts and the 6-pin PCI Express power cable never exceeds 80 watts. This is interesting – even though the 6-pin cable is technically rated at just 75 watts, ASUS decided that rather than draw more power over the motherboard interface it would prefer to depend on the over-built power delivery of the power supply itself.
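The rail numbers above can be sanity-checked with a little arithmetic: the slot's +12V pins are limited to 66 watts and the 6-pin cable is rated at 75 watts. Here is a minimal sketch of that budget check in Python; the function name and the exact readings plugged in are my own, taken from the stock-clock figures in this paragraph.

```python
# Check measured rail draws against their PCI Express spec limits.
# Limits per the PCIe spec discussed in the article; readings are the
# stock GTX 960 Strix worst-case numbers from the Metro: Last Light run.
SLOT_12V_LIMIT_W = 66.0   # motherboard slot +12V limit (5.5 A * 12 V)
SIX_PIN_LIMIT_W = 75.0    # 6-pin auxiliary cable rating

def rail_report(slot_w, six_pin_w):
    """Return (total watts, slot over limit?, cable over limit?)."""
    return (slot_w + six_pin_w,
            slot_w > SLOT_12V_LIMIT_W,
            six_pin_w > SIX_PIN_LIMIT_W)

total, slot_over, cable_over = rail_report(slot_w=30.0, six_pin_w=80.0)
print(total, slot_over, cable_over)  # 110.0 False True
```

The output mirrors the article's observation: the slot stays comfortably inside its limit while ASUS leans on the over-built 6-pin cable instead.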
Metro: Last Light (4K) power draw, GTX 960 Strix OC
I decided to crank things up slightly by adding the maximum voltage increase to the card through the Precision X software and then setting the GPU clock offset to +93 MHz. The resulting total power draw increases to over 115 watts, the motherboard power rises to nearly 35 watts, and the 6-pin power cable peaks at almost 85 watts.
Metro: Last Light (4K) power draw, GTX 960 Strix OC
Zooming in on the power data shows us a very stable, low variance power delivery system on the GTX 960 Strix card. Again, it is worth noting that the 6-pin power connection is definitely drawing more than the 75 watts that it is technically rated at, but that the motherboard +12V power delivery is still well within the defined 66 watt limit we discussed earlier.
Rise of the Tomb Raider (1080p) power draw, GTX 960 Strix OC
Just as another data point, here is some captured power delivery with Rise of the Tomb Raider with the GTX 960 Strix card in its overclocked state. Total power draw doesn't exceed 105 watts and the power through that PCI Express slot on the motherboard stays at 30 watts or less.
Metro: Last Light (4K) voltage and current, GTX 960 Strix
With our ability to measure voltage and current, we return back to Metro: Last Light at 4K to find that amperage draw over the motherboard's +12V line stays right at 2.5A. That is a drastic difference compared to the RX 480 hitting more than 7A over the same line, especially considering the 5.5A limit from the PCI Express specification.
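The current figures map directly to watts at the +12V rail. A quick sketch of that arithmetic (the helper name is my own) shows why 2.5A is comfortable while 7A is well past the spec:

```python
# Convert slot +12V current readings into watts.
# The PCIe spec allows 5.5 A on the slot's +12 V pins, i.e. 66 W.
SLOT_LIMIT_A = 5.5
RAIL_V = 12.0

def slot_watts(amps):
    """Power in watts drawn over the slot's +12 V pins."""
    return amps * RAIL_V

print(slot_watts(SLOT_LIMIT_A))  # 66.0 W - the spec limit
print(slot_watts(2.5))           # 30.0 W - GTX 960 Strix reading
print(slot_watts(7.0))           # 84.0 W - RX 480 reading from the earlier story
```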
So there you have it – while I cannot say for certain that no previous graphics card in recent memory has behaved in the same fashion as the new AMD Radeon RX 480, I can categorically discount the notion that the ASUS GTX 960 Strix is somehow equivalent in its power delivery. I know that many of you still look at the spike wattage output numbers provided by the Tom's Hardware testing methods, but I encourage you to re-read what I posted on the first page of this story:
One interesting note on our data compared to what Tom’s Hardware presents – we are using a second order low pass filter to smooth out the data, making it more readable and more indicative of how power draw is handled by the components on the PCB. Tom’s story reported “maximum” power draw at 300 watts for the RX 480, and while that is technically accurate, those figures represent instantaneous power draw. That is interesting data in some circumstances, and may actually indicate other potential issues with excessively noisy power circuitry, but to us it makes more sense to sample data at a high rate (10 kHz), then filter it and present it in a more readable way that better meshes with the continuous power delivery capabilities of the system.
Some gamers have expressed concern over that “maximum” power draw of 300 watts on the RX 480 that Tom’s Hardware reported. While that power measurement is technically accurate, it doesn’t represent the continuous power draw of the hardware. Instead, that measurement is the result of a high frequency data acquisition system that may take a reading at the exact moment that a power phase on the card switches. Any DC switching power supply that is riding close to a certain power level is going to exceed it on the leading edges of phase switches for some minute amount of time. This is another reason why our low pass filter on power data can help represent real-world power consumption accurately. That doesn’t mean the spikes they measure are not a potential cause for concern; that’s just not what we are focused on with our testing.
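The filtering approach described above can be illustrated with a short sketch. The article specifies a second order low pass filter on 10 kHz samples but not the implementation or cutoff frequency, so everything below is an assumption: I approximate the second-order filter as two cascaded single-pole IIR stages and pick an arbitrary 50 Hz cutoff, then feed it a synthetic trace of a steady 150 W draw with brief 300 W switching spikes.

```python
import math

FS = 10_000.0        # sample rate from the article (10 kHz)
CUTOFF_HZ = 50.0     # assumed cutoff; the article does not state one

def lowpass2(samples, fs=FS, fc=CUTOFF_HZ):
    """Second-order low-pass approximated as two cascaded
    single-pole IIR stages."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * fc / fs)
    out, s1, s2 = [], samples[0], samples[0]
    for x in samples:
        s1 += alpha * (x - s1)   # first pole
        s2 += alpha * (s1 - s2)  # second pole
        out.append(s2)
    return out

# Synthetic trace: ~150 W continuous draw, single-sample 300 W spikes
# every 100 samples (10 ms) to mimic phase-switch transients.
trace = [150.0] * 1000
for i in range(50, 1000, 100):
    trace[i] = 300.0

smoothed = lowpass2(trace)
print(max(trace), max(smoothed))
```

The raw trace peaks at 300 W while the filtered trace stays close to the 150 W continuous level, which is exactly the distinction between instantaneous and continuous power draw the article is drawing.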
We are still working with AMD to learn about the solution to this power problem and I hope to have an answer later today. For now, we are looking around the office for some older, less expensive systems and motherboards to use to try and duplicate any system failures (shutdowns, etc.) that we have been reading about online.
unofficial fix:
http://www.overclock.net/t/1604979/a-temporary-fix-for-the-excess-pci-e-slot-power-draw-for-the-reference-rx-480-cards
So it turns out the power drawn by the 6-pin from the PSU and the power drawn from the PCI Express slot can be reallocated and does not have to be split equally between the two, as some YouTube videos suggest.
Great news if you don’t feel the need to trash AMD and just want more competition and better and better GPUs, regardless of what fucking company is making them.
Yes, it puts more stress on the 6-pin, but that connection has plenty of headroom according to this and many other sites. In addition, non-reference cards are sure to have 8-pin connectors if needed.
Just going to say the same thing. AMD goofed by leaving the voltage regulator chip set to its default settings, causing the power to come equally from both the PCIe slot and the 6-pin power connector. The solution is to pull more from the power connector (which should handle it without a problem), but that will heat up the VRMs more for the three that will have to handle the extra burden. AMD will probably send out a BIOS update and let users flash the BIOS to fix the issue. Bad PR, but luckily it is not the disaster I was afraid it was going to be. DON’T overclock the card and don’t run any benchmarks until the fix is in place. Perhaps it is better just to put in your old card for the week or so until AMD gets the proper settings out to the public.
How come all the other sites have announced the driver fix and pcPer has not as yet? Seems odd considering how much effort was put into making sure the world knew about the shit AMD invited onto themselves. It just seems fair to be as quick about announcing the fix AMD has proposed.
Now I get why some people accuse Ryan of favoring Nvidia. I always thought he was fair, but this does not seem so. No doubt pcPer will post something about it soon, but it is already 1400 CST and they knew about it early this AM. They certainly would ridicule AMD if it didn’t respond to one of their accusations in short order. How come it doesn’t work both ways? Just seems odd, as I said.
they have to cash the check from Nvidia first.
Roger that. It is now 1532 CST and still no post. I am starting to believe it.
We literally mentioned that there would be an announcement about the new driver addressing that back on the 30th … so ya, already covered.
The only new info is that the name will be Radeon Software 16.7.1 and that ain't really enough for me to post about until it actually launches.