Clock Speeds and Power Consumption
Clock Speeds and Clock Consistency
As we start to dive into the performance of the card, I like to get a better understanding of the product's behavior first. For GPUs this means looking at the reported clock speeds and how stable they remain over time. I turn to a long run of the Unigine Heaven benchmark for this.
In general, the clock speed on the Vega Frontier Edition appears to hover around the 1440 MHz mark, well under the card's rated 1600 MHz specification. The clocks jump between the 1300 MHz and 1500 MHz marks, though the steps between states are quite coarse, indicating that the boost/turbo capability of the Vega architecture may remain more limited than what we see on the GeForce side of things.
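To illustrate just how coarse those clock states are, here is a minimal sketch of the kind of post-processing you can do on a clock log: it bins the reported clock readings to expose the discrete states. The CSV layout and the `clock_mhz` column name are assumptions, stand-ins for whatever your GPU logging tool exports.

```python
import csv
from collections import Counter

def clock_state_histogram(path, bin_mhz=10):
    """Bin logged clock readings to reveal the discrete clock states.

    Assumes a CSV with a 'clock_mhz' column (hypothetical layout),
    e.g. captured during a long Unigine Heaven run.
    """
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            mhz = float(row["clock_mhz"])
            counts[round(mhz / bin_mhz) * bin_mhz] += 1  # snap to nearest bin
    for state in sorted(counts):
        print(f"{state:>5} MHz: {counts[state]} samples")

clock_state_histogram("vega_fe_heaven_clocks.csv")  # hypothetical log file
```

A card with fine-grained boost would show a near-continuous spread of bins; a log matching the behavior described above would instead collapse into a handful of widely spaced states.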
I also ran a test with the fan speed fixed at around 3000 RPM to see how keeping the GPU cooler would affect the card's ability to hit higher clock speeds. As you can see, while the higher fan speed produced one fewer step down into the 1300 MHz range, we did NOT see the card go any higher than the 1528 MHz peaks present in both results.
Compared to previous generations of hardware from AMD, the Vega FE is running at impressive clock speeds. The Radeon Fury X was only able to top out at 1050 MHz (reference) with the same stream processor count. (Note, however, that the compute units differ from Fiji to Vega.) Even the Polaris GPUs used on the RX 580 topped out at 1340 MHz for reference speeds.
Detailed Power Consumption
With a rated TDP of 300 watts, we had a lot of questions about the power consumption of the Radeon Vega Frontier Edition. Based on our direct power consumption measurement (not at the wall), we are able to get precise numbers.
In general, the power draw of the Vega FE stays under the 300-watt level, which is good. After the issues that haunted the Radeon RX 480 at its launch (drawing well over its 150-watt rated TDP), it appears AMD has learned its lesson. Power draw from the PCIe slot on the motherboard actually stayed very low (near 25 watts), with the vast majority of the power coming from the dual 8-pin power connections on the card. For reference, that power draw of 300 watts is 50 watts higher than the GeForce GTX 1080 Ti and the Titan Xp, and 120 watts higher than the GeForce GTX 1080.
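Because we measure at the card rather than at the wall, total board power is simply the sum of the per-rail power (P = V × I) through the PCIe slot and the two 8-pin connectors. A minimal sketch of that bookkeeping, with hypothetical sample readings chosen to roughly match the behavior described above:

```python
# Total board power = sum of per-rail power (P = V * I) measured at the
# PCIe slot and at each 8-pin PEG connector. All readings below are
# hypothetical samples for illustration, not logged data.
rails = {
    "pcie_slot": (12.0, 2.1),   # (volts, amps): ~25 W from the slot
    "8pin_a":    (12.0, 11.4),  # first 8-pin connector
    "8pin_b":    (12.0, 11.0),  # second 8-pin connector
}

total_w = sum(volts * amps for volts, amps in rails.values())
print(f"Total board power: {total_w:.1f} W")  # 294.0 W for these samples
```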
We did find one odd behavior with the power draw on the Radeon Vega FE card that showed itself when the GPU got up to a peak temperature of 85C – power draw would dip to a lower level. And when I say level, I mean exactly that: the card appears to shift into a lower power draw state for varying amounts of time as it attempts to regulate the temperature of the GPU. Look at the latter part of the RoTR power results above – the blue line of the Vega FE power consumption clearly has a back-and-forth step to it.
In our power testing with Metro: Last Light at 4K you can clearly see two different steps below the “nominal” power draw level of 280+ watts. At 240 watts and 200 watts the card was pulling less power but, interestingly, did NOT appear to drop its clock speeds during those intervals. That seems counter-intuitive to be sure, and we are still asking questions of AMD to figure out what is going on. This behavior was repeatable through multiple runs and through different games – we saw it in both Witcher 3 and Rise of the Tomb Raider.
We ran a quick experiment to make sure it was thermals we were dealing with. By cranking the GPU fan up to 100% (which is quite loud) we could play Metro: Last Light at 4K without seeing the power draw drops. The GPU temperature never breached 56C during that testing run, so it is likely we could lower the GPU fan speed to something more reasonable on our ears while still keeping the card out of that 85C range that appears to trigger the power draw throttling. I am hoping that AMD takes heed of this and manages the fan curves differently on the RX Vega product line.
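For anyone who wants to hunt for this stepping behavior in their own logs, here is a minimal sketch that flags sustained dips below the nominal power level in a logged trace. The CSV column names, the 280 W nominal figure, and the thresholds are all assumptions drawn from the runs described above.

```python
import csv

def find_power_steps(path, nominal_w=280.0, drop_w=30.0, min_samples=5):
    """Flag sustained intervals where board power sits well below nominal.

    Assumes a CSV with 'time_s' and 'power_w' columns (hypothetical
    layout) logged during a game run.
    """
    start, count = None, 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t, p = float(row["time_s"]), float(row["power_w"])
            if p < nominal_w - drop_w:         # sample is in a dipped state
                if start is None:
                    start, count = t, 0
                count += 1
            else:                              # back at nominal power
                if start is not None and count >= min_samples:
                    print(f"power step: {t - start:.1f} s starting at t={start:.1f} s")
                start, count = None, 0

find_power_steps("vega_fe_metro_4k_power.csv")  # hypothetical log file
```

Against a trace like the Metro: Last Light run above, a detector along these lines would pick out the 240 W and 200 W intervals as distinct steps.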
Adding this stupid, manually switchable gaming mode was a terrible, TERRIBLE marketing idea. This will take a big fat chunk out of the RX sales.
…oh yes, and great review as usual…ahem. 🙁
Terrible? It costs much more than RX, so it wouldn’t be able to affect RX sales.
Well, good thing that you didn’t make it your final review, however…
Vega’s performance is fishy more than anything else.
It barely performs better than a Fury X despite drastic increases in clocks; a Fury X at such clocks would outright beat it, and that is an older generation with older compute hardware. Even if Vega were just a die shrink, those NCUs are not showing their prowess. We have heard many times that these are probably Fiji drivers and that TBR is disabled, so it is VERY bad to judge from as of yet.
I would take all of this with a grain of salt. Polaris showed its improvements over Hawaii by having fewer SUs and half the ROPs with only slightly higher clock speeds; we had much less rendering hardware yet better performance.
The power figures are actually pretty good considering how highly clocked it is compared to the old Fury X, while still consuming nearly the same, if not slightly more.
It’s pretty much like they are sandbagging at this point, or it looks that way because people are reviewing an incomplete piece of technology and ruining its marketing. There’s a reason why AMD didn’t send samples to anyone, and with reviewers doing it like this anyway, it might have killed some people’s interest in buying it later, because the majority are pretty ignorant in the first place.
Final thoughts on the matter, though: I’m sure they shipped Vega like this because optimizing productivity software is much less of a hassle compared to games, so they could start making profits earlier. From the results of certain software we can see that Vega performs extremely well in some and loses in others. To be quite fair, against its competitor it looks like a very compelling card, and we’ve just gotten benchmarks at a very early stage that we shouldn’t have.
Please, just don’t think Vega is a flop yet with all those factors in place.
Good write up Ryan
well done on putting out some clarification points to the obvious questions that always arise with these new products. This is a beast of a GPU with all sorts of new hardware goodness in there; from the materials used right down to the finished product, it oozes quality and quantity. I can see this card retailing for around $1500 in Australia.
Nice. Any chance you can run the DeepBench test that AMD showed off before?
Why does the chart wrongly show the 1080 Ti as not having the same G5X interface as the Titan Xp and vanilla 1080? Mistake.
To me, the most important question is: is this card going to support 10-bit output in OpenGL, for work in Photoshop? Because if I have a nice 10-bit monitor (like the BenQ SW320) I would like to use 10-bit color in Photoshop. But Photoshop usually requires you to have a Quadro or FirePro to use 10-bit color. What about this card?
Thanks!
Very disappointing.
Expensive and slow?
I have just registered to tell you how much I liked your review.
I simply can’t believe Vega is just a die-shrink of Fiji, which would be completely odd, because they could have at least used Polaris as a basis. My bet is also on a lot of features not being turned on yet. I have a suggestion to see if that assumption is true: could you test Deus Ex: Mankind Divided on both Vega and a Fiji card, possibly even at the same clocks? We have seen a demonstration of the HBCC in that game, and if the performance level were the same on both cards at similar clocks, it would appear to me to be a driver issue.
Would you be able to test Prey @ 4K? Raja said two RX Vegas were running above 60 FPS. Curious to see how well the FE runs compared to RX Vega.
Jul 2016
AMD puts massive SSDs on GPUs and calls it SSG
https://semiaccurate.com/2016/07/25/amd-puts-massive-ssds-gpus-calls-ssg/
Ryan – can you ask if these will work for VDI on Server 2016 (or, for that matter, any of AMD’s S series cards), and where to get drivers for Server 2016?
AMD doesn’t seem to offer drivers for Server 2016, but they advertise their VDI cards as working with Windows Server. I would like to know if it works with RemoteFX or Discrete Device Assignment for GPUs, and what kind of performance one can expect from that configuration.
Thanks.
Many thanks PC Perspective for putting this awesome benchmarking piece together!
One thing I’m really wondering about is the card’s performance in Folding@home. As a heavy Folding@home donor I need to upgrade my build and am seriously considering Vega as my upgrade path. If you do a follow-up piece, any chance you could throw in some Folding@home benchmarking?
If anyone can comment on Folding@home performance that would be appreciated.
Depending on the workload, as an eGPU (which shouldn’t make much difference, since F@H is such a low-bandwidth load), I am getting 65k to 71k PPD. Am I disappointed? Yes. Is it way better than the W7000 I was using before? For sure. Luckily I don’t use this card for an F@H server farm. If I were pulling 100k PPD or close to it, I would be more incentivized to leave my computer on churning out folds 24/7.
Can’t believe you’re defending AMD for calling it “not a gaming card”. That’s a load of BS. And the Titans are not gaming cards! They are for deep learning. How much is AMD paying you to write this propaganda?
Make a test under DX12, and not DX11, newbies.
And you will see how Nvidia sucks against VEGA.