Power Consumption, Sound Levels and NVIDIA Variances
Performance and clocks are not the only things that change when you adjust the maximum fan speed on the R9 290X. Power consumption and noise levels also increase as a result.
Power consumption was measured during Run 2 of this scenario, when the GPU was at its hottest. Clearly, as the GPU sustains higher clocks, the power draw climbs as well. At the 20% fan speed setting, which kept the GPU clock at 727 MHz most of the time, the R9 290X system was drawing 317 watts. That is 37 watts lower than the power draw at the "reference" 40% fan speed setting and 119 watts lower (!!) than the power draw of the card at the 60% fan speed setting.
It is also worth noting that the maximum performance setting of 60% fan speed results in a 23% increase in whole-system power consumption compared to reference; a difference of 82 watts is not insignificant.
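If you want to sanity-check those percentages yourself, the arithmetic is simple; here is a minimal sketch using the whole-system wall-power figures from our testing:

```python
# Whole-system power draw (watts) measured during Run 2, at each
# maximum fan speed setting, from our testing above.
power_draw = {
    "20% fan": 317,  # GPU pinned at the 727 MHz base clock most of the run
    "40% fan": 354,  # the reference setting
    "60% fan": 436,  # the maximum performance setting
}

reference = power_draw["40% fan"]
for setting, watts in power_draw.items():
    delta = watts - reference
    pct = 100.0 * delta / reference
    print(f"{setting}: {watts} W ({delta:+d} W, {pct:+.1f}% vs. reference)")

# 60% fan: 436 W (+82 W, +23.2% vs. reference)
```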
As fan speed increases, so does the noise level from the R9 290X's blower-style cooler. At the reference setting we measured 42.6 dBA, which jumps to 53.4 dBA in the maximum performance mode of our Hawaii GPU. Make no mistake, this card can get loud. Compared to the current GeForce GTX options on the market, even at the default settings, the R9 290X is louder.
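For context on what that jump means to your ears: a common psychoacoustic rule of thumb is that every 10 dBA increase is perceived as roughly a doubling of loudness. A quick illustration:

```python
# Rule of thumb: +10 dB reads as roughly twice as loud to the ear.
ref_dba = 42.6   # reference 40% fan speed
max_dba = 53.4   # maximum performance 60% fan speed

delta = max_dba - ref_dba
loudness_ratio = 2 ** (delta / 10)
print(f"+{delta:.1f} dBA -> roughly {loudness_ratio:.1f}x as loud")
# +10.8 dBA -> roughly 2.1x as loud
```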
NVIDIA GeForce Comparison
Although this article really focuses on the claims that AMD is making about Hawaii, I know that readers would like to see NVIDIA's Kepler included in some way. NVIDIA introduced GPU Boost with Kepler in early 2012 and was actually criticized by many at the outset for offering too much variability in clock speed; the company did not specify a clock rate that the GPU would run at for all applications. This was a very new concept for us at the time: GPUs that did not run at a single clock rate were confusing and scary. Obviously, both parties now see the benefits this approach can offer.
These graphs compare a high-end GeForce GTX graphics card to the R9 290X at the 40% fan setting in terms of clock rates. Performance isn't measured here directly, although these are comparable parts.
During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expect based on previous data. The Run 2 graph shows the same 40% fan speed results for the 290X from the previous pages, but it also shows how the GeForce GTX GPU reacts. The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, a 4% drop), but it maintains that level throughout. That frequency is also above the advertised base clock.
This is only a single game and a single level being tested, but my experience with both AMD's and NVIDIA's variable clock implementations shows this to be a typical scenario.
Closing Thoughts
There are two key issues worth looking at from this story. The first issue is the notion that the R9 290X is an enthusiast configurable graphics card and the second is the high variability in the clocks of the 290X (even at default settings).
Let me first comment on the clock variability. In my initial 290X review I wrote:
Notice that not one of them (average clock speeds we reported) actually averaged over 900 MHz (though Skyrim got very close) and Bioshock Infinite actually was the slowest at 821 MHz. That is an 18% drop in frequency from the "up to" rating that AMD provided us and provided add-in card vendors.
…
I think AMD needs to come up with a frequency rating that is more useful than "up to" for the R9 290X, as it is misleading at best. NVIDIA adopted a "typical Boost clock" for its GPUs that seems to have worked, and it also indicates a "base clock" that it guarantees the GPU will never go below during normal gaming. R9 290X users should want the same information.
After this further research, I firmly believe that AMD needs to take charge of the situation and help present relevant performance data to consumers. Imagine the variance we'll see once retail cards with better (or worse) coolers start showing up in the market. Will AMD allow its partners to claim some wildly higher "up to" clock rates in order to catch the enthusiast who never came across the results we have presented here? The R9 290X, empirically, has a base clock of 727 MHz with a "typical" clock rate in the 850-880 MHz range. "Up to 1000 MHz" has very little relevance.
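These are also ratings any enthusiast could derive on their own from a clock log. A minimal sketch of the idea, assuming a hypothetical clocks.csv of per-second GPU clock samples (such as a GPU-Z sensor log) with a clock_mhz column:

```python
import csv
import statistics

# Summarize a per-second GPU clock log into the three numbers that matter:
# the floor the card never drops below ("base"), the clock it holds most
# of the time ("typical"), and the peak ("up to"). The file name and
# column name here are assumptions for illustration.
with open("clocks.csv", newline="") as f:
    clocks = [float(row["clock_mhz"]) for row in csv.DictReader(f)]

base = min(clocks)                   # empirical base clock
typical = statistics.median(clocks)  # a reasonable "typical" rating
peak = max(clocks)                   # the "up to" number AMD quotes

print(f"base: {base:.0f} MHz, typical: {typical:.0f} MHz, up to: {peak:.0f} MHz")
```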
AMD's claim that the R9 290X is an "enthusiast configurable GPU" does have some merit, though. I would agree that most users purchasing a $500+ graphics card will have a basic knowledge of the control panel and its settings. Gamers will be able to get some extra performance out of the R9 290X simply by increasing the maximum fan speed, as long as they are comfortable with the increased sound levels.
What you cannot do, though, is set the maximum fan speed any lower than 40% to get a quieter solution than stock. With settings of 20% or 30%, the fan ignored my preference in order to maintain the 95C target at the 727 MHz base clock. Hawaii appears to be pegged at that base level until AMD's partners can come up with quieter and more efficient cooling solutions.
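To make that priority order explicit, here is a toy model of the behavior we observed; this is purely illustrative and not AMD's actual PowerTune logic:

```python
def governor_step(user_fan_cap_pct, temp_c, clock_mhz,
                  temp_target=95, base_clock=727):
    """One tick of a toy thermal governor mirroring the priority order we
    observed: hold the 95C target first, honor the user's fan cap second.
    All thresholds here are illustrative, not AMD's firmware values."""
    fan_pct = user_fan_cap_pct
    if temp_c >= temp_target:
        if clock_mhz > base_clock:
            # First lever: shed clock speed toward the base clock.
            clock_mhz = max(base_clock, clock_mhz - 10)
        else:
            # Out of clock headroom: the fan overrides the user's cap
            # rather than let the GPU run past its thermal target.
            fan_pct = max(user_fan_cap_pct, 40)  # 40 is an illustrative floor
    return fan_pct, clock_mhz

# With a 20% cap under sustained load, the card settles at 727 MHz and
# the fan runs faster than the user asked for:
print(governor_step(20, 95, 727))  # -> (40, 727)
```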
Have questions? Thoughts? Leave them in the comments below!!
Alright, call me crazy, but is there anything at all scientific about setting a fan speed at 40%? Nothing, just simplicity. How about setting the fan RPMs or the fan decibels, something actually performance related? 40% is arbitrary between two different fans; the size and blade pitch could dictate wildly differing CFM values.
Regardless, it's a pretty crazy comparison when it's clear AMD put almost zero effort into the reference cooler. They planned to give consumers a good value, not the best of everything. It's loud, we all get it, but how about an actually useful review of someone strapping a water block on it, or another cooler? How many first adopters of $500 hardware leave well enough alone? If you buy 290X cards you know what you are buying and likely have water cooling, so why pay more money for a great cooler you will take off?
In this way it is also configurable performance in FPS, temps and sound levels.
I am not sure what to make of my R9 290, to be honest. My waterblock arrives tomorrow, so my opinions are likely to change; however, from the testing I have done with this card over the last two weeks, I can say that overclocking is totally pointless with stock cooling. My average clocks in BF4 with the fan on auto are around 820 MHz with a low of 720 MHz… I quickly hit 94C. The latest 13.11 beta 9.4 Catalyst drivers have no performance tab on my test machine (known bug), so I'm not bothering to test with the power limit increased until this is fixed and my waterblock is installed.
I am sure that once adequately cooled this problem will disappear.
Still, I maintain that AMD should be blamed for shoddy marketing and a lousy stock HSF.
They should have been honest enough to state that the clocks on this card are 800 MHz, boosted to 950 MHz (depending on heat and power)… Just my opinion.
I sold my GTX 780 for a stock R9 290 and believe that this was a good deal. I got $500 for my GTX 780, and this allowed me to buy the R9 290 and a new EK waterblock. Fully cooled, I will be able to get better performance from the R9 290, and as I only play BF4, I am excited to see what Mantle brings to the table later in December.
Nice review as always.