Clock Variations
AMD claims that the variations we see in clock speeds with the R9 290X are part of what makes the GPU great. Are they right?
When AMD released the Radeon R9 290X last month, I came away from the review very impressed with the performance and price point of the new flagship graphics card. My review showed that the 290X was clearly faster than the NVIDIA GeForce GTX 780 and (at the time) considerably less expensive as well – a win-win for AMD without a doubt.
But there were concerns over a couple of aspects of the card's design. The first was temperature and, specifically, how AMD was okay with this rather large piece of silicon sustaining 95C. Another concern was the switch AMD included at the top of the R9 290X to change fan profiles. This switch essentially creates two reference defaults and makes it impossible for us to set a single performance baseline. The two modes only change the maximum fan speed the card is allowed to reach, yet performance changes with that setting thanks to the newly revised (and updated) AMD PowerTune technology.
We also saw, in our initial review, a large variation in clock speeds both from one game to another and over time (after giving the card a chance to heat up). This led me to create the following graph showing average clock speeds 5-7 minutes into a gaming session with the card set to the default, "quiet" state. Each test covers a 60 second span.
Clearly there is variance here, which led us to more questions about AMD's stance. Remember when the Kepler GPUs launched? AMD was very clear that variance from card to card, silicon to silicon, was bad for the consumer because it created random performance deltas between cards with otherwise identical specifications.
When it comes to the R9 290X, though, AMD claims the GPU (and the card itself) is a customizable graphics solution. The customization centers on the maximum fan speed, a setting the user can adjust inside the Catalyst Control Center. It allows you to lower the fan speed if you are a gamer who wants a quieter configuration while still getting great gaming performance. If you are comfortable with a louder fan, because headphones are magic, you have the option to simply turn up the maximum fan speed and gain additional performance (a higher average clock rate) without any actual overclocking.
PowerTune Update
This is now possible thanks to the updated AMD PowerTune technology. It no longer targets a fixed "boost" clock for the Hawaii GPU; instead, the algorithm varies performance based on multiple parameters. AMD has set a thermal limit of 95C on the R9 290X, so fan speed, clock speed, and voltage all adjust to hold that maximum temperature. Performance, obviously, follows.
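AMD has not published the actual control algorithm, but the behavior described above can be pictured as a simple closed loop. Here is a minimal Python sketch of the idea; the step sizes, clock floor, and decision rules are all assumptions for illustration, not AMD's implementation:

```python
# Hypothetical sketch of a PowerTune-style governor. AMD has not published
# the real algorithm; this only illustrates the behavior described above:
# clocks fall when the die sits at the 95C target with no fan headroom left,
# and climb back toward the "up to" maximum when thermal room appears.

TEMP_TARGET_C = 95.0     # Hawaii's sustained thermal limit
CLOCK_MAX_MHZ = 1000.0   # the advertised "up to" engine clock
CLOCK_STEP_MHZ = 10.0    # assumed adjustment granularity (illustrative)
CLOCK_MIN_MHZ = 300.0    # assumed clock floor (illustrative)

def next_clock(clock_mhz: float, die_temp_c: float,
               fan_pct: float, fan_cap_pct: float) -> float:
    """Pick the next engine clock from temperature and fan headroom."""
    if die_temp_c >= TEMP_TARGET_C and fan_pct >= fan_cap_pct:
        # At the thermal limit with the fan pegged at the user's cap, the
        # only levers left are frequency and voltage, so shed a step.
        return max(clock_mhz - CLOCK_STEP_MHZ, CLOCK_MIN_MHZ)
    if die_temp_c < TEMP_TARGET_C:
        # Thermal headroom is available: climb back toward the maximum.
        return min(clock_mhz + CLOCK_STEP_MHZ, CLOCK_MAX_MHZ)
    return clock_mhz
```

Raising the fan cap lets a loop like this hold higher clocks at the same 95C target, which is exactly the trade AMD is selling.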
You can adjust clock speed and power limits in the driver as well, but for the sake of this story we are leaving those alone to see what kind of flexibility you get from fan speed alone. PowerTune has a lot more functionality (and a lot more potential upside), but it now also creates some questionable performance variance.
What AMD is doing here isn't unique of course; NVIDIA's Kepler architecture introduced GPU Boost technology in early 2012 and Intel's Turbo Boost technology on the CPU side attempts to do something similar as well. But with those two technologies we generally have well defined levels of scaling to expect, and the clock speed deltas are fairly low. With the latest PowerTune from AMD on the 290X, that isn't the case.
The purpose of this story, then, is twofold. First, I wanted to see how flexible the PowerTune technology is and how configurable the R9 290X card and the Hawaii GPU actually are. Does it live up to AMD's claim that this is a feature and not simply a way to maintain yields and profit margins? Second, we wanted to know how much performance you can actually gain by increasing the maximum fan speed, and how much you lose by decreasing it.
As it turns out there is quite a lot of variance. This, of course, leads to quite a few topics we will discuss as we go through the data.
- AMD Radeon R9 290X 4GB – $549 (Newegg.com)
- AMD Radeon R9 280X 3GB – $299 (Newegg.com)
- NVIDIA GeForce GTX TITAN 6GB – $999 (Newegg.com)
- NVIDIA GeForce GTX 780 3GB – $649 (Newegg.com)
Testing Setup – Very Important!
You will see that we are doing TWO RUNS for our benchmarking, labeled "Run 1" and "Run 2". Run 1 is very simple. We start with a cold GPU, making sure the GPU temperature is in the low-40C range at the desktop. We open GPU-Z and begin recording data (this captures clocks, temperatures, fan speeds, etc.). As quickly as possible we start our benchmark (Crysis 3 in this case) and load into the testing area, usually within 30 seconds. I then immediately do our standard 60 second benchmark run-through.
Then I simply play the game for 5-7 minutes. Not idling, not at a menu – actively playing the game. This heats up the GPU and gets it into a real-world scenario. After a minimum of 5 minutes has passed, I reload the map and do the exact same 60 second benchmark run-through, capturing results with our Frame Rating methodology. That is Run 2.
So if you see differences between the Run 1 and Run 2 results, you are seeing the difference between a cold GPU benchmark and a warm GPU benchmark – or, put another way, between what we consider "unfair" performance levels and actual, real-world performance levels.
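If you want to reproduce this at home, reducing a GPU-Z sensor log to these averages is straightforward. A minimal sketch, assuming a CSV export with a clock column named as shown below (check the header row of your own log; GPU-Z samples roughly once per second):

```python
# Reduce a GPU-Z sensor log to Run 1 / Run 2 average clocks. The file and
# column names are illustrative; match them to your own GPU-Z export.
import csv

def average_clock(log_path: str, start_s: int, duration_s: int = 60,
                  clock_col: str = "GPU Core Clock [MHz]") -> float:
    """Average the core clock over one 60-second benchmark window."""
    with open(log_path, newline="") as f:
        rows = list(csv.DictReader(f))
    window = rows[start_s:start_s + duration_s]   # ~1 sample per second
    return sum(float(r[clock_col]) for r in window) / len(window)

# Run 1 starts ~30 seconds in on a cold GPU; Run 2 after 5+ minutes of play.
cold = average_clock("gpuz_sensor_log.csv", start_s=30)
warm = average_clock("gpuz_sensor_log.csv", start_s=420)
print(f"Run 1: {cold:.0f} MHz, Run 2: {warm:.0f} MHz ({warm / cold - 1:+.1%})")
```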
We are only using Crysis 3 here because it is the most demanding single game on GPUs today. This phenomenon does occur with all games to some degree, though rarely in such dramatic fashion.
Leaving everything else untouched, we tried out 5 different fan speed settings.
Using the Catalyst 13.11 V8 driver, I ran this same scenario three times at each of five fan speed settings: 20%, 30%, 40%, 50%, and 60%. 40% is the default "quiet" setting for the Radeon R9 290X, while 55% is the "Uber" mode that AMD offers through the switch on top of the reference design. I wanted to see if I could get the card to an even lower noise level with the 20% and 30% options, and then push the fan speed up to a very high, and loud, level to see what additional performance users could get out of Hawaii by sacrificing sound levels.
I think the results are going to be quite interesting.
Slap a waterblock on it and it will be both quiet and superfast. Problem? 🙂
I agree. AMD should have shipped this thing watercooled in its own closed loop.
The price wouldn’t remain the same and would ruin the whole value / enthusiast advantage of the card.
You do realize that a part of the card's price is that huge cooler, right? A semi-decent self-contained loop would add maybe $50-60 to the current price, and the performance would be stellar all the time. If done right, it wouldn't move away from the 1GHz core clock for a second and could also be almost silent at that…
Actually the coolers are around $25-30 for a standard 290(X).
But for me the benefits of water cooling outstrip the cost, and the cost is lower than most people think: most of the time you get a 10-20% extra overclock, and a more silent system as well.
Some people and friends call me crazy for having a €1000+ water loop with 4x 140mm rads (80mm thick) and 2x 180mm rads in my TJ11, but when I moved from quad 5870 2GB Matrix cards on an R4E motherboard with a 3930K @ 5.1GHz, I just had to replace the VGA waterblocks with two new Titan blocks for €250.
The rest of my loop I will probably use for maybe another 10 years.
For all that money I got back a roughly 10% extra overclock, but above all, even when I had 4 cards in my system and was pushing around 1000 watts, I still could not hear my system.
That would be an option 🙂 But I’d rather have a good quality block on it prepared for my own loop. I don’t really like those self contained thingies 😉
Or why not ship it with no cooler at all and give the owners that price cut for buying their own aftermarket coolers?
Or buy a GTX 780Ti and be done with it.
And put a hole in your pocket as well ^_^
The same hole you get by buying an aftermarket cooler/waterblock… It comes down to preferences.
yeah.. not all users have liquid cooling or would plan to get one. Problem?
” Will AMD allow its partners to just claim some wildly higher “up to” clock rates in order to catch the enthusiast who did not come across the results we have presented here?”
Good Catch
AMD is again blurring the performance lines with marketing propaganda; you can't trust a debt-laden, desperate company. Nvidia's Titan-class coolers are far better than AMD's stock cooler. Their GPUs run cooler and more consistently without all the blower noise. Add in better drivers, SLI, the control panel, CUDA, PhysX, and visual quality, and Nvidia is all I've been recommending.
You get what you pay for with Nvidia; with AMD you get buyer's remorse from the bogus benchmarks you got duped into.
I do have to support both, though, so what's the best way to install a new AMD GPU driver? It's been nothing but problems for me. Nvidia driver updates work like a charm; maybe I wouldn't be so hard on AMD if at least that were made easy. About 2/3 of the GPUs I support are Nvidia, but about 2/3 of the driver problems are from AMD.
The AMD driver excuse is getting a bit old. That was from many years ago and they have come light years from that experience.
Nvidia has had more driver issues recently that caused catastrophic failure on their cards.
Comparing the $550 R9 290X to the $1000 Titan and talking about quality is pointless. For less than the price of a Titan you can add a water cooling solution to a 290X that would run circles around the Titan in noise, temperature, and performance.
Even if you prefer Nvidia, you want this card because it brought down the price of the 780 a ton.
I am getting really tired of this "AMD drivers suck" thing. BOTH companies have good and bad drivers. Nvidia drivers kill their cards at times; I've never heard of a suicidal Radeon…
To the point: both teams suck on the driver front – that is a given. But for the past 2-3 years I have personally experienced fewer driver-related problems with Radeons than with GeForces.
Lesson: Don’t buy a high-end GPU with a reference cooler.
“Don’t buy a high-end GPU with a reference cooler.”
I would say, "Don't buy a high-end GPU with a reference cooler from AMD."
Nvidia's Titan-class coolers are great in build quality and performance; you can OC right out of the box. The new GTX 780 Ti sounds like a good OCer with its stock cooler.
Water cooling is not an option; it would open a huge can of worms for me. Closed-loop coolers are great for CPUs but not for GPUs – warranties would be voided. You would be surprised how many are returned because of leaks.
It's not that the Titan or GTX 780 coolers are that good… it's that the architecture uses electricity more efficiently, without leaking it away as heat.
I beg to differ; those Nvidia GPUs can draw a lot of power too. But their reference coolers ARE superior to AMD's reference coolers. That's easy to see.
They are better, but I would bet that the Hawaii GPU makes more heat than its Nvidia counterparts.
I intentionally bought the reference card so that I could slap a good ol' waterblock on it, in the hope that the components would be of higher quality. Anyone willing to use the stock leaf-blowers and have their cards running at 95C should be very aware of heat-related issues. Better aftermarket cooling at about the same price would imply the use of cheaper electrical parts. Anyone looking to get a 290(X) should consider the cooling options available, since a simple hair or dust bunny stuck in the fan can ultimately destroy the equipment.
For me, personally, any piece of hardware running at more than 20C over room temperature is a potential risk, so I invested in a massive water cooling kit. All you need to do then is get a new waterblock for each new card every 2 years or so. Also, the noise is almost non-existent.
Enthusiasts have known what they are buying since the first day this card was reviewed on the Internet.
People who don't know about specs and only buy the best card no matter the cost don't care about clock speed, only about performance. Is it really a problem if the card drops 100MHz but gives performance numbers equal to or better than the GTX 780? Really?
I understood the articles about frame rating and I supported PCPer 100%, but now you just seem to be looking for ways to make the card look bad and faulty.
If the performance is there, and IT IS, and if the default mode is the quiet mode, and IT IS – the mode in which the card was presented and reviewed – then the buyer can unlock more performance with a better air cooler or watercooling.
The only case where you are right is if someone takes this card and installs it in the smallest, worst-ventilated case ever created. It could also be a problem if they live in the Sahara desert…
It is more like we are trying to make two points:
1) Show people what the actual performance is (because it can dip 10-15% at default fan settings after just a couple of minutes). What if they based their buying decision on a website that runs a 30-second benchmark without allowing the card to warm up? They're going to get a significantly different experience 10 minutes into their gaming session unless they manually throttle up their fans.
2) They should not say "up to" in their marketing material. This could be particularly bad if other companies (beyond AMD, beyond a less-than-reputable partner, beyond GPUs even) take it as an open invitation to advertise nothing except what their card (or whatever) can do for 30 seconds strapped to a leaf blower outside in a snowbank. Personally, I believe the system Ryan developed would produce much better numbers "on the box": the card obviously ignores the minimum fan speed to maintain 727MHz (so that should be its base) but is comfortable at ~850-880MHz with default settings. I think this metric should be adopted by AMD.
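As a rough illustration of that "on the box" idea (this is not PCPer's actual tooling, just a sketch with hypothetical sample values): take the clock samples from a fully warmed-up run, report the worst sustained clock as the base, and a typical operating band around it.

```python
# Sketch of the proposed "on the box" metric: base clock plus a typical
# operating band, computed from warm-run clock samples. The sample values
# below are made up for illustration, not measured data.
from statistics import quantiles

def box_numbers(warm_clocks_mhz: list[float]) -> tuple[float, float, float]:
    """Return (base, typical_low, typical_high) from warm-run samples."""
    q1, _, q3 = quantiles(warm_clocks_mhz, n=4)   # interquartile band
    return min(warm_clocks_mhz), q1, q3

samples = [727, 746, 812, 840, 852, 861, 868, 874, 879, 884]
base, lo, hi = box_numbers(samples)
print(f"base {base:.0f} MHz, typical {lo:.0f}-{hi:.0f} MHz")
```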
But you do highlight our challenge: what if you use headphones, or you purchase a custom cooler? We have seen settings where AMD's card can run stably at 1 GHz, but that is not the default… or does the switch give it two defaults? Does that even make sense to say?
Difficult for us to describe? Then it's difficult for our readers to fully understand. That means we need to try harder and find a better way to explain it.
Thanks for your comments!
Great article Ryan, and some good points.
I think the advantages AMD has are how precise PowerTune is at adjusting the 290X, how configurable the card is, and the cost.
When I see the clock history in Afterburner being adjusted with extreme frequency and precision, it shows that, right now, the R9 has the best control over its GPU.
When you pass more control over to the user, it alleviates many concerns over temperature and noise. And people who value power and temperature got a great treat, because the 780's price dropped substantially.
Overall, I think that because of these factors the AMD launch was great for the consumer. That is the most important part of its wide acceptance, and the difference from the Fermi launch some years ago.
The first BIOS switch position is called quiet… I wonder why… maybe to run quiet? And the second BIOS switch position should be called "up to 1000MHz", because it seems to stick to 1000MHz most of the time.
Mantle… Star Citizen… can't wait!
I don’t consider the fan at
I don't consider the fan at 40% on this cooler to be quiet.
Your load number on quiet for the 290X is 42.6 dB (which, in turn, is nowhere near the 35.6 dB of the Titan/780 from your own review, I know).
40 dB is the equivalent of rain drops.
Not quiet compared to Nvidia? I'll give you that.
AMD still managed to make it 5 dB quieter on Uber than the older blower-type coolers on the GTX 680, AMD 7950, and AMD 7970 (which were 55 dB).
We just need 3rd parties to make the noise/heat moot.
Raw decibels are good for quantifying sound (and relatively easy to measure and graph) but lousy for qualifying noise. Our ears have drastically different hearing profiles dependent on frequency and, even then, some noises are more annoying than others.
Personally, I hope we can figure out a better method (at least weigh by the frequency response curve of an average young adult?) of conveying noise at some point.
Isn't there a special scale for sound pressure level metering that accounts for the ear of an average human? I believe there is. However, that will not tell the whole story either… some sounds can be annoying at 35dBA and some aren't distracting at 55dBA…
There is; it's called A-weighting. You see that "A" at the end of dBA? That's a dB value with A-weighting applied.
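For the curious, the A-weighting correction is a published standard (IEC 61672), so it is easy to compute for any frequency. A minimal sketch:

```python
# The standard A-weighting curve (IEC 61672): the correction, in dB, applied
# to a raw sound level at a given frequency to approximate human hearing.
import math

def a_weighting_db(f_hz: float) -> float:
    """A-weighting correction in dB at frequency f_hz."""
    f2 = f_hz ** 2
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.0   # +2.0 dB normalizes 1 kHz to ~0

print(round(a_weighting_db(1000)))  # ~0: a 1 kHz tone passes unchanged
print(round(a_weighting_db(100)))   # ~-19: low-frequency rumble is discounted
```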
I put that “A” on the end on purpose… 😉 Like I said “I believe there is” 😉
While the sound of rain drops or a low urban setting (another comparison I have seen for 40 dB) doesn't sound like much, just consider that this is sitting on your desk, or under it, the entire time you are gaming. I think noise level is something more people care about than they realize.
Perhaps a comparison at 80C would be a good one.
Nvidia Titan and 780 at 80C vs the 290X at 80C.
I have not seen a performance comparison at 80C anywhere.
While interesting from a scientific view, it wouldn't really be fair to AMD. They made the decision to allow 95C temps, so we'll go with that for most testing.
How about we do the 95C comparison? Oh wait, Nvidia's GPUs won't live at that temp. That's a good thing about AMD: it's a robust design that will benefit overclockers who use good cooling solutions.
You are assuming that the reference cooler is just that bad and that AMD's GPUs don't simply make that much heat. I bet Nvidia GPUs could handle 90C+; they just know people don't want to turn their computer case into an oven.
I would like to congratulate you.
Again, you hit the target dead on.
First the FCAT problem, now the temperature problem.
Thanks for putting the facts out.
Thanks for the article; I believe the overall point is spot on. An operating clock speed range would be more honest than the "up to" advertising.
Luckily, most reviews that I have read have recognized this and run their tests while the card was already hot.
One thing I would like to see is examples of what the sound levels are like. Reading "40.8 dB" or "53.4 dB" does not tell me what it sounds like.
Looking at a handy chart from Industrial Noise Control (http://www.industrialnoisecontrol.com/comparative-noise-examples.htm) helps put this in perspective. Forty dB is like a "library, bird calls (44 dB); lowest limit of urban ambient sound" and 50 dB is a "quiet suburb, conversation at home; large electrical transformers at 100 ft." They describe 60 dB as restaurant conversation. This puts the noise levels into perspective and gives a better idea of what they compare to.
It appears that this is one card to wait on the custom cooler designs and see who comes up with the best design.
We might be able to do some recordings. Thanks for the idea!
Custom coolers, right
And all that heat into the case, right
Are there any custom coolers on any of the high-end cards that don't dump the heat into the case?
Which "GeForce GTX card" is used? Where is its performance variance data?
"During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expect based on previous data. The Run 2 graph shows the same 40% fan speed results for the 290X as on the previous pages, but it also shows how the GeForce GTX GPU reacts. The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, 4%) but it maintains that level throughout. That frequency is also above the advertised base clock."
Double standard.
He seems to have a gripe with AMD. It's perfectly fine for Nvidia to drop MHz from boost toward base, but not okay if AMD drops from "up to 1GHz".
It's not AMD's fault if you fail to comprehend it.
He's also comparing the 40% quiet mode to Nvidia, glossing over the fact that in Uber Mode the drop is almost non-existent, since the fan runs at 55%.
Probably since Nvidia doesn't have an Uber Mode switch yet, he felt the need not to compare it.
What it really says is that his benchmarking method has been faulty, since it is only a benchmark run and not an in-game test. Nothing common sense wouldn't have solved.
P.S.
Thanks followers for helping us make you dumber.
40% is the default, out-of-box setting; that's why it was tested. We clearly show the drop in clocks for NVIDIA as well, but it's only a 5% drop, compared to the 20-35% drop for the AMD card here. Again, at the default setting.
Also, NVIDIA clearly advertises the clock variance while AMD does not.
Not the same Anonymous.
I asked which Nvidia card was used and whether you have any of Nvidia's performance data related to that run.
Pathetic explanation.
“default setting”
You tested non-default settings at 20, 30, 40, 50 & 60.
At 50% the graph shows it returning to 1GHz, unlike the Nvidia card, which dips 20-35% and stays there.
Uber Mode is a simple flip of the switch, which will give you 55% fan speed with almost no clock drops. After all, these graphics cards don't install themselves.
Unless these buyers are utterly clueless, as you seem to indicate your readers are. In that case, I might be inclined to agree with you.
I'm not sure what statements you disagree with. Does AMD advertise base clocks and boost clocks and give the buyer an idea of what to expect?
Does Nvidia not advertise base clocks and boost clocks, as Ryan stated? What are you disagreeing with?
"You tested non-default settings at 20, 30, 40, 50 & 60."
Yeah, he tested those out of scientific curiosity. They show some interesting results. They are not, strictly speaking, necessary for the key points of this article.
"Uber Mode is a simple flip of the switch, which will give you 55% fan speed with almost no clock drops. After all, these graphics cards don't install themselves."
Sure, but if everyone should switch to 55%, why does the card even ship at a 40% default? Maybe you should contact AMD about this problem.
Are you asking why Ryan didn't have the same scientific curiosity when he compared to Nvidia? I don't know; why don't you simply ask him nicely to test at 55% against Nvidia?
It's not like he's posting anti-AMD articles all day long. He already gave the PCPer Gold award to the R9 290X and showed that the CrossFire results were very good.
So can we assume that a good custom cooler will keep the card at or close to 1GHz all the time? Plus overclocking headroom.
I can smell a little fanboy reviewer. How much did Nvidia pay you for this?
Like, 100 million dollars man.
Wanna see my gold Lambo?
It's not a gold Lambo.
It has to have room for him and Tom Petersen to continue their bromance affair.
I'm thinking a spacious minivan.
Ryan, honestly now, who is making you write these articles – Nvidia, AMD, or both? It might be just me, but I have a feeling you are getting an Nvidia reply to the AMD reviews ready; as we speak you are probably working on it. I get it: AMD used to sponsor your site, then they didn't, then Tommy showed up in your office, then Nvidia sponsored you (or maybe the other way around). You probably own an Nvidia card at home yourself (because the wife probably doesn't mind the noise that it doesn't make, and she would probably divorce or kill you, or even worse, if you bought a 4K ASUS PQ321Q), so you are limited to a 2560 x 1600 experience on your Apple 30", which Nvidia, and maybe SLI, is great at so far. But once you get into multi-monitor setups and 4K territory, which Nvidia lacks so far, I think I could have a better experience with an AMD Radeon R9 290X, maybe even better with XDMA. I think that is something SLI was not designed for, or is capping somewhere versus XDMA, plus there is the 3GB cap – you know, the more GBs the better.
Let's say I'm divorced at the time and that I can hypothetically afford some crazy things like 3x R9 290 + 1x R9 290X [(XDMA) I think it might work, since AMD is not so uptight about having everything identical down to the manufacturer and model of card]: http://www.youtube.com/watch?v=3TEYkRwkrg8
Will they get hot? Will I still heat my house if I water cool them for the sake of silence? With the money saved on the cards versus 4x 780 Ti SLI ($2,800 vs 3x <$550 + $550 = less than $2,200, which is what I paid for my 4 Yamakasi monitors – 3 portrait, 1 landscape center-top for browsing and other crap), I could afford a good water loop for around $600, sure, with the extra work and everything else that comes with it. Do you think I, as an extreme enthusiast, would have a better experience with that setup, based on your 4K review (https://pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing) and the recent CrossFire review at http://hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review#.Ung3HSeeaGo? Maybe throw in some 4-way RAID 0 SSDs, then RAID 1 with 4 other SSDs, for Allyn's sake.
If I do any of this crazy talk at some point in my busy life, would you consider posting about it on your site?
I don't… Who said… What…?
:/
Hope that guy wasn't one of the Indiegogo backers that gets to come on the show.
Keep hoping; you might be surprised. I was ironically joking, more or less. I guess you guys don't like those kinds of jokes.
I like Ryan's response, though; he's playing along and acting like nothing happened. After all that writing, I still like your honesty and the podcasts you people do. I am the one pushing you to the edge so you can be a better you. Or not.
Thanks for the review. What do you think of Tom's review and the fact that they bought 2 retail R9 290Xs and got drastically different results from the review samples?
Would love for you guys to buy one (a 290X) anonymously and test it out.
Ryan did say they bought a few retail cards to test that. As for what was on Tom's Hardware: they asked AMD about it, and AMD said it was a bad card, which is possible, since a 15-20% drop is pretty bad. It could very well be. Most reviews are done on open-air benches, which helps every card reach lower temps, but when a card like this has no problem getting to 95C, it makes you wonder how far it will clock back inside an enclosed case.
Why don’t you crawl back into your AMD fanboy cave retard & GTFO!
Good review, Ryan. I wanted to know this as well before buying an R9 290 card. But I will wait for some aftermarket coolers and reviews of those to see if the throttling is better. If not, I think I might go Nvidia.
I was joking, but I am still waiting for the 780 Ti reviews too, and then I'll wait some more, and so forth. But I gotta say I like your Nvidia-fanboy reaction. It's a discussion, not a battle, but I'll take you up on it: name the time and place. And yes, Jeremy, I am one of Ryan's backers, and a pretty big one too.
http://www.pugetsystems.com/labs/articles/Gaming-PC-vs-Space-Heater-Efficiency-511/
Nice info there Ryan.
What Nvidia card is that?
If it's not the GTX 780, it would be nice if you could add it in.
I would love to know the difference in perf/clock between the Hawaii and GK110 based cards.
Maybe you could do a clock-for-clock review?
Sadly, no review site offers such info 🙁
… Which “Geforce GTX card” is used?
Since it wasn't specifically mentioned, it is probably still under NDA. So my guess is the GTX 780 Ti.
Which boost?
Boost 1.0 starts throttling at 70°C, throttles further at 85°C, 90°C, and 95°C (with a step size of 13.5 MHz), and drops to safety clocks at 98°C.
Boost 2.0 throttles only at 80°C and drops to safety clocks at 95°C.
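Taking the commenter's figures at face value (they have not been verified against NVIDIA's documentation), the two schemes could be summarized like this:

```python
# Encode the thresholds exactly as described in the comment above -- these
# numbers are the commenter's, not confirmed NVIDIA specifications.

BOOST1_STEPS_C = (70, 85, 90, 95)        # each crossing sheds one step
STEP_MHZ = 13.5
SAFETY_C = {"boost1": 98, "boost2": 95}  # hard fallback to safety clocks

def boost1_throttle_mhz(die_temp_c: float) -> float:
    """Total clock reduction under the Boost 1.0 rules described above."""
    return STEP_MHZ * sum(1 for t in BOOST1_STEPS_C if die_temp_c >= t)

print(boost1_throttle_mhz(72))   # 13.5 -> one threshold crossed
print(boost1_throttle_mhz(91))   # 40.5 -> three thresholds crossed
```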
Is there like a 60-second limitation to your benchmark runs that can't be overcome?
Also, what's the point of those comparisons with the Nvidia card without any performance numbers? All that shows is how much more dynamic PowerTune is compared to Boost 2.0 :~
There are no performance numbers because seeing the 40% fan speed 290X still beating the 780 was contrary to the desired effect of the article.
Um, I doubt it's still beating the 780, since the card is underclocked from its "up to" max. AMD is starting to sound like an ISP now. The AMD card is crippled by almost 15-20% with that GPU clock loss, so at that point its "lead" over the 780 is gone.