In the comments to our recent review of the ASUS ROG Swift PG279Q G-Sync monitor, a commenter by the name of Cyclops pointed me in the direction of an interesting quirk that I hadn’t considered before. According to reports, the higher refresh rates of some panels, including the 165Hz option available on this new monitor, can cause power draw to increase by as much as 100 watts on the system itself. While I did say in the review that the larger power brick ASUS provided with it (compared to last year’s PG278Q model) pointed toward higher power requirements for the display itself, I never thought to measure the system.
To set up a quick test I brought the ASUS ROG Swift PG279Q back to its rightful home in front of our graphics test bed, connected an EVGA GeForce GTX 980 Ti (with GPU driver 358.50), and hooked both the PC and the monitor up to separate power meters. While sitting at the Windows 8.1 desktop I cycled the monitor through its different refresh rate options and recorded the power draw from both meters after giving the system 60-90 seconds to idle out.
The results are much more interesting than I expected! At a 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire test system idled at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz brought only very minor increases in power consumption from both the system and the monitor.
But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.
Interestingly, we did find that the system would repeatedly jump to as much as 200+ watts of idle power draw for 30 seconds at a time and then drop back down to the 135-140 watt range for a few minutes. It was repeatable and very measurable.
So, what the hell is going on? A look at GPU-Z clock speeds reveals the source of the power consumption increase.
When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.
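If you want to keep an eye on this yourself without GPU-Z, the same readings can be polled through NVIDIA's NVML interface. Below is a minimal sketch using the pynvml Python bindings (an assumption on our part that you have the package installed; it targets the GPU at index 0). Run it while flipping between refresh rates and watch the graphics clock and board power change:

import time
import pynvml

# Poll the graphics clock and board power once per second via NVML.
# NVML reports power in milliwatts, so divide by 1000 for watts.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU in the system
try:
    while True:
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
        print("graphics clock: %4d MHz    board power: %6.1f W" % (clock_mhz, power_w))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()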
Though details are sparse, it seems pretty clear what is going on here. The pixel clock and the GPU clock are tied to the same clock domain and are not asynchronous. The GPU needs to maintain a certain pixel clock in order to supply the bandwidth a given refresh rate requires, and based on our testing, the 135MHz idle clock state doesn't give the pixel clock enough throughput to drive anything beyond a 120Hz refresh rate at this resolution.
Pushing refresh rates of 144Hz and higher causes a surprising increase in power draw
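For a rough sense of the bandwidth involved, here is a back-of-the-envelope pixel clock estimate for the PG279Q's 2560x1440 panel. The 12% blanking overhead is an assumption on our part (roughly in line with reduced-blanking timings); the exact figure depends on the timing mode the display actually requests:

# Rough pixel clock estimate for a 2560x1440 panel at various refresh rates.
# The 12% blanking overhead is an assumption; real timings vary by mode.
H_ACTIVE, V_ACTIVE = 2560, 1440
BLANKING_OVERHEAD = 1.12

for refresh_hz in (60, 100, 120, 144, 165):
    pixel_clock_mhz = H_ACTIVE * V_ACTIVE * refresh_hz * BLANKING_OVERHEAD / 1e6
    print("%3d Hz -> ~%.0f MHz pixel clock" % (refresh_hz, pixel_clock_mhz))

Whatever the exact timings, the required pixel rate climbs steeply with refresh rate, which lines up with the idle clock state topping out somewhere between 120Hz and 144Hz at this resolution.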
The obvious question, though, is why NVIDIA needs to go all the way up to 885MHz to support the jump from 120Hz to 144Hz. It seems quite extreme, and the increased power draw is significant, causing the fans on the EVGA GTX 980 Ti to spin up even while sitting idle at the Windows desktop. NVIDIA is aware of the issue, though it appears a real fix won't arrive until an architectural change is made down the road. Given the chance to redesign the clock domains, NVIDIA could make the pixel clock and the GPU clock completely asynchronous, raising one without affecting the other. It's not a simple change though, especially in a processor this complex. We have seen Intel and AMD effectively separate clock domains on newer CPU designs in recent years.
What happens with a modern AMD GPU like the R9 Fury in a similar test? To find out, we connected the same GPU test bed to the ASUS MG279Q, a FreeSync-enabled monitor capable of 144Hz refresh rates, and swapped the GTX 980 Ti for an ASUS R9 Fury STRIX.
The AMD Fury does not demonstrate the same phenomenon as the GTX 980 Ti when running at high refresh rates. The Fiji GPU runs at the same static 300MHz clock rate at 60Hz, 120Hz and 144Hz, and system power draw only inches up by 2 watts or so. I wasn't able to test a 165Hz refresh rate on the AMD setup, so it is possible the AMD card would behave differently at that threshold. It's also true that the NVIDIA Maxwell GPU idles at less than half the clock rate of AMD's Fiji, and that may account for the difference in pixel clock behavior we are seeing. Still, the NVIDIA platform draws slightly more power at idle than the AMD platform, so advantage AMD here.
For today, know that if you choose to use a 144Hz or even a 165Hz refresh rate on your NVIDIA GeForce GPU you are going to be drawing a bit more power and will be less efficient than expected even just sitting in Windows. I would bet that most gamers willing to buy high end display hardware capable of those speeds won’t be overly concerned with 50-60 watts of additional power draw, but it’s an interesting data point for us to track going forward and to compare AMD and NVIDIA hardware in the future.
Is 165hz supported without G-Sync (with AMD) on the PG279Q? Maybe you could test that as well.
These results are amusing as GCN is the older architecture of the two. It seems to be aging rather well
As far as we can tell, you can ONLY enable 165Hz when connected to an NVIDIA Maxwell GPU.
this is not true! since the introduction of 120hz and higher this has been the case. try running 120hz+ and a 2nd 60hz monitor on Kepler.
I’ve been running like this since 2012 with 120hz monitors. the higher the Hz the more power is needed to push the refresh rate.
This isn’t news?
It might _seem_ to be ageing well. In truth, the entire line of 15.x drivers from AMD is so broken my 290X can’t reliably display to two 1920×1200 monitors while staying in its low-power (300/150 MHz) state.
Of course, AMD just stays in that state anyway, resulting in visual anomalies across the entire desktop. But hey, I’m sure the energy consumption is staying low.
:/
Been using the 290 since early 2014 and have literally no idea what you’re talking about. Drivers work great and have gotten a lot better lately.
That’s great … for you! I can’t use any driver beyond 14.12 because they all eventually show the same jumping lines in my 290X’s low power 300-315 MHz state.
Unless of course I get the card to clock up slightly (379 MHz is enough) by such simple things as scrolling a Chrome window or playing a Youtube video.
I’m sure there are other influencing factors beyond number of displays, total desktop size, software used and so on, but the fact that the 14.x branch didn’t show any problems, but the entire 15.x branch is broken and has so far not been fixed doesn’t instil confidence in AMD’s driver efforts.
.
PS: I wonder where you’ve found recent AMD drivers improving things, cause I haven’t found them to add anything meaningful since VSR.
Correct
Yet on the other hand, my GPU won’t even display to two monitors without flickering off and on with current drivers. There are three 27″ monitors on my desk, and I can use exactly one of them. Great work.
I’ve found that there can be a problem with the idle 2D clock speeds on the 290 series being capped at 100MHz or so. Try using GPU Tweak to change the 2D clock to 300MHz, that may solve your problems.
Maybe you should try the Nvidia drivers with your GTX card. Just a suggestion.
Let’s try and not confuse our respective use of graphics hardware.
I’m pretty sure it was YOU that was so concerned about Geforce drivers that you insisted on a formal inquiry by PCPer.
The only thing I expect from Nvidia is to finally offer support for DX12 on Fermi cards as promised.
Now I am pretty sure you think you remember every post I have written on PCPer when it is convenient.
Wait, you own a Fermi as well? Quite the Geforce collection you have going on there, no wonder you think Nvidia drivers are the solution for AMD’s broken 15.x branch!
Fermi WDDM 2.0 support apparently appeared in the 358.70 VR developer driver, so I’d expect it to show up in one of the next official releases soon.
Someone in 3DCenter ran the Futuremark overhead test on a GTX 470 (modded to Quadro) already.
You’re clearly an nvidia fanboy, so why you have an AMD card isn’t clear at this moment. I think you are imagining stuff, so give a link to your card with your name on it or gtfo fanboy. btw AMD have had superior drivers and hardware for the past year
I doubt I’m qualified for that questionable honor having only used AMD GPUs in my gaming rig for the past 3 1/2 years, but I’m sure you can explain your rationale.
Then again, I did buy both a GF4 Ti 4600 and a GTX 8800 when they came out, so I suppose it could be a latent infection only waiting to awake whenever an Nvidia card comes too close. 😀
If you’re getting visual anomalies from your GPU, one of the following is true:
1) You’ve got the card overclocked too much. Turn down your overclock to default speeds and check again for anomalies.
2) Your GPU is defective. You should RMA it.
3) You have an Adobe Flash youtube window open in your web browser while you’re also trying to run another GPU intensive operation. Your version of flash is probably out of date, or your graphics driver is out of date, or both.
Not true on any of the above.
And as I said, the system works fine with the 14.x branch of Catalyst and the anomalies disappear with minimal upclocking of the GPU core. (300 -> ~380 MHz, which can for example be activated by scrolling a webpage in Chrome)
Is there a setting in the monitor to keep it at 120hz on the desktop, and leave the refresh rate completely free to go up with g-sync?
Yeah, if you set the refresh rate to 120hz in the monitor so it’s always 120 on the desktop, you can go into Global Settings in the Nvidia Control Panel and set the refresh rate in games to the max available. It’s almost like a boost clock for your monitor, and you hit 144/165hz only when the GPU would be under load during gaming.
Perfect, thanks. I wasn’t sure as I’m still stuck at 60hz, but was planning to upgrade to the pg279q soon!
It looks like this is the default behavior actually. Just set to 120 at the desktop and games should run at max-panel-capable refresh. I'm going to run at 120 for a bit to see if it's slow enough to feel like an actual disadvantage on the desktop.
Duh? this is common knowledge since 120hz came to the scene. Try running 2 monitors, 1 60hz and 144hz Gsync. Look at the idle clock and the Gcore. Major differences going on there.
I don't think it has anything to do with G-Sync, honestly. But the dual monitor concern seems to make sense. Combining two 60Hz panels *should* be fine as it only requires the same pixel clock and bandwidth as a 120Hz single display. If you have a 75Hz and a 60Hz for example though, I would wager the GPU clock would increase.
I actually run two monitors, one of which is a 144Hz panel, the other is 60Hz. It’ll only clock up when I put the faster panel at 144Hz. Running 120Hz on my faster monitor and 60Hz on the secondary works without clocking up. Probably a bit higher power draw, but no clock up.
it doesn’t.
it has to do with higher refresh rate and multiple monitors. I’ve confirmed this with ManualG at NV.
This leads to the question of what triple 144Hz monitors would do for both nVidia and AMD (even though that is a rather theoretical discussion atm, 120Hz 4K wouldn’t be)
yep this is well known to those of us who constantly monitor everything and do not have to keep changing our systems to test new things. This is akin to running CF with amd and having base clocks jacked. I only overclock my qnix 1440p monitor to 90 Hz just for this reason. I could hit 120Hz but it is not worth it for me.
I run a gtx 970 now and am coming from a CF 7950.
Hmmm that’s strange…I have SLI 980 TI’s and an Acer XB270H Abprz + Windows 10, latest drivers…Desktop refresh at 144 and my GPU’s stay clocked at 135MHz and do occasionally go higher if I’m doing a lot on the system, but go right back down, and stay down at 135MHz at idle on the desktop…What do you guys have the power management mode set at? Adaptive or Maximum Performance Mode?
Same behavior either way.
Interesting. Are you sure you're looking at both GPUs? I ask because my 680s only clock up to 550 at >120Hz, and only do that on the GPU driving the display. It seems that there are fewer low-speed states on Maxwell.
Yep…no problems running 144Hz, SLI 980 TI’s, they stay at 135MHz both GPU’s…they do ramp up briefly if I’m doing a lot of stuff on the desktop (ie playing a video, opening a browser, etc) but they go back to 135MHz after a few seconds…if you would like more info let me know, I’m a system builder and build many of these kind of systems….
You have a 1080p monitor which can run just fine with idle clocks @ 144hz. Now try to do the same with a 1440p monitor. You have fewer pixels to push.
1080p = 144hz = idle clocks
1440p = 144hz = not so idle clocks.
And of course the increase in watts drawn increased the GPU “idle” temp proportionately. A Titan X on air will sit at 57 degrees C at 144 Hz on the desktop, and drop down to 36 or so at 60 Hz.
That's significant as well, no doubt.
It’s crazy how many things nvidia (970 3.5GB for one) gets away with vs AMD. I think review sites need to put their trust in nvidia to the side and treat them the same as AMD.
This is a big thing, sites not testing power consumption properly on higher-res systems. I remember people saying to test the higher-res G-Sync monitors with the module, and for the past year we were told it wouldn’t make any difference.
agreed, reviewers and users both hold AMD to higher expectations than they do nvidia…it’s a ridiculous double standard
I wonder what will happen when I use this with GTX 670.
It idles at 324 Mhz but the bandwidth etc is of course lower than a gtx 980 ti..
Ryan … SO what kind of chip layout is going on inside the new pg279q?
Is it a quad core, or how is the monitor jumping that far in power for 20 more hz? That’s very very inefficient
You are not paying attention, the SYSTEM is drawing more power, not the monitor.
“chip layout”, “quad core in a monitor” what are you smoking?
Would Apple consider AMD for not adding more heat and power draw to their design even if Nvidia has a pretty good offering at the moment?
Would it matter if you changed motherboards?
I am thinking it would not.
Major fail.
Ryan, can’t the Nvidia driver just drop the refresh rate to 120Hz when NOT running 3D or, if it is possible, when it detects that the system is idle or at the desktop? Or would this be bad for the monitor? I think they can come up with some trick in their drivers for this.
Any comments from Nvidia fanboys like chizow or Allyn?
PS RYAN THE BROWSER IS REPORTING PCPer AS AN ATTACK SITE AND BLOCKS IT.
For the past couple of days some ads linked by PCPer get detected as viruses.
It’s trying to install GeForce Experience.
Still same thing Virus detected BLOCKED!!!
We know about the issue and have fixed/patched. Google just takes a bit to catch up. 🙂
It looks like 'Preferred refresh rate' in NV control panel defaults to 'Highest available', so those concerned about idle / desktop power draw can simply drop their desktop refresh rate to 120 and launching a game will auto-switch to 144 / 165. Simple workaround, but it is certainly an issue worth noting for those more concerned about power draw / heat generation.
An additional note for SLI setups is that only the GPU driving the >120Hz refresh display clocks up. Also, older GPUs like the 680s I have here at the house only clock up to 550 at >120Hz. It seems that there are fewer low-speed states on Maxwell, probably in favor of more control at the upper end.
Regarding fanboyism – I'm a fan of (and spend my own money on) the tech I believe to be better. For me a bit of extra idle power draw is not that big of a deal, but if it was, I'd just drop to 120 at the desktop. Heck I may just drop it there anyway to see if it's slow enough to feel like an actual disadvantage.
Dropping to 120Hz is a major issue. We should do a new article overanalyzing and blame Nvidia for that, like back then with the 90Hz issue and that ASUS monitor. What do you say?
I am only kidding. This does look like a simple workaround. To be honest, I could live with that. Easily. I think Nvidia should have already put a 120Hz limit in their drivers, so people never face that issue, even those who don’t spend time online reading hardware sites. Not doing so, probably for marketing reasons, is not the correct thing to do.
And yes, for many of them power draw IS a concern, considering that power draw suddenly became as important as performance itself when Maxwell cards came out. But I guess power draw is a concern only when testing AMD hardware or comparing it with Nvidia.
As for your fanboyism, if 60W to 130W more with added heat and noise AT IDLE is “a bit of extra”, well, your fanboyism just skyrocketed itself into orbit around Pluto.
I've gone over a year not even noticing this on my home PC, and we hadn't noticed this at the office when testing high refresh rate displays, either. It's not like the fans spin up to max. The NV parts are still drawing less at max load, which would be just as much of a concern to power-saving folks. One could argue that active draw is *more* important, because those power saving folks are likely putting their systems into standby / off when not in use, making the higher idle / desktop draw when at 144 a moot issue.
It's not anywhere near 120W, by the way. Don't let the fanboy in you amp up those numbers 🙂
For someone who uses a PC ONLY for gaming, like a console, idle power draw probably doesn’t matter. That is 1% maybe? 5%? The rest of the world uses PCs also for movies, browsing, running programs or just doing some productivity stuff that doesn’t need from the GPU to clock high. And you can’t watch a movie while having your PC at standby/off, right? A “moot issue”. LOL?
Please try NOT to downplay the obvious, because it concerns Nvidia. There are plenty of fanboys here to try to make black look like white. It’s not necessary for a top author of PCPer to give credibility to a completely false perspective. It’s bad for the site.
As for your experiences and those 120W numbers, probably you don’t read your own site’s articles. That 130W is the peak wattage difference based ON YOUR NUMBERS, and we are talking about something that Ryan VERIFIED HIMSELF, not a rumor floating around. So, call me a fanboy as many times as you want. My avatar even helps in that direction. You only expose yourself, not me.
I don't use my home PC primarily for gaming. I use it primarily for writing. At the desktop, and yes as it has been discovered, drawing more power when at 144 Hz.
Ryan reported the highest number in honesty, and we were unsure if something else in the system might have been ramping up the GPU a bit further, likely assuming people wouldn't lock onto it and take it as gospel. Apparently we were mistaken (in your case).
I've got a few hours of testing on this issue now, and have only seen the system sit at ~76W at 120 and under, and ~138 at 144 and higher. I'm not sure what caused the higher figure in the prior testing, but given that it was intermittent, it would have had nothing to do with the power level required to drive at the higher refresh rate anyway. An intermittent case happening for 30 seconds every couple of minutes can't possibly have anything to do with refreshing a given display every 6ms. If the higher (200W) total power was needed to drive the display at 144+, then what magic did it use to pull it off for the 90 second in-between periods where the system sat at 138W, refreshing the display 13,000 – 15,000 times? Permission granted to apply common sense to your interpretation of what has been reported.
I am completely objective here even pointing out that probably a newer driver would fix the problem, or mentioning you in another post saying that there is an easy workaround.
But it seems that you have serious problems with my postings, probably because I am trying all this time to convince you to lock the fanboy in you in a dark room.
Your attitude is so ridiculous and blind that when, in an older article of yours, I informed you about a problem with some images not displaying, you stayed silent. After the article was fixed you posted that you couldn’t find any problems.
As for the +130Ws, I didn’t point specifically at that number. But it seems that you are having huge problems with Ryan’s honesty. Judging by your own post, if you were the author and had any indication that some people might notice that spike in power consumption and consider it important, you would probably have second thoughts about posting that information. Nice.
As for the spike specifically, 30 seconds in 2 minutes is not something to ignore. ARE YOU SERIOUS? Don’t write this stuff in public. Ask Ryan if you have a problem understanding me. 5W at idle can have a huge impact on the way someone judges a GPU, the final conclusion about a GPU. 70Ws is huge. 130W, especially for 1/4 of the time the system is on, is a serious problem. It’s not another “good design”. And no one points to the monitor or the GPU. Probably a bug that Nvidia will fix in a later driver, or maybe Windows is doing something that in a normal situation doesn’t have any significant impact on the system or the GPU frequency, but in this case acts like the drop that makes the glass overflow.
Anyway, have a nice day. No reason to keep posting here. I hope you find the problem and have a nice exclusive in the future.
There are other editors on this site that can correct issues when they see them. No I didn't go edit a post and then reply to a comment saying there was no issue. Again, common sense, please.
If you come into the comments goading an editor ("Any comments from Nvidia fanboys like chizow or Allyn?"), then prepare to be smacked down.
We are continuing to look into the power issue.
Have you contacted W1zzard from techpowerup to confirm your results?
He does extensive testing of power usage at idle and with multi-monitor setups.
lol@ Fanboy JohnGR.
70-130W? Welcome to AMD all the time. Did they ever fix full power state driving more than a single monitor? I bet they didn’t, it sure wasn’t fixed the last time I had an AMD card just in February with 290X. Scanning various google hits, still not fixed it seems. 🙂
But I guess you wouldn’t know about any of this as it’s becoming more and more obvious you are, in fact, just an AMD fanboy as you don’t even buy the products you fanboy over.
Go get them Chizow! 🙂
AMD Fanboys for the lose!
Oh god. Not you. Not here.
Still got your tongue up chizow’s arse, I see.
it’s the WCCFT re-union
What’s especially funny is that Ryan and especially Allyn aren’t afraid to mix it up with these AMD fanboys lol, it’s truly hilarious when they get told with both wit and science. 🙂
Typical Nvidia fanboy keyboard warrior wannabe.
Nvidia card uses 60W less than comparable AMD card under load = NVIDIA WIN! AMD STILL SUCKS!
Nvidia card uses 60W MORE than comparable AMD card at idle = COMPLETELY INSIGNIFICANT OH AND LOOK AT THIS OTHER THING I SAY AMD DOES BADLY!
Go back to WCCF, loser.
Hahah yeah another Anonymous AMD fanboy keyboard warrior.
It’s truly amazing, where was all this pent-up fake angst and anger for the YEARS AMD has had broken power tune in multi-monitor configs on their cards? No problem at all, but suddenly, an issue with Nvidia products none of these AMD fanboys could ever dream of using and owning is the biggest deal ever!
It’s OK though, I’m always happy when bugs like this are discovered, because I’m confident they will be fixed for my benefit by shedding light on them and finding the root of the problem. Unlike AMD fanboys, always sweep everything under the rug, deny deny deny, hope it goes away! And what do you end up with? Junk products, but what else do you expect when AMD’s most devout followers don’t expect more or won’t be critical of their products! 😀
“but suddenly, an issue with Nvidia products none of these AMD fanboys could ever dream of using and owning is the biggest deal ever!”
Wow, talk about hypocrisy..you run your mouth all day, every day about AMD products you never dream of owning, and the second AMD does anything that’s wrong in your opinion, you jump all over them…what’s wrong? you can’t take what you give out all the time like a little baby? Talk about a walking double standard…btw, where do you get all the free time to spread your cancerous comments to all these tech sites? shouldn’t you be working an extra shift at Domino’s to save up for pascal?
Well, speaking of junk products, chizow, the last nVidia card I bought, a GTX670, which cost me $420 Canadian and only had 2GB of ram, couldn’t even drive my 3 monitor setup without going through a 19 step procedure, which involved manually adjusting the vertical refresh rate of monitors 2 and 3, and plugging in the 2nd and 3rd monitors in a specific sequence. And every time there was an nVidia driver update, I had to go through the same goddam 19 step procedure to get all three LG monitors to come up with a picture. I got so annoyed with the mickey mouse nVidia drivers that I sold the card for $380 to some nVidiot and bought a Radeon 7950 for $319 (with 3GB of ram), flashed the bios with a 7970 bios to push it up to 1050MHz, which it did flawlessly, and got all 3 monitors to come up perfectly in eyefinity within 5 minutes of installing the supposedly ‘inferior’ AMD driver. Never had a problem with that 3 monitor setup since, which is now my girlfriend’s setup.
So much for paying more to get ‘more’. More like you pay more to get LESS with nVidia.
see that? AMD had a bad rep for the dual monitor thing. People won’t forget it. But nvidia causing users to use a ton more power gets a pass, even though PCs spend more time idling, or running games that don’t need that much power, than playing full blast.
It’s just the fact nvidia does get away with much more and is still preferred even when their hardware is weaker. Gets annoying to see.
“One could argue that active draw is *more* important, because those power saving folks are likely putting their systems into standby / off when not in use, making the higher idle / desktop draw when at 144 a moot issue.”
Nonsense. The idle power draw is more important, because you’re only gaming for a fraction of the time in a day on average that you’re browsing the web, using word, etc. Your computer will be in low power/idle mode FAR more often than it will be under load, unless, of course, you’re running F@H or something like that.
A bit of extra power? But it’s not a bit of extra power as you call it, is it? I mean weren’t you the one stating power efficiency (repeat again) with every video/review? And I’m pretty sure you purchased your nvidia card (970 I think) based on power efficiency, right… so what happened now eh ROFLMAO
At idle (over 60% of the time) we knew AMD are/were superior over Nvidia power wise, but most just ignore the facts, including you guys, and still state nvidia is better because it draws 40W less on average in 60hz gaming.
“Regarding fanboyism – I’m a fan of (and spend my own money on) the tech I believe to be better”
Really, how’s your Asynchronous Compute going for you ROFLMAO
How’s asynchronous compute working for you? I’ve heard that it can cause a 390x to draw an extra 100 watts of power to get at most 10% more frames. Not worth it. That could be the next big article: compare power consumption of AMD and Nvidia cards with and without asynchronous compute on. I’m just dreaming, no site would want to blacken AMD’s eye.
OMG, Chizow is an annoying pest at WCCFTech…I guess he’s like a disease that has spread to many tech sites
Yes, chizow is a paid nVidia damage control freak. No two ways about it. He’s an elitist, foaming-at-the mouth zealot.
Agree with JohnGR, a simple driver fix *should* be enough
Looks like none is necessary. The driver default is to run at max in games, so just dropping the desktop to 120Hz is an effective workaround.
Nice, I will have to check this out, I did notice my main card is typically at ~800MHz and the 2nd card will drop to 135ish but never thought too much of it because my total system only pulled 200-210W from the wall during desktop/idle usage. But it’s also a heavily loaded system with X99/6-core at 4.5GHz, 2xTitanX, bunch of SSDs, case fans, 3 pumps etc.
Good look as usual though thanks!
A driver fix is unlikely. Nvidia cannot take it upon themselves to drop people’s refresh rates to 120 when they paid a premium for a 144hz monitor. Eating the bug knowing full well they have more leeway than the competition to mess up is a better decision than users finding their refresh rates gimped.
I know a few people who have reported this issue for a year or so. But it hasn’t been fixed; this article may be the push it needs, but if not, a fix shouldn’t be expected
Isn’t 4k@60hz equivalent to 1080p@240hz in terms of total pixel count per second?
Does this mean 4k@60hz will cause the nvidia gpu to spike to 800mhz?
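Quick back-of-the-envelope check, counting only active pixels and ignoring blanking intervals (and assuming 3840×2160 for 4k):

# Raw active pixel throughput per second, blanking ignored.
print(3840 * 2160 * 60)    # 497,664,000 pixels/s at 4k @ 60hz
print(1920 * 1080 * 240)   # 497,664,000 pixels/s at 1080p @ 240hz

So the raw pixel rates do work out the same; whether the driver treats the two cases the same is the real question.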
Was wondering the same thing actually. Do you guys have a 4k monitor laying around to test, or maybe even 2 or 3? The fix of changing the “Preferred refresh rate” would not work in this situation, so it should still be an issue that a driver update would not be able to fix, if I understand everything correctly.
Would also be nice to see what happens with AMD@4k on Fiji and older GCN cards. Maybe this is one of the reasons why Apple went full AMD. Apple uses a lot of high resolution displays on their latest products.
“Maybe this is one of the reasons why Apple went full AMD. Apple uses a lot of high resolution displays on their latest products.”
Bingo.
I’ve got an MST 4k monitor and a 1920×1200 display connected to a 980Ti (it’s an MSI Gaming model with the fans not spinning at idle). GPU-Z reports 135MHz on the core and 202MHz on the memory clock, no matter which of the displays is turned on (or even if both are). Both displays run at 60Hz. The GPU temperature at idle is around 35 degrees Celsius with the 1200p display turned on only, but goes above 50 with the 4k display turned on (and above 53 with both displays connected). Manually forcing the fans to spin (at their minimum RPM) brings the temperature down below 40 degrees.
p.s.
As for power consumption: GPU-Z states it’s at 11.5% TDP with both displays connected.
I don’t see any problem.
Given the relative time a GPU spends playing compared to idling, it’s much more important to constantly remind ppl that AMD cards can use some more watts when running loaded.
Excluding those who build PCs ONLY for gaming – power ON, gaming, power OFF – IN EVERY OTHER CASE, PCs spend much more time idling than running stuff that forces the GPUs to run full speed. It’s a huge problem, but with an easy workaround, as was already said in another post by Allyn.
MUCH more time is spent idle and under low usage compared to max usage. It’s actually a massive problem for power efficiency. A crap ton of power usage is on idle electronics.
Glad to be of service. This has been the case since I got the first generation 144 Hz ASUS panel back in 2013. It has nothing to do with G-Sync. 120 Hz is the cutoff, I think.
Ryan, can you try OCing the panel manually to something like 125 Hz to see if you can find the barrier where the clocks spike?
Edit: Okay, wow, this is amazing.
For reference, I’m using an ASUS VG278HE and a pair of 780 Ti’s with 358.50 Drivers. Idle clocks for these cards are 324 MHz for both the GPU and the memory.
I just manually OC’d the panel to 145 Hz and the clocks/power consumption have stood the same as if it was set to 120 Hz. The panel is stable at 145 Hz while viewing youtube or even high bitrate (100k+) videos.
I think the default 144 Hz preset requests a certain frequency profile for stability, on Nvidia’s side at least.
Can you try on your end and see if it’s the same story with that 980 Ti?
Very interesting but are you absolutely sure you are actually at 145?
Absolutely. It’s at the panel’s limit. 146 works half the time and 147 or higher won’t work at all.
I’m eager to see what you guys can get up to with that 165 Hz panel.
Wow again, just discovered something else.
If I set the panel to 120 Hz and push it to 145 Hz, clocks and TDP won’t change from the default 324 MHz, and past 145 Hz it won’t be stable.
If I set the panel to 144 Hz, I can then push it to 150 Hz, with clocks and TDP staying at elevated levels (824 MHz).
That proves the theory then. When set to the 144 Hz preset, it requests more resources from the GPU, allowing further overclocking headroom at the expense of power consumption.
As with any overclocking, your mileage will vary. I can run my panel at the advertised refresh rate but at much lower GPU frequencies, saving power with improved thermals.
I’ll be running at the “manual” 145 Hz setting from now on.
I've tried the same thing here on both versions of the Swift. The original Swift overclocks to 145 / 146 here, but power draw remains the same. Also using a 980Ti.
*edit* wait, you're on a 1080P panel. That's not a bandwidth thing in your case. It actually shouldn't even be in the higher power state at 144 Hz. 1080P panels we also just tested at 144 Hz don't cause the higher consumption. We also tested 4K60 – same lower consumption.
1440p panels have 77% more pixels than a 1080p panel, true, but I don’t think it’s the defining factor.
It isn’t just me. There are quite a few people I’ve seen on Overclock.net that are in the same situation with 1080p screens complaining about higher power consumption / heat when they go over 120 Hz.
You don’t have a 780 or 780 Ti on hand to test and see if it’s a Kepler thing?
I’ve run 4K60 on my end as well, same low idle clocks as a 120 Hz 1080p.
Again, my panel is first generation 144 Hz. It might not have been optimized compared to newer screens.
PS: All I know is manually OC’ing my panel to 145 Hz keeps it at nominal (120hz) clocks, which I’m more than happy with.
Damn Captcha keeps popping up when I’m trying to edit that post. I didn’t link anything, so why is it going haywire?
None of the 1080P panels we tested cause a 980Ti to ramp up its clock. We're pretty sure it is related to raw pixel rate, but still testing.
have you checked what happens with weaker GPUs on lower resolutions?
Well I’ll be darned! So that’s why my GPU stays at 850 MHz & 43-44C on the desktop. I’ve been wondering about this for the last 5 months since I got my G-Sync setup. Great info pcper/cyclops.
GTX 970
rog swift
This can also happen due to having a high enough total desktop resolution, or adjusting refresh rate timing values (often required when using ‘Korean’ overclockable displays).
For me, running two displays at 1440p@120 and 1200p@60 gives idle clocks, but enabling an additional 1440p display at 60hz pushes clocks back up. Even dropping the primary display down to 60hz isn’t enough to have the clocks idle again.
As Allyn has mentioned, setting ‘Preferred refresh rate’ in the NVCP 3D settings Global tab to ‘Highest available’ is a simple way to default to a higher refresh in games compared to desktop.
I’m not having this issue at all. I’m on the ACER Predator XB270HU, always at 144hz, with a reference GTX 980 Ti, the latest driver, and the latest MSI Afterburner, overclocked to 1550MHz core and 7500MHz memory.
I left it on the desktop for almost the entire night and power stayed at the same level. Checked with GPU-Z
Maybe I tuned mine very well in MSI Afterburner….
Somebody should mention that you can run at 120 Hz on the desktop and activate via NVCP the option “Preferred refresh rate -> Highest available”.
So there is nothing to worry about too much power consumption when surfing through the web 😉
Edit: Okay Allyn Malventano already said that.
Now that this has been tested and brought to light I hope Nvidia fixes this soon because that’s pretty unacceptable.