Yesterday, several news stories posted on TechPowerUp and other sites claimed that ASUS and MSI were sending out review samples of GTX 1080 and GTX 1070 graphics cards with higher clock speeds than retail parts. The insinuation, of course, is that the companies were cheating, overclocking the cards sent to media for reviews in order to artificially inflate performance.
Image source: TechPowerUp
MSI and ASUS have been sending us review samples for their graphics cards with higher clock speeds out of the box, than what consumers get out of the box. The cards TechPowerUp has been receiving run at a higher software-defined clock speed profile than what consumers get out of the box. Consumers have access to the higher clock speed profile, too, but only if they install a custom app by the companies, and enable that profile. This, we feel, is not 100% representative of retail cards, and is questionable tactics by the two companies. This BIOS tweaking could also open the door to more elaborate changes like a quieter fan profile or different power management.
There was, and should be, legitimate concern about these types of moves. Vendor one-upmanship could lead to an arms race of stupidity, similar to what we saw with motherboards and base frequencies years ago, where CPUs would run at a 101.5 MHz base clock rather than 100 MHz. Multiplied up, that 1.5 MHz bump added 40-50 MHz to the total clock speed of a ~3 GHz CPU, giving that board a slight performance advantage. However, the differences we are talking about with the GTX 1080 scandal are very small.
- Retail VBIOS base clock: 1683 MHz
- Media VBIOS base clock: 1709 MHz
- Delta: 1.5%
And in reality, that 1.5% clock speed difference (along with the 1% memory clock rate difference) MIGHT result in a ~1% real-world performance change. Those higher clock speeds are easily accessible to consumers by enabling the "OC Mode" in the ASUS GPU Tweak II software shipped with the graphics card, and the review sample cards can also be adjusted down to the shipping clock speeds through the same software.
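For the curious, the math is easy to check yourself. Here's a quick back-of-the-envelope sketch in Python using the clocks listed above (the 30x CPU multiplier in the second half is an assumed round figure for a ~3 GHz processor, purely for illustration):

```python
# Sanity-check the VBIOS clock delta quoted above.
retail_base = 1683  # MHz, retail VBIOS base clock
media_base = 1709   # MHz, media VBIOS base clock
print(f"Core clock delta: {(media_base - retail_base) / retail_base:.1%}")  # -> 1.5%

# The old motherboard base-clock trick, multiplied up on a ~3 GHz CPU
# (assumed 30x multiplier, for illustration only):
bclk_stock, bclk_tweaked, multiplier = 100.0, 101.5, 30
print(f"Total CPU clock gain: {(bclk_tweaked - bclk_stock) * multiplier:.0f} MHz")  # -> 45 MHz
```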
ASUS sent along its official statement on the issue.
ASUS ROG Strix GeForce GTX 1080 and GTX 1070 graphics cards come with exclusive GPU Tweak II software, which provides silent, gaming, and OC modes allowing users to select a performance profile that suits their requirements. Users can apply these modes easily from within GPU Tweak II.

The press samples for the ASUS ROG Strix GeForce GTX 1080 OC and ASUS ROG Strix GeForce GTX 1070 OC cards are set to “OC Mode” by default. To save media time and effort, OC mode is enabled by default as we are well aware our graphics cards will be reviewed primarily on maximum performance. And when in OC mode, we can showcase both the maximum performance and the effectiveness of our cooling solution.

Retail products are in “Gaming Mode” by default, which allows gamers to experience the optimal balance between performance and silent operation. We encourage end-users to try GPU Tweak II and adjust between the available modes, to find the best mode according to personal needs or preferences.

For both the press samples and retail cards, all these modes can be selected through the GPU Tweak II software. There are no differences between the samples we sent out to media and the retail channels in terms of hardware and performance.

Sincerely,
ASUSTeK COMPUTER INC.
While I don't believe that ASUS' intentions were entirely to save me time in my review, and I think that the majority of gamers paying $600+ for a graphics card would be willing to enable OC Mode through software, it was clearly a bad move on ASUS' part. Having any process in place that creates a deviation from retail cards on press hardware is questionable, beyond checking for functionality to avoid shipping DOA hardware to someone on a deadline.
As of today I have been sent updated VBIOSes for the GTX 1080 and GTX 1070 that put them into the exact same mode as the retail cards consumers can purchase.
We are still waiting for a direct response from MSI on the issue as well.
Hopefully this debacle will keep other vendors from attempting anything like this in the future. We don't need any kind of "Quake/Quack" scandal in our lives today.
What worries me is that manufacturers may start to charge for these apps or to unlock features on a product you already bought. Is the industry moving to something like this???
I don’t think companies will do that. They will give you the free OC utilities that may invalidate your warranty for said video card. They want you to invalidate your warranty, as overclocking is often done at your own risk. Nvidia cards have more built in limiters and protections in their cards versus AMD, so killing your card is less likely to occur, and in fact they have a lower failure rate than AMD cards.
“Nvidia cards have more built in limiters and protections in their cards versus AMD”
“in fact they have a lower failure rate than AMD cards”
Cite your sources, please.
Done.
http://www.pcworld.com/article/2052184/whats-behind-origin-pcs-decision-to-so-publicly-dump-amd-video-cards-.html
http://www.gamespot.com/forums/pc-mac-linux-society-1000004/why-do-amd-cards-have-such-a-high-failure-rate-29400435/
https://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/
https://hardforum.com/threads/amds-failure-rates-part-two-2015-edition.1899524/
There were more, but I think these will do. Hey, you shouldn’t have asked. I wouldn’t have. I don’t just pull stuff out of my arse. I know you trolls seek to run out another Nvidia fanboy. I wonder why no other Nvidia fanboy has come to help me. Because they are like an energy-efficient AMD card: they don’t exist here, at least not yet. LOL
Congratulations! You finally managed to demonstrate that one of your claims has merit! (Except for your second link, which is entirely made up of anecdotal evidence, some of which supports your premise and some of which opposes it.)
Now if only you could do that for all of your other claims.
Can you post your links to prove I’m wrong? Yes, some stuff is speculative, but then again most people’s stuff is opinion. I dare you to find me wrong with objective proof, not fanboy opinion.
I don’t believe you saw me claim that you were wrong.
But since you’re using anecdotal evidence to support your argument even though you didn’t need to, I’ll give you a little bit more. Since my first discrete graphics card, a GeForce 2 Ti, I have owned and used a lot of graphics cards. I have never had an ATi or AMD graphics card fail. I have had several Nvidia graphics cards fail.
Clearly, my experience has not fallen within the experiences of Puget or whatever that French source was. Perhaps my case has been exceptional. But you’re taking anecdotal evidence to support your claim (and ignoring anecdotal evidence that doesn’t) so… there’s a little more anecdotal evidence for you to ignore.
As far as the protections and limiters, just look at any review of the 1080/1070 and how complicated they are for overclockers. But I think you really wanted the failure rate info.
Actually, to be perfectly honest, this was the one I was more interested in.
I’m now even more interested in how “reviewers are having trouble overclocking Pascal” somehow translates into “Nvidia cards have more built in limiters and protections in their cards versus AMD.”
Since you clearly know so much about it, why don’t you tell us about some of these “limiters and protections” that Nvidia has? Why don’t you show us where you learned this information? I don’t want to know what review you read that made you assume that Nvidia must have more “limiters and protections”; I want to know what these specific limiters and protections are.
You’re claiming that Nvidia cards have more “limiters and protections” than AMD, so clearly either you know all of the limiters and protections in both companies’ cards and how they’re different, or you read an article somewhere that told you so.
If you’re right and I’m wrong, then I’d sure like to see the correct information so that I won’t be wrong again.
But regardless of what you believe, stating something as a fact and hoping that people will just believe you does not make that statement a fact.
For one, you can’t raise the power limit on Nvidia cards more than 15%; on AMD you can raise it a whole 50%. You can modify your Nvidia card by shorting out resistors on the card; some have 3 or 4. They want to make sure it’s nearly impossible to fry/burn up your card like happened with... what series of video cards was it? Help me, fanboys. Anyway, you can find the technical details yourself. Nvidia cards throttle at a lower temperature than AMD cards; the 290X throttled at 95C, ours are usually in the low 80s. Again, this is common knowledge so it shouldn’t need a cite. Have you never read a video card review in your entire life? One would seem not, as you’re asking for sources for everything. Nvidia is called the green team. Must cite source. I’m exaggerating, but if you can’t prove me wrong by doing your own research, then maybe it’s the truth. So instead of implying I’m wrong, prove it.
Yes, you’re right, you can raise the power limit on AMD cards a lot more than on Nvidia cards. But guess what? Both brands thermal-throttle if you overdo it. You can turn the power limit on an AMD card up to +50%, but it won’t get to that point if it thermal-throttles before it gets there. So they both have that preventive measure.
You can turn up the thermal-throttle temperature threshold on Nvidia cards, can you not? Pretty sure I’ve seen it.
So they both thermal throttle if they get too hot. Yes, this is common knowledge. What you didn’t bother to mention is what other sorts of protections Nvidia has that AMD doesn’t have.
Why am I asking for sources for everything? Because you’re making statements that either I know to be bald-faced lies, or that I don’t think you actually know to be true. YOU are the one making the claim. Either you learned it from somewhere, which means you should be able to show where you learned it, or you’re making it up, stating it as a fact, and hoping people believe you. Either way, YOU are the one making the claim.
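Incidentally, the power limit range is something anyone can read out rather than assert. Here’s a minimal sketch using the pynvml bindings (assuming an Nvidia card and the nvidia-ml-py package installed); NVML reports the default power limit and the min/max the vendor allows, so the claimed “+15%” headroom would show up as the ratio of max to default:

```python
# Read the vendor-set power limit range on an Nvidia GPU via NVML.
# Requires the nvidia-ml-py package (pynvml) and an Nvidia card.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

print(f"Default power limit: {default_mw / 1000:.0f} W")
print(f"Allowed range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")
print(f"Headroom over default: {max_mw / default_mw - 1:+.0%}")  # e.g. ~+15% on many cards

pynvml.nvmlShutdown()
```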
OK I’m feeling generous today.
http://nvidia.custhelp.com/app/answers/detail/a_id/2857
http://www.guru3d.com/articles-pages/msi-geforce-gtx-1080-gaming-x-8g-review,39.html
I dare you to say Hilbert Hagedoorn is wrong. He’s a professional and you’re what exactly?
You really suck at this.
First link – outlines one new power monitoring feature on an Nvidia card 6 years ago. Makes no mention of AMD power monitoring features at all (not surprising as it’s an Nvidia customer help question) and certainly doesn’t prove that Nvidia “has more limiters and protections” than AMD.
Second link – only makes one mention of what you’re talking about. “Due to the many limiters and hardware protections Nvidia has built in”. That’s it. Also makes no mention of AMD power monitoring features at all and certainly doesn’t prove that Nvidia “has more limiters and protections” than AMD.
One would assume there is based on the higher failure rates of AMD cards compared to Nvidia. It is probably related to higher temps, possibly fewer protections in place. AMD is the only one who had water cooling standard with video cards and processors. Why do they need it? Because of heat. Heat kills electronics. If you’re looking for a hard number, I can’t give it to you. Use deductive logic for a change. It isn’t a big assumption to make.
Oh, but farther down the thread, discussing two RX-480s in CF, deductive logic was completely unacceptable to you. Now you demand it? Hypocrite much?
“AMD is the only one who had water cooling standard with video cards and processors. Why do they need it? Because of heat.”
This statement is based on assuming that “they need it because of heat,” an assumption that is proved false by the number of people who have unlocked their vanilla non-X Fury cards (specifically Asus and Sapphire Tri-X) to full-on Fury X cards, with their air coolers intact.
(because I apparently botched the link tags again…)
http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool
“One would assume there is based on the higher failure rates of AMD cards compared to Nvidia. It is probably related to higher temps, possibly fewer protections in place.”
Oh, so now you’re assuming that the higher failure rates of AMD cards are probably related to higher temperatures and possibly fewer protections in place.
In other words, you have no idea. You’re making it up as you go, and stating it as if it’s well-known fact in the hopes that no one will challenge you.
Hardly surprising, that’s been your modus operandi everywhere in this comment section.
If you think your miracle card RX 480 can beat the 1080 with two cards, it will be within spitting distance of the newborn $1500 Radeon Pro Duo for less than $500. What a crappy value it would make this card. Instant obsolescence. Even I don’t think AMD would be that stupid. Yes, you’ll sell way more bargain cards, but you don’t know AMD’s cost per unit. There’s more profit per unit in upper-tier cards.
The Radeon Pro Duo was obsolete the second it hit the market. That card was supposed to release 6+ months before we finally saw it. I think it was potentially just one of those instances where they had already gone through the process and spent the money to make it, so their two options were to either write it off completely or try to sell a couple of them to make a few bucks back.
You’re right, Ted. I agree with you here.
Right now you can buy two R9 Nano cards for less than $1000 and be within spitting distance of the $1500 Radeon Pro Duo.
Some rumors have the RX-480 on par with the R9 Nano.
If there is truth behind this rumor, then two RX-480s for less than $500 ought to be within spitting distance of the $1500 Radeon Pro Duo.
And if two RX-480s for $500 are within spitting distance of the $1500 Radeon Pro Duo then they ought to be ahead of the GTX 1080.
The only argument you have against such an eventuality is, “But that’s bad for the Radeon Pro Duo!” Really?
Amending:
Some rumors have the RX-480 on par with the R9 Nano.
Not necessarily. The 1080 has compute comparable to a Fury X and a different architecture. The RX 480 is rumoured to be in the 5 TFLOP range. It can be like a Nano and still not come close to a 1080, especially in Nvidia-optimized games. The 1080/1070 do way better in Ashes because their compute is comparable instead of being way weaker than AMD cards. Volta will probably have so much compute AMD will choke on it. They will have themselves to blame for overly emphasizing it in their games and DirectX 12.
Aww, look at you, moving the goalposts again to try to make yourself feel like you’re right, answering a citation-filled and logical comment with assumption and conjecture.
Two Nanos in CF score extremely close to the Pro Duo. I cited that source.
The RX-480 is rumored to be in the same performance range as the Nano. I cited two sources.
So if the RX-480 is comparable to the Nano, and two Nanos are just barely behind Pro Duo, why wouldn’t two RX-480s be comparable to two Nanos, and thus just barely behind the Pro Duo?
None of my sources referenced compute. I did, however, give you a source (PCPer) that showed that two Nanos are just slightly behind the Pro Duo. I also showed you FireStrike Ultra scores that showed the Pro Duo beating the 1080 by 2,054 points on the Graphics score.
(Bonus: here’s an example of two Nanos in CF beating the 1080 by more than 2300 points and beating the Pro Duo by 377 points, and they did it on an FX-9590 instead of the i7-5960X that the 1080 and Pro Duo I referenced were run on.)
I’ve given you actual hard numbers. The only assumption was, “IF the RX-480 is comparable to the Nano”.
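And for what it’s worth, the point spreads I just cited roughly cross-check each other. A quick sketch, using only the numbers above (two different runs on different platforms, so only rough agreement should be expected):

```python
# Cross-check the FireStrike Ultra graphics-score spreads cited above.
nanos_over_1080 = 2300  # dual Nanos vs. GTX 1080, points ("more than 2300")
nanos_over_duo = 377    # dual Nanos vs. Pro Duo, points
duo_over_1080 = 2054    # Pro Duo vs. GTX 1080, points (separate run)

implied = nanos_over_1080 - nanos_over_duo
print(f"Implied Pro Duo lead over the 1080: {implied} points")  # -> 1923
print(f"Directly cited lead: {duo_over_1080} points")           # -> 2054
# Different platforms and clocks, but the figures land in the same ballpark.
```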
Try addressing what was actually said.
That’s a big assumption to make. And what exactly are you trying to prove? FireStrike is well optimized for multi-card setups. So far, support for DirectX 12 is lacking. Not saying they won’t get it going, but don’t hold your breath. A lot of DX12 requires the programmers to do more work for video card support. Most couldn’t manage to get a decent game out the door for DX11, and there is more work for them in DX12. That being said, I’m still using Win 8.1; no need for 10 for me. A Nano can almost become a Fury X; you just have to raise its power limit 50%. Everything is rumor about the RX 480. I could post an outlandish rumor regarding its performance and fanboys would eat it up. You think AMD is going to take down rumors that make people want to buy it by exaggerating its performance? Get real. The only thing we know for sure is it beat a 1080 in a heavily biased AMD bench/game, AOTS, with two RX 480s. If you don’t believe this is a biased test, then I can’t reason with you. I won’t try anymore. Everything else about the RX 480 is FUD until proven.
Still moving the goalposts, I see. Don’t you remember up above when you tried to use FireStrike scores to prove that AMD’s review cards are overclocked more than their consumer-release cards? (And used two completely different platforms, making the comparison invalid.)
Now FireStrike isn’t acceptable to you?
“That’s a big assumption to make.”
YOU started this subthread with an assumption. “If you think your miracle card RX 480 can beat the 1080 with two cards, it will be within spitting distance of the newborn $1500 Radeon Pro Duo for less than $500.”
As you pointed out, everything about the RX-480 is rumor right now, so nothing can be accepted as 100% fact. But IF the rumor is correct that the RX-480 is comparable in performance to the R9-Nano, then yes, it would follow that two RX-480s would be comparable to two R9-Nanos. And as actual reviews have shown, two R9-Nanos are in the same performance range as the Pro Duo. And you were shown FireStrike scores of the Pro Duo beating the 1080.
With the exception of the premise, “If the RX-480 is comparable to the Nano,” which is A, the topic of discussion, and B, your premise, everything I said was backed with evidence.
Your response is, “But but but DX12! And that’s just a rumor – never mind that this whole topic came from a hypothetical based on that rumor! Where can I move the goalposts now?”
No one said the 1080 was 100% better than any Fury card. Why do you AMD fanboys always have the need to say your dual card is the best video card? It will be dead last in performance per watt which is a better metric to measure by. Also going to be lower than 1080 in value for performance. Not sure but seems you’re jealous that the 1080 is the fastest single GPU card in the world at this time. Deal with it. If you love the Pro Duo so much, buy one at $1500; you can buy two Founders Editions for that price and smoke it. AND kill it in performance and efficiency as well.
Listening to you scramble to find a coherent argument is hilarious. Truly.
“No one said the 1080 was 100% better than any Fury card.”
You’re right. No one said that. What does that have to do with anything?
“Why do you AMD fanboys always have the need to say your dual card is the best video card?”
Well for starters, you’re accusing the guy with two 980s in SLI of being an AMD fanboy, and that’s hilarious. Furthermore, I never said that either. I said it beats the 1080. Which it does.
“It will be dead last in performance per watt which is a better metric to measure by. Also going to be lower than 1080 in value for performance.”
Yeah. So? Like with so many of your comments, what does that have to do with the discussion? You seem to be forgetting how this discussion started. YOU wrote: “If you think your miracle card RX 480 can beat the 1080 with two cards, it will be within spitting distance of the newborn $1500 Radeon Pro Duo for less than $500.”
Thus, the discussion is about two RX-480s and whether they are (or can be) as fast as the 1080. The Pro Duo only enters the conversation because it’s the only thing you can think of to suggest that AMD would never make the RX-480 that powerful.
“Not sure but seems you’re jealous that the 1080 is the fastest single GPU card in the world at this time.”
Nope. My system is already overpowered for what I do. Why would I be jealous of the 1080?
“If you love the Pro Duo so much, buy one at $1500; you can buy two Founders Editions for that price and smoke it. AND kill it in performance and efficiency as well.”
Not at all surprising that the best argument you can think of is a straw man fallacy.
How is it a fallacy? Two 1080s would kill it, being that 300-some points is all the Pro Duo beat a single card by. Logic tells me this. On the 29th we’ll find out who is right. I’m saying two RX 480s will not beat a single 1080 when averaged over a wide selection of games and benchmarks. TechPowerUp does a thorough job with reviews. It’s hilarious because how would I know that you have two 980 Tis? Thinking of trading them in for two RX 480s? Do you own stock in AMD? Then WTF would you care?
“How is it a fallacy”
“If you love the Pro Duo so much” is a straw man fallacy.
“being that 300-some points is all the Pro Duo beat a single card by.”
You mean, of course, more than 2000 points, right?
“It’s hilarious because how would I know that you have two 980 Tis?”
I don’t, I have two 980s. And you wouldn’t know that. That’s exactly my point. You make up your mind about who’s an AMD fanboy and who isn’t when, in reality, you know nothing.
“Thinking of trading them in for two RX 480s? Do you own stock in AMD? Then WTF would you care?”
No. And no. But I’m sick of sad, obnoxious little fanboy children like you coming across as representative of actual Nvidia users. Believe it or not, most of us hate you fanboy trolls because you make us look bad. You make Nvidia fans look like we’re all lying, whiny little babies who base our entire sense of self-worth on what brand of graphics card we buy.
By the way,
“I’m saying two RX 480s will not beat a single 1080 when averaged over a wide selection of games and benchmarks.”
And I’m saying that even if two RX-480s don’t beat a single 1080 when averaged over a wide selection of games and benchmarks, right now all of the leaks out there suggest that it will be close enough to be competitive. For significantly less money.
I’ll actually do you one better. I’ll make you a wager: if your RX 480 can’t actually do what you’re implying it does, leave this site and never post here again. If it does, I’ll gladly admit I’m wrong and leave and never post here again. Put up or shut up.
The only caveat is it can’t just be one benchmark. An equal mixture of Nvidia and AMD games, along with synthetic benchmarks, which are less biased.
I might be willing to take that wager if I actually believed you’d hold up your side of the bet. But I don’t. You don’t have a track record of being able to admit when you’re wrong. And you’ve been wrong a LOT.
Don’t worry, I hold up my bargains. What’s the matter, chicken? In fact, you won’t see me post again on this site until after a review of the RX 480 is on this site. Have fun antagonizing someone else with your BS about goalposts. The first assumption wasn’t based on fact and thus cannot be supported by anything until more facts are known about the RX 480. Everything else is conjecture.
“In fact, you won’t see me post again on this site until after a review of the RX 480 is on this site.”
Good.
Yeah, and both the 9590 and the Nanos were OC’d to hell. He probably OC’d the hell out of the system RAM and bus speed and anything else he could. So it doesn’t prove anything. The fact that the pair of Nanos beat the Pro Duo proves that. Someone’s max overclock is something 99% won’t achieve.
Are you suggesting that the OC on the Nanos gave it a boost to its graphics score of more than 2700 points?
Or that a 300MHz overclock on an FX-9590 puts it so far ahead of an i7-5960X that it will artificially inflate the dual Nano’s graphics score?
Are you suggesting that the 1080 being compared against, running 2177MHz GPU and 1451MHz memory, wasn’t overclocked as well?
Okay, , significantly less overclocked, on a Skylake 6700K.
Damn, screwed up the link tag. Let’s try that again:
Okay, here’s two R9 Nanos, significantly less overclocked, on a Skylake 6700K.
Come to think of it, the 9590 wasn’t overclocked. The 9590 is a 5 GHz chip.