Yesterday, several news stories were posted on TechPowerUp and other sites claiming that ASUS and MSI were sending out review samples of GTX 1080 and GTX 1070 graphics cards with higher clock speeds than retail parts. The insinuation, of course, is that ASUS was cheating, overclocking the cards going to media for reviews in order to artificially inflate performance.
Image source: Techpowerup
MSI and ASUS have been sending us review samples for their graphics cards with higher clock speeds out of the box, than what consumers get out of the box. The cards TechPowerUp has been receiving run at a higher software-defined clock speed profile than what consumers get out of the box. Consumers have access to the higher clock speed profile, too, but only if they install a custom app by the companies, and enable that profile. This, we feel, is not 100% representative of retail cards, and is questionable tactics by the two companies. This BIOS tweaking could also open the door to more elaborate changes like a quieter fan profile or different power management.
There was, and should be, legitimate concern about these types of moves. Vendor one-upmanship could lead to an arms race of stupidity, similar to what we saw with motherboards and base frequencies years ago, when boards would run CPUs at a 101.5 MHz base clock rather than 100 MHz (resulting in a 40-50 MHz total clock speed change) to give that board a slight performance advantage. However, the differences we are talking about with the GTX 1080 scandal are very small.
- Retail VBIOS base clock: 1683 MHz
- Media VBIOS base clock: 1709 MHz
- Delta: 1.5%
And in reality, that 1.5% clock speed difference (along with the 1% memory clock difference) MIGHT result in ~1% of real-world performance change. Those higher clock speeds are easily accessible to consumers by enabling "OC Mode" in the ASUS GPU Tweak II software shipped with the graphics card, and the review sample cards can also be adjusted down to the shipping clock speeds through the same software.
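For context, the math behind those figures is simple. Here is a minimal back-of-the-envelope sketch in Python using the clocks quoted above; the 0.7 scaling factor is a hypothetical illustration of why a 1.5% clock bump tends to translate into roughly 1% of real-world performance, not a measured value:

```python
# GTX 1080 Strix base clocks from the media and retail VBIOSes listed above
retail_base_mhz = 1683
media_base_mhz = 1709

core_delta = (media_base_mhz - retail_base_mhz) / retail_base_mhz
print(f"Core clock delta: {core_delta:.1%}")  # ~1.5%

# Frame rates rarely scale 1:1 with core clock, so assume a (hypothetical)
# ~0.7x scaling factor purely for illustration.
print(f"Rough performance delta: {core_delta * 0.7:.1%}")  # ~1%

# Same math for the old motherboard trick mentioned above: a 101.5 MHz base
# clock instead of 100 MHz, multiplied up by a CPU multiplier in the 30s.
bclk_delta = (101.5 - 100.0) / 100.0
print(f"Base clock delta: {bclk_delta:.1%}")                  # also 1.5%
print(f"At a 33x multiplier: +{1.5 * 33:.0f} MHz CPU clock")  # ~50 MHz
```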
ASUS sent along its official statement on the issue.
ASUS ROG Strix GeForce GTX 1080 and GTX 1070 graphics cards come with exclusive GPU Tweak II software, which provides silent, gaming, and OC modes allowing users to select a performance profile that suits their requirements. Users can apply these modes easily from within GPU Tweak II.

The press samples for the ASUS ROG Strix GeForce GTX 1080 OC and ASUS ROG Strix GeForce GTX 1070 OC cards are set to “OC Mode” by default. To save media time and effort, OC mode is enabled by default as we are well aware our graphics cards will be reviewed primarily on maximum performance. And when in OC mode, we can showcase both the maximum performance and the effectiveness of our cooling solution.

Retail products are in “Gaming Mode” by default, which allows gamers to experience the optimal balance between performance and silent operation. We encourage end-users to try GPU Tweak II and adjust between the available modes, to find the best mode according to personal needs or preferences.

For both the press samples and retail cards, all these modes can be selected through the GPU Tweak II software. There are no differences between the samples we sent out to media and the retail channels in terms of hardware and performance.

Sincerely,
ASUSTeK COMPUTER INC.
While I don't believe that ASUS' intentions were entirely to save me time in my review, and I think that the majority of gamers paying $600+ for a graphics card would be willing to enable the OC mode through software, it's clearly a bad move on ASUS' part to have done this. Having a process in place at all to create a deviation from retail cards on press hardware is questionable, other than checking for functionality to avoid shipping DOA hardware to someone on a deadline.
As of today I have been sent updated VBIOSes for the GTX 1080 and GTX 1070 that put them into the exact same mode as the retail cards consumers can purchase.
We are still waiting for a direct response from MSI on the issue as well.
Hopefully this debacle will keep other vendors from attempting anything like this in the future. We don't need any kind of "Quake/Quack" scandal in our lives today.
I think that the idea that a 1.5% OC merits the title of an “OC profile” at all is laughable. No reviewer or power user worth their salt would even accept the “OC profile” blindly and not try to push further anyway.
As an enthusiast myself, I'm not surprised at all by the practice, only by their poor execution of it.
Ryan, this issue happened a few years ago too. For example, JayzTwoCents received MSI & Gigabyte review sample GPUs that were slightly overclocked by default compared to the retail samples, and the same goes for other reviewers too.
So this really isn't a new story anymore. By the way, why couldn't reviewers detect the 3.5 GB VRAM fiasco in the first place?
There's a question mark over all the reviewers, for sure.
Can those YouTube reviewers stop the b-roll and stop talking about the spec sheets we can read ourselves?
A real, proper tester will dismantle the sample and look for the areas that need improvement, etc.
Guys, don't be so dramatic (k-drama). I know you guys earn a living from this and we won't blame you. Honesty is the best policy. Cheers~
“By the way, why couldn't reviewers detect the 3.5 GB VRAM fiasco in the first place?”
Because it had, and still has, no performance impact in any playable scenario (i.e. the only times you can measure any performance impact is when you turn settings up so far they are well beyond unplayable even on the 980).
That is bullshit. I have a 970 and nearly every modern game is pegged at 3.5GB now and I see stuttering when new textures stream in that doesn’t occur on cards with 4GB. When I had SLI 970s it was way worse as I had the GPU power to push the higher settings, but not the VRAM.
You are also forgetting one of the worst parts of the 3.5 fiasco: the memory bandwidth specs were wrong on launch. I bought two cards thinking they had the same memory subsystem as the 980s, and they didn’t. I wish AMD were able to sue as a lost volume seller, but I guess that didn’t happen.
Shut the fuck up faggot.
Whoa, check out this verbal beat down. Truly an intellectual powerhouse here. Good argument.
Yeah, it’s getting old having these useless commenters around. Why doesn’t PCper require registration to comment?
When young people ask me where I get PC info and news from, I hesitate to point them here because of the comment section. The content is otherwise clean; it'd be great to clean up the comment section.
Because that wouldn't solve the problem. Every similar tech site with required registration has its share of shitposting as well.
I’m sure it still happens with registration alone, but add a decent set of rules, combined with minimal effort by some mods, and surely one could keep out the people that post purely to offend or antagonize.
They could use a moderator who can remove posts that are obviously made by kids who discovered the internet for the first time.
Or set up some sort of voting system that lets registered users vote to remove obnoxious comments or spam. I think the PCPer community is mature enough for self-policing to work.
Because even on sites which I'm registered on, I cannot be arsed signing in just to put my oh-so-important opinions across.
Signing in does not make you special, except maybe for in your own head.
This is not the first time PCPer staff has been seen ass-kissing brands. These assholes posted rumours about AMD and then never apologised for it. This site seems to be taking money from manufacturers for posting positive reviews and news.
Ryan was viewed as one of the sites AMD considered likely to give the Fury X a fair review, if I remember correctly. He got one of the limited Fury X golden samples to review and send back. It's been a while, so I'm not 100% certain on this. Take that with a grain of salt before you decide where his allegiance lies.
Yes, he did get one on the launch date of June 24. Sorry to burst your bubble. One site didn't get one because they referred to the 300 series of cards as "rebranding". These are the tactics of the company you choose to support, along with a gag order until the RX 480 is available for sale. If it was anything worthwhile, AMD would be singing its praises from the mountaintops. Sorry, my opinion is the Polaris line is gonna disappoint, especially since they moved the Vega launch date up. It's what they call a tell, in poker terms.
It's easy to point the finger at AMD when all the tech press is trying really hard to convince you that the Nvidia card you just bought, or the one you are thinking of buying, is a superior product with no problems. If the tech press was attacking Nvidia for no real reason and covering up AMD and their mistakes, believe me, your perspective would be the opposite.
Anyway, keep supporting a company that tries to create a monopoly where NO other company will be able to compete.
The attack on Nvidia was and is still relentless over the 3.5/4 gig question of the 970. I will find the site and post a link of the 970 loading 4.1 gigs of textures in Hitman. Shouldn't be possible as it only has 3.5 gigs, right? Wrong: it has texture compression, and it's an AMD game that purposely loads more into the RAM of Nvidia cards vs. AMD.
“The attack on Nvidia was and is still relentless over the 3.5/4 gig question of the 970.”
In comment threads, yes. In reviews, no.
AMD is trying to force Nvidia out of business by getting all their hacks, cheats, and helpers into DirectX 12, as well as trying to get all the DX12 games they can and optimize it their way and not do a whit of optimizing for Nvidia. Here is the link.
http://www.guru3d.com/articles_pages/hitman_2016_pc_graphics_performance_benchmark_review,9.html
Eye opening, isn't it? All Nvidia has to do at the bogus trial is submit this and the class action lawsuit is denied.
AMD John, please look carefully at 4K resolution, where the Fury X sits below 4 gigs of RAM used and the 970 slightly above 4 gigs. You probably don't want to be educated in the evils and the lengths your company stoops to to try and scam you into buying their latest duds. Wouldn't want that Fury X to stutter, but it's OK to do it to the 970. Both Ashes and Hitman have low sales as a consequence. This type of behavior will slow as Nvidia has around 75%+ of the market. If a game performs too poorly on an Nvidia card, people will not buy it. They better bank all the money AMD gives them, because they'll go out of business as a result.
Regardless of how much memory the 970 has, the reality is that the 970 just does not have the horsepower to really drive games at 4K. I've already experienced it, and I know firsthand that just because the 970 can run games at 4K doesn't mean it should, because settings have to be turned way down to get any kind of acceptable frame rate.
You want to know the truth about game development for competing architectures? They will sometimes use varying amounts of memory! Gasp, mind blown! You want to blame someone for a 970 using more than 4GB of memory? That is purely on the developers' shoulders. Not Nvidia, not AMD; it's all the developer's fault.
Yes, but when AMD sponsors said game, Hitman… The developer may resort to dodgy tactics to help their benefactor out or at least not optimally code for rival. This seems to be the case because the game does not perform well for Nvidia in both DirectX 11 and 12, and Nvidia is usually known for having much better DX11 performance. Nvidia always gets the blame in this regard; remember Crysis 2, they say. But the main point is a 3.5 gig card shouldn't be loading over 4 gigs of textures. Yes, I am running 4K on my 4 gig 760. Older games do OK, but yes, performance in newer stuff is lacking. You are correct in stating it is the developer, but sponsorship may directly or indirectly influence the game. All I'm saying is, in the newer games that AMD has sponsored so far they are killing Nvidia. Whereas in some games Nvidia sponsors AMD gets close performance or actually beats them in frame rate. IDK, seems like something screwy is going on, especially when you have $300 mainstream AMD cards beating $650 enthusiast-level cards from Nvidia.
“The developer may resort to dodgy tactics to help their benefactor out or at least not optimally code for rival.”
Gee, where have we heard that before?
“Whereas in some games Nvidia sponsors AMD gets close performance or actually beats them in frame rate.”
You forgot to add, “while in the vast majority of games Nvidia sponsors, they are killing AMD.”
“optimize it their way and not do a whit of optimizing for Nvidia”
And yet when AMD fans complain about Nvidia-sponsored games running like ass on AMD cards, you Nvidia fanboys love pointing out that “it’s not Nvidia’s job to optimize anything for AMD.”
Eye opening, isn’t it?
Er, no, probably not for you actually.
You lie..lie..lie…
I have had 970 SLI since roughly 6 weeks after launch. I have not had any stuttering in any game at playable settings.
I had mine from launch till I sold one a few weeks ago. I know what I saw. I know it went away when I turned the graphics down such that the VRAM usage wasn't pegged at 3.5GB, and I am keenly aware that it was specific to my 970s, as my wife's computer was not experiencing the same stuttering in the same games despite her computer having a slower CPU, less RAM, and games running on HDDs instead of SSDs.
I'm happy for you that you don't have issues with the 970s, but I did. As I've said in many comments before, the stuttering wasn't even my least favorite "feature" of the 970s either. I've run various SLI setups since a few weeks after the GTX 280s launched and never had as many issues as I had with the 970s. The cards were fraudulently advertised, the SLI driver support dwindled, they stutter when you max the VRAM, they inexplicably reset my monitor setup on each "game ready driver", their DX12 performance is laughable (my Xbox plays Quantum Break better, and an R290, which I now regret not having purchased, gets nearly twice the FPS), they had more microstutter than even my 280s in 2008 did, I've had more "driver stopped responding" CTDs on my 970 setup than all my other computers put together, and two driver updates have screwed up my Windows install such that I had to boot into safe mode to fix it… I could go on, but I have better things to do. If you don't want to see stuttering with your 970s, run them at 1080p or below on DX11 games only, never turn up AA too much, and don't watch EVGA Precision or any such thing that shows you when to look for the stuttering (because the VRAM is pegged at 3.5).
The amount of change is not important, but the principle is!
Good job, Ryan, and I'm waiting for MSI's answer.
Kudos to all the other tech reporters as well, and I really mean it!
Such things really do help us consumers in the long run, and I will definitely take this into account when purchasing my next GPU, and motherboard, etc., for that matter.
I agree! It’s good to have integrity and to call BS out. If you don’t do it, corruption grows until suddenly it’s everywhere and it’s too big and strong to actually do something about.
Ryan, why the flip-flop of opinion all of a sudden?
On Twitter you didn't feel like a 1.5% difference between retail and review cards was a big deal, and that there was no problem with it.
What’s with the faux concern (or so it seems) all of a sudden?
On Twitter I said it was dumb to claim a review of this card before today was "invalid" because of this clock speed difference.
Every single MSI and ASUS review using the offending BIOS should be removed from the web.
Ryan should really be leading the pack and pushing for this consensus among North American sites.
If you want the readership's respect back, you need to start taking control of the relationship between reviewers and brands.
The power imbalance has swung too far; many readers now consider NA "press" and "journalists" extensions of brand X PR.
Ryan, help change that view.
I don’t see malice here but a poor decision. If a reviewer doesn’t know how to OC a card, they probably shouldn’t be reviewing.
Well, that's fine, it just needs to be very obvious and spelled out if there are any intentional differences between the retail and review cards.
All reviewers should be honest about the way they test cards.
It would be interesting to see whether this 1.5% difference is smaller than the performance hit of testing in a case as opposed to an open test bench.
A disclaimer for open-air test beds: I'm sure fewer people run open-air test beds than ever ran SLI, and when Nvidia stopped supporting SLI the crowd cheered because it was a small minority. Why can't reviewers recognize they aren't testing things the way they will be used by the majority of the users out there?
The whole purpose of a blower-style "reference" card is to expel air out of a case. Testing these cards, and all cards in general, on an open-air test bed, which far fewer people have than 3- and 4-way SLI, is just plain laziness on reviewers' part.
I would like to see an average case being tested with such hardware, not some 16-fan case. Stuff that's actually sold to the majority of people, not some over-the-top setup that a very small percentage of people have.
https://www.youtube.com/watch?v=Jb6C6fd8tGc
Jay just did a whole video comparing an open-air test bench and an enclosed case. It doesn't matter. At all. Now you know, so stop complaining about open-air tests.
I think that it was kind of a shady technique for ASUS to do this; if anything, they should have been upfront about it from the beginning.
Speaking of VBIOSes in general, is there a good way to edit VBIOSes to, say, up the clocks?
So a 23 MHz and 13 MHz difference in GPU and memory clocks respectively, which can be adjusted by the end user and is based on standard profiles available to everyone with the included software. I guess I don't see a problem with it. It's not like they sent the reviewer cards with 2 GHz GPU clocks or something.
Well, I don't quite believe what ASUS has to say. They really do seem to be trying to artificially inflate benchmarks and review scores, and they really just should've been upfront and said, "We'd like you to show off the factory-available OC mode too."
I'm glad this was called out. Sure, it may be small MHz differences now, but if they got away with it, they'd try to do more and more, with hardware and software, until we'd probably end up with real bait-and-switch scenarios, where reviewers get the best-binned stuff with special benchmark-only settings designed to cheat as much as possible.
I don't think people ever thought a company as big and well-respected as Volkswagen would ever build cheat software into its diesel cars before it was found out, but they did, and I honestly don't think other companies would be different if they thought they could get away with it.
Well, I don't quite believe what ASUS has to say. They really do seem to be trying to artificially inflate benchmarks and review scores, and they really just should've been upfront and said, "We'd like you to show off the factory-available OC mode too."
This. +++
Take the review sample and have it measured against a large enough number of retail samples; if the review sample can be overclocked higher than the average of all retail samples, then it's most likely a cherry-picked part with the ability to be clocked a little higher without showing much of a thermal/power usage difference between it and the retail cards.
These manufacturers should not be allowed to benchmark their own products and present the results to the public. The manufacturers should be required to send their direct-marketing benchmark review samples to an independent testing lab, with the manufacturer required to use those results for any advance marketing purposes. There can still be third-party/press reviews of the parts, but those manufacturer-made reviews/benchmarks need to be done by an independent lab that uses all the available games/benchmarking software, with the independent lab and the manufacturer required to publish all of the independent lab's results (no cherry-picking of the results allowed).
Press review samples should be randomly chosen, with the press/review outlet chosen by random lottery, to prevent the manufacturers from holding the press hostage and in fear of not getting a review sample for any honest, factual criticism of a maker's product. The press should also have to run the same wide selection of games/benchmarks, with no cherry-picking of results allowed.
Let me add to my previous comment. I am glad that this was called out, because reviewers need to keep manufacturers honest. I would like to see if these cards were cherry-picked, as suggested above, but it's such a small overclock… I just don't know if it really matters much. And it's not like the clocks are hidden; it's obvious when you look at a review what the clock speeds are.
Ryan, I appreciate your articles and podcasts but perhaps you could clarify something in this article for me. You seemed more negative about the CPU base clock delta than the GPU delta. But as shown in this article they were both 1.5% ???
THIS IS FUCKING STUPID GET A LIFE LOSERS.
It is hard enough to get a life in games, especially if you are a loser. It is even harder in real life. But I do get your point.
Wow. The penetrating insight displayed in this comment has made me completely re-think my world view. I can see so much more clearly now.
Thank you for writing this thoughtful, intelligent, and reasoned comment. You have solved all of the problems in my life, and I will likely never be able to express my gratitude.
Poor excuses. 15 years ago, sites testing motherboards were starting their reviews with the front-side bus speed, knowing that some manufacturers overclocked it 1-2% to get the top spot on the charts.
While with graphics cards 1-2% will make much less of a difference, the idea is the same. That 1-2% could be the minor difference that moves their model one place higher on the charts. That would mean more sales.
Everything else is poor excuses. They just thought no one would notice, or care to mention, something that looks so insignificant.
I’m wondering if PCPer (and every other independent reviewer) has known about this going on and didn’t think it significant, or if it really took them ~3 years to notice that their cards are running above the advertised base clocks. I’d like to think that I can trust independent reviewers to keep manufacturers honest, and while bringing it up now is good, not noticing for ~3 years really concerns me as to what else is being missed.
I know that computer hardware reviewing is not investigative journalism of the highest importance but I do expect more than just taking what the manufacturer tells you at face value.
Overall, this looks bad for the card makers. It even puts the whole situation on Nvidia as well. Any misleading information can kill a good reputation.
This type of scrutiny better happen when AMD shops around their cherry-picked RX 480/470/460 samples and takes them back from the reviewers to give to the next. Reviewers know this is going on but few say anything. There is a lot bigger difference between AMD review samples and retail samples than 1-1.5%. This is nothing. Super-clocked cards with a higher overclock out of the box will easily beat these "cheater" cards. AMD gets carte blanche to cheat because they are the underdog. AMD used their console dominance to get a DirectX 12 that is very optimized for AMD. It happens every time there is a new one. Nvidia loses for a year until they design a card that is fully capable and kicks AMD's butt. AMD certainly does not inform the media they are getting cherry-picked cards, so why should MSI and ASUS tell them their OC BIOS is enabled by default?
It did happen with AMD sending specially prepared press 290X cards already.
Also, the entire Fury line was cherry-picked and sent to reviewers who were AMD-biased. AMD said they would only send them to a site where they would get a fair review. TechPowerUp didn't get one because their review machine uses a 4770K and not an enthusiast-class Intel. AMD needs the more powerful processor to get higher frame rates.
“There is a lot bigger difference between AMD review samples and retail samples than 1-1.5%.”
Cite your source, please.
Here is one example: note the Fire Strike Extreme score for the review sample is 7385 at Guru3d and only 7145 for a purchased Sapphire one. The difference is 3.36%, over double. I'm sure I can find more, but these were the few that used a replicable benchmark. Lots of users complained in forums about not getting the same performance as well.
http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,25.html
http://www.babeltechreviews.com/the-sapphire-fury-x-vs-the-evga-gtx-980-ti-sc-showdown-2/view-all/
Those two scores are not directly comparable, and don’t indicate anything like you’re suggesting.
Guru3d used an X99 platform with an 8-core i7-5960X and 4x4GB DDR4-2133.
Babeltech used a Z97 platform with a 4-core i7-4790K and 2x8GB DDR3-2133.
And by the way, that Guru3d score you’re referencing is not the graphics score, which is all GPU – it’s the overall score, which takes into account the physics score and, thus, is affected by the CPU/MoBo/RAM platform. Babeltech doesn’t say which score they used, but considering the difference in platform and the difference in score, I think it’s safe to guess they used the overall score as well.
Considering all of the variables in the comparison, that 3.36% difference is waaaay smaller than the margin of error.
BTW, investigate whether ASUS and MSI used overclocked defaults for the AMD cards they make as well. Let's even up the playing field.
PCPer has no interest in showing AMD's products in a bad light. TPU, on the other hand, showed that MSI also sent modified BIOSes out for review on Gaming-brand cards. The 270X had a 40 MHz bump from 1080 to 1120, a 3.7% bump. The 290X was highlighted but listed at the same frequency? The 280X had 1020 to 1050 for a 2.9% bump. The 390X went from 1080 to 1100 for a 1.9% bump. Rounded to the nearest tenth like here. Interesting that all the AMD cards were boosted higher but never reported on. There were also older Nvidia cards as well.
Good idea. While we’re at it, let’s also investigate whether AMD does now or has ever pressured reviewers to use aftermarket, factory-overclocked AMD cards to compare to reference, stock-clocked Nvidia cards.
Let’s also investigate whether AMD cooperating with a game developer has ever resulted in seemingly-deliberate overuse of a feature in order to hobble their competitor’s performance.
Y’know, since we’re even-ing up the playing field and all.
This is a common practice of many websites to compare higher clocked AIB cards against reference. Many AMD cards lack a reference card and reviewers simply use whatever brand they have for reference. Do you view this as fair?

And yes Oxide has probably hampered Nvidia in Ashes of the Singularity benchmark/game. Before the latest update Nvidia cards were outperforming AMD cards. When the benchmark first came out the Nvidia cards had poor performance as well. They apparently fixed their problems and Oxide “revised” their benchmark and lo and behold Nvidia was losing again. You can find all this online yourself if you care to.

And yes AMD cards always had more compute which they used excessively in their sponsored games because the Nvidias were lacking. Just like Nvidias are better at tesselation except you can’t override compute like AMD has. Games works games have optional features and you don’t have to use them. Often AMD has better or comparable performance when these features are disabled. Witcher 3 comes to mind. FYI Nvidia experience also recommends disabling hairworks too. Tressfx hammered Nvidia cards and they still lose to AMD on that front. Yet most reviewers did the Tomb Raider benches with it enabled. Big surprise. Nvidia cards beat AMD cards with it disabled obviously. Reviewers don’t run benches with physx enabled because AMD has to use software version and it’s proprietary but run AMD proprietary tech in their games. Why the double standard?

The Crysis 2 thing was exposed as bunk however the jersey barriers looked amazing with tesselation. We won’t go into details how Cryengine is now heavily AMD biased. As for Anandtech his site was heavily biased toward AMD so no surprise there. The 460 came in two versions 1 gig and 768? megabytes. The performance was so much better so no surprise Anand didn’t want to post results. BTW I had MSI Hawk version and it still wasn’t a good overclocker’s. Anand used medium benchmark settings to get rid of tesselation and showed AMD cards to be superior. Also 1080 resolution was no where to be found as Nvidia dominated here as well.
“This is a common practice of many websites to compare higher clocked AIB cards against reference.”
When reviewing an AIB card, yes, compare it to its reference brethren. When reviewing a brand new reference card that outperforms the competing reference card, slipping an overclocked AIB card into the results instead is skewing the results.
“Many AMD cards lack a reference card and reviewers simply use whatever brand they have for reference. Do you view this as fair?”
If there’s no reference version of the card, and the reviewer compares it to other non-reference cards, sure. If the reviewer is comparing that one non-reference card with a stack of reference cards and ignoring the AIB cards, then no, it’s not fair.
“Tressfx hammered Nvidia cards and they still lose to AMD on that front. Yet most reviewers did the Tomb Raider benches with it enabled. Big surprise. Nvidia cards beat AMD cards with it disabled obviously. ”
You must be talking about Tomb Raider 2013, where yes, TressFX threw Nvidia cards for loops. And boy, oh boy, were you Nvidia trolls just FROTHING at the mouth over it, refusing to accept, “Just turn it off,” as a solution. That argument might hold a little bit of merit if Nvidia hadn’t gotten the source code for TressFX (which was and is open source) and released an optimized driver like a week later – and then, oh, lookie, Nvidia cards started performing BETTER with TressFX on than AMD cards.
Used to be, you Nvidia trolls used that as another, “See, Nvidia’s better than AMD” argument. Now, you conveniently forget that, so that you can gripe about TressFX. And when an AMD fan points out that GameWorks features wreck performance on AMD systems, you offer up, “Just turn it off,” as a solution.
“Reviewers don’t run benches with physx enabled because AMD has to use software version and it’s proprietary but run AMD proprietary tech in their games. Why the double standard?”
Because AMD’s “proprietary” tech is open-source. Nvidia can optimize for it all they want. AMD doesn’t have that option as GameWorks really is proprietary.
“The Crysis 2 thing was exposed as bunk”
Cite your source, please.
“however the jersey barriers looked amazing with tesselation.”
And they looked exactly the same with x4 tessellation as they did with x64 tessellation. So either you’re lying, or your visual acuity is X-Men level.
So basically, wall-of-text proved nothing.
I am kinda sad.
When I found pcper.com, I was happy to see articles with depth, and now I am confronted with this bias: monkeys crying about OC Mode and a 1.5% OC (which is probably not even considered an OC) instead of reviewing the respective cards properly in FULL detail. Just stop it. Be more informative instead of throwing around gossip crap.
Polaris – Dual BIOS 1500MHz
GTX 1080 – Flickering when at desktop.
You are going to read about NEITHER on this site.
Just read about it in your comment. Why not start your own tech site?
I think his point, like many others', is that this site covers news that's convenient for them, not what's being reported.
It's like a political outlet, be it conservative or liberal. It only focuses on news and rumors that reinforce their perspective on PCs.
Nvidia's latest driver addresses the issue of flickering at high refresh rates. Thanks, Nvidia.
Also, in the past Nvidia said it fixed the high power consumption problem at the desktop when the monitor's refresh rate is higher than 120Hz. From what I am reading, people seem to still be having problems. Those with 980Tis say that the problem was fixed. Some with Kepler and Maxwell cards like the GTX 970/GTX 780 are not that lucky and have to lower the refresh rate.
When we're talking high, are we talking higher than the equivalent AMD video card, which can consume as much as double the wattage of Nvidia cards? To quote AMD fanboys, a little wattage doesn't matter.
Is this high enough for you?
https://pcper.com/news/Graphics-Cards/Testing-GPU-Power-Draw-Increased-Refresh-Rates-using-ASUS-PG279Q
50 or 60 watts additional will usually still be less than an AMD card. Most systems won't be able to sustain 144 Hz or 165 Hz anyway; 120 Hz should be good enough. An extra 20% at 144 Hz, or 37.5% at 165 Hz, of maximum frame rate won't make a huge difference. The card has to be ready to accept the increased frames, thus the core clock spike. AMD freesync doesn't do anything beyond 90 Hz anyway so no surprise they didn't get a spike. It's just like running an Nvidia card with a vanilla 144 Hz display. G-Sync probably has a lot to do with the increased core speed.
“50 or 60 watts additional will usually still be less than an AMD card.”
Are you lying, or did you not even bother looking at the pretty pictures? The 980Ti system went from 73.7W @60Hz to 133.9W @144Hz. The R9 Fury system went from 69.9W @60Hz to 71.2W @144Hz. So no, that 50 or 60 watts additional is NOT less than the AMD card.
“AMD freesync doesn’t do anything beyond 90 Hz anyway”
1, that’s irrelevant, as Gsync/Freesync has nothing to do with the article discussing the frequency and power spikes of the 980Ti at 120+Hz refresh rates.
2, if Freesync doesn’t do anything beyond 90Hz, why are there monitors with Freesync ranges of 30-144Hz such as this one? Are they lying?
Comparing the amount of wattage AMD cards consume vs Nvidia in the same product class, 60 watts is nothing. Compare the wattage of AMD cards with a GTX 970. Looking at the charts you'll see quite a difference between AMD and Nvidia. These charts don't even include AMD's 300 series, which is even more power hungry.
https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/25.html
You were arguing with JohnGR about Maxwell cards experiencing a spike in clock frequency and resultant power usage on the desktop at idle on 120+Hz monitors. Nothing in this response of yours has anything to do with that at all. You are trying to move the goalposts.
Sorry. I should have specified total power draw when compared to an Nvidia card. I had thought it might be obvious but … Maybe it was a little vague.
Which is STILL not what John was talking about.
Yes, but as Allyn pointed out, you set your monitor to 120 Hz and it's a non-issue. It's been mostly fixed, so just trying to keep stirring it up is fanboyism. Yes, it is the general efficiency of the Nvidia card that balances out this spike if averaged with gaming consumption. After all one does not buy an expensive video card for just looking at the pretty desktop. They are going to game with it “gasp”. If you were using a comparable AMD card that consumes more wattage during gaming and watching 1080p video, you would still be better off with the Nvidia. I notice you and all other fanboys don't address the extreme power consumption viewing a Blu-ray or internet 1080p video on an AMD card; while browsing you'll be right up there with the Nvidia. So pointing out a flaw with the consumption spike is the same, except Nvidia's problem was easily fixed (and the update did in fact fix it for some), but yours hasn't been fixed and wattage has been wasted for years. A lame attempt at smearing Nvidia was John's intent, as well as implying the media covers up Nvidia's problems but won't comment on the 1500 MHz Polaris rumor. They can't: the NDA from AMD is in effect until the 29th regarding Polaris, or whenever they release, as this date may be presumed. He knew that, or should have, being an AMD fanboy.
“After all one does not buy an expensive video card for just looking at the pretty desktop. They are going to game with it “gasp”.”
Unless you are one of the very rare few who turns on your computer, immediately starts your game, and as soon as you’re done gaming, shut your computer off again, then chances are pretty good you’re one of the millions upon millions of people whose computers spend more time in Windows, on the desktop or on the web or whatever, than in games, by a large margin. And considering how much time you spend on the PCPer comments, I’m guessing you spend more time not gaming than you spend gaming. And considering the fact that most people’s computers spend vastly more time not gaming than gaming, that extra 60 watts adds up. Fast.
And yes, setting your monitor to 120Hz mostly solves the issue. But unless you’re out on tech sites reading about it (and believe it or not, most aren’t) you’d never know about it, and you’d go on rocking double the power consumption and thinking everything is okay.
John’s point was that, despite Nvidia claiming that they fixed it in drivers, there are still tons of people who are apparently still experiencing the problem. Nothing you’ve said here – NOTHING – addresses that. Everything you’ve said here has been either deflecting to another topic, or giving Nvidia a pass on something that you’d be putting AMD into the stocks over.
“I notice you and all other fanboys”
Psst – I’m (currently) an Nvidia user – two 980 in SLI. I’m just sick of weak fanboy troll assholes like you coming across as representative of Nvidia users in general.
New article about this problem from Tech Report today, but with a two-monitor setup. The problem can be seen even on Pascal cards, which is bad news.
http://techreport.com/news/30304/nvidia-pascal-cards-still-exhibit-high-refresh-rate-power-bug
You are also right about people not knowing about it. I had posted about the problem months ago in a Greek computer forum, starting a new thread about it. You would have thought that most people spending time in that forum would have seen it, but after reposting it a few days ago in the thread about the new Pascal cards, there were people with GTX 980Tis and even Kepler cards who had missed that old thread, realizing only now that their cards were running at higher frequencies than they should when the system was sitting idle at the desktop.
It doesn't affect monitors at 120 Hz, only when the monitor is set to 144 Hz or higher. You are implying it affects 120 Hz, which is not the case; you should have said over 120 Hz, not 120+Hz, which implies 120 and up.
Freesync links.
http://techreport.com/news/28199/asus-144hz-mg279q-monitor-may-top-out-at-90hz-with-freesync
https://pcper.com/news/Displays/ASUS-MG279Q-144-Hz-Display-Caps-90-Hz-FreeSync
And you are a reader of PCPer, for shame.
Good for you, you picked out one monitor that has a 90Hz top-end for the Freesync range.
Now watch as I pick out a bunch that go all the way up to 144Hz on the Freesync range:
Acer XG270HU 27″ TN 2560×1440 40-144Hz
Acer XZ321Q 32″ VA 1920×1080 48-144Hz via DisplayPort
Acer XF270HU 27″ TN 2560×1440 40-144Hz
Acer XF240H 24″ TN 1920×1080 48-144Hz
Acer XF270H 27″ IPS 1920×1080 48-144Hz
AOC G2460PF 23.6″ TN 1920×1080 35-120Hz
AOC G2770PGF 27″ TN 1920×1080 48-144Hz
ASUS MG278Q 27″ TN 2560×1440 40-144Hz
BenQ XL2730Z 27″ TN 2560×1440 40-144Hz
EIZO FS2735 27″ IPS 2560×1440 56-144Hz or 35-90Hz
Iiyama GB2488HSU 24″ TN 1920×1080 35-120Hz
Iiyama GB2788HS 27″ TN 1920×1080 45-144Hz
Nixeus NX-VUE24 24″ TN 1920×1080 30-144Hz
Viewsonic XG2401 24″ TN 1920×1080 48-144Hz
Viewsonic XG2701 27″ TN 1920×1080 30-144Hz
Source
I went to your AMD site and in fact nothing is mentioned about going up to 144 Hz, but the chart shows only up to 90 Hz for what FreeSync helps with regarding judder. You proved my point from AMD's own site. Thanks, bro.
Well, if there’s one thing you’re good at, it’s misrepresenting data to claim it says something it doesn’t. That chart talks about Low Framerate Compensation, and talks about how Freesync handles framerates at 30Hz and lower. It has nothing to do with the top of the Freesync range.
But notice this part, right next to the chart: “*Diagram for illustrative purposes only based on expected results on Asus MG279Q with AMD FreeSync™ technology.”
You’re stuck on that one monitor again.
So, once again, you’re making a blanket assertion about the whole of Freesync based on the capabilities of one specific monitor.
Now here, I’ll help you out, since you seem to have trouble finding information that you don’t want to find, for fear of having to admit that you’re wrong (not that I would expect you to, of course.) Go back to that page and scroll down and click here:
http://i.imgur.com/soMASQa.jpg
Now marvel at the number of monitors with FreeSync ranges well above 90Hz.
“Well, if there’s one thing you’re good at, it’s misrepresenting data to claim it says something it doesn’t.”
Some would call that “lying”. lolol
Yes, it may be only one monitor, but FreeSync relies on adaptive vsync above 90 Hz because it's not doing anything anyway. It just allows you to basically use it like FreeSync isn't there. I could do that if I had a 144 Hz monitor as well, with or without G-Sync or FreeSync.
You really, truly have no idea what you’re talking about.
uhmmm… you do realize Adaptive Sync is just a marketing term for FreeSync, right?…..
I am amused, however, that you cited sources that are over a year old, and talk about one very specific model of monitor, as your proof for your blanket statement about all that is Freesync.
It’s almost as if you either A, knew that any newer source, or any source that covered a different monitor, would prove you wrong, or B, genuinely had no idea because you only accept information from people who are as dedicated to their fanboying as you are, and so thanks to your iron-clad confirmation bias, have only ever seen those two articles.
Or, I suppose, there’s C, you knew perfectly well that what you were saying was an outright lie and you said it anyway, hoping that people would just accept it as fact. But you wouldn’t do that, would you?
Oh, and by the way – the PCPer article you linked directly contradicts you.
“On the positive, that 35 Hz lower limit would be the best we have seen on any FreeSync monitor to date. And while the 90 Hz upper limit isn’t awful (considering we have seen both 75 Hz and 144 Hz limits on current monitors), it does the beg the question as to why it would be LOWER than the 144 Hz quoted maximum overall refresh rate of the display.”
So, you saw an article about how strange it is that this one specific 144Hz monitor has a Freesync range that only goes up to 90Hz, and specifically says that there are other monitors that have Freesync ranges up to 144Hz, and somehow concluded that “Freesync only works up to 90Hz.”
Tell me again – why should anyone take anything you say seriously, when you’re this wrong, this often?
Going up to 90 Hz was a common limitation of AMD FreeSync monitors. I obviously wasn't in the market for one, and I'm not a tech reviewer, so I'm not up to date on whether after a while AMD fixed their problem. Good for them if they got it to work. It still won't be as good an experience as Nvidia, as numerous sites said that G-Sync is better at the extremes, while AMD's sweet spot is 40-90 Hz, where they are comparable to G-Sync. I bought a 60 Hz 4K panel recently, which I viewed as more worthwhile than either technology. I actually considered a FreeSync 4K monitor, but it consumed more wattage than the one I chose, presumably from the extra hardware needed for Vesa standard of freesync.
Were these “numerous sites” all using the Asus MG279Q?
“the extra hardware needed for Vesa standard of freesync.”
No no. You’re thinking of the Gsync module, which is an actual extra ASIC added to an otherwise vanilla monitor, and for which Nvidia charges monitor manufacturers up to $200 per unit – which is why Gsync monitors tend to cost so much more than their closest Freesync-equipped competitor.
Freesync (which is built on VESA's Adaptive Sync standard) does not require a dedicated hardware block inside the monitor, it simply requires a compatible scaler, and VESA Adaptive Sync compatible scalers consume the same wattage (and cost the same per unit) as the scalers found in non-Adaptive Sync monitors.
Apparently you should cite sources where it is working, because it was only capable of the limited range at least up until 6 months ago.
https://www.reddit.com/r/Amd/comments/3vx8ro/a_question_about_freesync/
https://www.reddit.com/r/Amd/comments/3zrj4z/how_does_freesync_work/
Apparently there may be a hack that can get it to operate above 90 Hz. Don't know, don't care, so please cite even one source that proves your point. I don't have time to look myself when I have to respond to everyone here who can't use the internet to find even simple things.
Hey, guess what? Both of those links? They’re talking about that one same monitor. Again.
“I have a mg279q and a gtx 760. Caps at 60, but freesync is disabled no shit. Do I get more hertz if I buy a lets say, fury x? (Buying this soon)”
“I have been using the Asus MG279Q 144hz Freesync monitor for a few months now”
Why don’t you find me one source that proves YOUR point – that Freesync doesn’t do anything over 90Hz – and that DOESN’T depend solely on that one monitor? I’ve used the internet to find “simple things” and they’ve all been proving you wrong.
Guess the AMD fanboy in you isn't concerned about the power difference viewing 1080p playback on your cards vs an Nvidia card. TechPowerUp does that testing for each new card, and the wattage difference can be drastic, 60+ watts as well. I have a 60 Hz 4K display, so when I next upgrade it will be a non-issue for me.
I guess the 120 extra watts the 390X uses in As*es of the Sh*tularity mean nothing to you as well. OC an Nvidia card to get the maximum 20% frame rate increase from asynchronous compute and it won't cost this much. Have a look yourself:
http://www.tomshardware.com/reviews/ashes-of-the-singularity-beta-async-compute-multi-adapter-power-consumption,4479-5.html
WOW, the damage control is strong in this one.
Notice how he ham-handedly renamed the game to profanities so as to express his contempt for something that Nvidia doesn’t win at, and to express his disgust in the idea that a game developer had the nerve to work with AMD to take advantage of features that Nvidia cards don’t have?
I look forward to seeing him express similar disgust with every GameWorks (oops, I mean ShameWorks – see what I did there?) game that doesn’t perform as well on AMD cards because the game developer had the nerve to work with Nvidia to take advantage of features that AMD cards don’t have.
Oh, wait. My mistake. We’ll never see him do that.
Wow, I always thought PCPer was AMD-infested; now I know for sure. Get this: GameWorks features are totally optional, at least in the few games I have. I know it's novel, but you can just switch them off. But most run adequately on AMD cards as well. The same can't be said about AMD optimizations. Yours is rather tame compared to what other fanboys call it. Doesn't really bother me that most of you are ignorant/jealous regarding GameWorks.
And you can turn Async Compute off in Ashes of the Singularity, which is what you were bitching about.
Now try turning off Nvidia’s Godrays and Ambient Occlusion libraries in Fallout 4 in the game settings. Hint – you can’t. Not without doing some .ini editing. The settings menu shows the option for “Off” but it doesn’t actually turn it off.
Once again, what a surprise, you pillory AMD for something that you applaud Nvidia for.
When Maxwell released:
NVidia Fans: Oh wow! Maxwell is so much faster than AMD’s cards! AND it’s soooo power efficient! Take that, AMD fans, with your power-hungry furnace cards!
AMD Fans: Hey, here are a few instances where the AMD card is faster than the Nvidia card!
NVidia fans: Yeah, but look how much more POWER it uses to do it! You still lose because Maxwell is so power efficient!
AMD fans: Hey, AMD is planning to make their next generation cards a whole lot more power efficient!
Nvidia fans: Whoopee doo! Maybe they’ll get somewhat close to Maxwell’s power efficiency!
AMD fans: Hey, that supposedly-super-power-efficient Maxwell suddenly uses a whole crap ton more power at idle when powering a high-refresh-rate monitor!
NVidia fans: Power efficiency doesn’t matter, only AMD fans think it matters – let’s pretend we haven’t been screaming about Maxwell’s power efficiency for the last year, so I can feel like I’m winning this argument, okay?
Oh, I forgot one:
AMD fans: WTF do you mean, power efficiency doesn’t matter? You’ve been howling about power efficiency for almost 2 years! You’ve been howling about it since September 2014 when the 970 and 980 launched, and you’ve been all about power efficiency ever since the 1070 and 1080 were unveiled, but now power efficiency doesn’t matter?
Nvidia fans: What? We’ve never ‘howled’ about power efficiency! I’ve never seen an Nvidia fan talking about power efficiency.
AMD fans: (Provides tons and tons of screenshots and links of Nvidia fans touting Maxwell’s and Pascal’s power efficiency)
Nvidia fans: (Disappears from the thread and never answers, later spotted in other comment threads either touting Pascal’s power efficiency, or mocking any AMD fan who mentions power efficiency)
I think it's kind of silly to buy an AMD card because of price and pay more in electricity; that should be considered in the total cost to own. But to each his own. My brother is an AMD fanboy like you guys are. You simply cannot reason with most. BTW, Polaris is 30% more efficient at displaying 1080p video than its ancestors. WOW. AMD finally addressed it after years. It's still going to be higher than comparable Nvidia cards, though.
Really? You’re going to argue that the increase in your electric bill more than makes up for the lower cost of the AMD card?
Sure, in 18-20 years.
I have no idea what your point is. It sounds like you are upset that PCPer isn’t covering every single little news bit.
I am upset when the news looks one-sided. You will not see an article about Nvidia's latest problems. You will see only references to those problems when a new driver is released and it promises to fix those problems. Of course, if the problems are not fixable with a driver, you are not going to learn about them.
On the other hand, you will read the rumors about possible problems with AMD's hardware, like Polaris not being able to hit 850MHz stable. No matter how ridiculous those rumors look, you will read about them. But you are not going to read anything about news that looks positive for AMD. There are leaks out there with scores from the RX 480, RX 470, and RX 460. There are leaks with photos of the cards. There are leaks about cards coming out with factory speeds over 1300MHz (can't pass 850MHz, remember?) and rumors about dual BIOS, with the second BIOS giving the option to run at speeds over 1500MHz. Have you seen those rumors/leaks anywhere? No. Why? Because at 1500MHz the RX 480 is very close to a default GTX 1070, and at 1600MHz it beats a default GTX 1070. And while the GTX 1070 can also overclock, people will avoid paying 50% more to almost double the money just for what they can get from an overclock anyway.
In the end. Anything that can spoil Nvidia’s image is ignored. Anything that can create a better image for AMD is ignored.
Anything that can have a negative impact on GTX sales is ignored.
You won't hear about Polaris until June 29th, when AMD's non-disclosure agreement is lifted because it will be available for sale. If it is a quality product, why wouldn't you want people to know about it before it hits the shelves? Polaris is, to put it in Raja Kodiak's words, a mainstream card, and the 1070 and 1080 are enthusiast-level cards. What does that say about Polaris? For all we know, 850MHz could be Polaris and the 1267MHz or 1500MHz could be Vega, which is about to launch. AMD were always big talkers and braggers when they had the goods. When Nvidia came out with G-Sync, AMD came out and said they already had a software solution called FreeSync and demoed it on a laptop to steal sales/thunder from Nvidia. Never mind the fact that it took around two years to get a FreeSync product on shelves from something they already had. Trying to make Polaris something it is not is a fanboy's wettest dream. You get what you pay for. And if people keep quoting AOTS numbers, I'll ask you how it performs under DX11, or in Project Cars, or in DX12 Rise of the Tomb Raider. Biased is biased whether it's our way or yours. DX12 is a disaster that should be killed like DX10 was, but that is a whole other animal. I actually feel sorry for anyone who buys a Polaris because I think you're gonna get taken big time. I could be wrong but it isn't looking likely.
Your posts are stupid. I look at them and wonder: ARE you 8 years old, or do you have brain damage?
raja… kodiak…..lol
You must be a brain-damaged AMDolt that can't read. Google autocorrect changed it and I didn't notice. But thanks for showing your age.
What about the posts is stupid? The truth? I have a college degree, but what does that matter. Point to one false thing I posted. You can't, because trolls will be trolls. You may be sorry if you buy the RX 480 on June 29th. It may be good for the price, but for the wattage used I don't think it will be. Nvidia's are still more efficient while being at 16nm vs 14nm. How is AMD's VR performance compared to Nvidia's? Very few know, because most tech sites didn't bother to compare, because the Nvidias are way better. Knowledge: the more you know… Go back to school, kid.
The major fact that you got flat wrong in your previous post is saying that Vega is about to launch. That is just flat out wrong. The RX 480 is a Polaris card, as are the 470 and 460. Vega cards won’t be out until later this year.
They moved the date up. 5
They moved the date up. Five months or so isn’t too much longer to wait. Vega cards will already be being tested if they are going to launch this year. Polaris has probably already been tested, and you have photos of them on the assembly line, so yes, these leaks could well be Vega. AMD has not even confirmed the core clock. There is, however, a 900+ MHz Polaris for the new PS4. That’s a long way off from 1266 MHz.
Any of the pictures coming
Any of the pictures coming off the production line right now are RX480s, which are Polaris 10 chips. We know that these are clocked in the 1200MHz range, with rumors of them clocking as high as 1500MHz. These are absolutely not Vega chips. The 470 and 460 use Polaris 11, which is smaller and could likely have lower clock speeds. As for the PS4 Neo, that is also a custom chip, likely based on Polaris, but clock speeds and exact specs will be unique to the PS4.
Polaris is coming right now. Vega is still quite a ways away; 6 months is a long time in the tech world.
Do you really think a $200
Do you really think a $200 video card is going to clock at 1500MHz when their Fury X enthusiast card could maybe reach what, 1150-1200MHz? A 380’s core clock is 970MHz. The presumed 1266MHz base of the RX 480 would be a 30% boost over that, and 1500MHz would be a 55% boost. Doesn’t seem too likely. The point I’m trying to make is that it’s all unconfirmed rumor. Vega is probably going to be Polaris 2.0, knowing AMD’s tendency to rebrand cards. I think they would keep the core clocks lower to save on power consumption. If they are clocking this card this high, it means performance is probably lacking. I know that those are 480s on the line. I’m saying the leaked performance numbers could be Vega and not Polaris.
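(As a quick sanity check on those percentages, taking the 970MHz R9 380 figure quoted above as the baseline: 1266 / 970 ≈ 1.31, roughly a 30% increase, and 1500 / 970 ≈ 1.55, roughly 55%. The arithmetic itself holds; whether the clocks do is the actual question.)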
Everything you are saying is
Everything you are saying is complete lunacy, with no basis in reality.
We just saw nVidia go from ~1100MHz clock speeds on the GTX 970 to ~1800MHz with the GTX 1070. Why would it be implausible for AMD to do something similar, especially considering they are both going through a similar transition to a FinFET process?
Vega also won’t just be a rebranded Polaris. They are being manufactured with the same 14nm FinFET process, but Vega is going to be a larger chip with more compute units, and it will likely feature HBM2 memory.
Also, your presumption that they need to keep clocks low to save on power consumption is dumb. The GTX 1080 uses less power than the 980Ti or the Titan X and is clocked WAY higher than those cards. Clock speed != power consumption.
Over clock the core on a
Overclock the core on a video card and see if your consumption goes up. The Fury X’s last 10% of performance comes at the cost of 100 watts of power. There went your overclocker’s dream. This is common knowledge, not lunacy. AMD had to overclock the sh*t out of it to even beat a pathetic GTX 980. Architecture efficiency, power leakage, and other factors play in too, but generally lower clocks = lower wattage consumed. Nvidia cards have a more efficient architecture, and the 16nm TSMC node lets Pascal run at 65% higher speed with 70% less power than 28nm. 14nm from Samsung is only 50% more performance and 60% less power consumption. How do you get 2.5 times the energy efficiency (performance per watt) out of 60% and 50%? That’s your question for the day. I’ll give you a hint: lower clocks play a part. Under AMD’s roadmap for Vega, the only thing listed is HBM2. It’s not a stretch that they’ll use the same basic architecture and just make the die bigger for all the shaders and compute units and such. They are on GCN revision 4, are they not? I guess I wouldn’t know, because I’m crazy. LOL
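(For anyone who wants to actually work that “question of the day” through, taking the quoted foundry figures at face value: “50% more performance” is ×1.5 and “60% less power” is ×0.4, so the process alone would allow up to roughly 1.5 / 0.4 ≈ 3.75× performance per watt in the ideal case. A 2.5× claim sits under that ceiling; how much of it comes from clock choices versus architecture is a separate question.)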
Yes, I understand that
Yes, I understand that increasing the clock speed will increase the power draw. I should have clarified. My point is that the clock speed isn’t the only thing, or even the most important thing, determining power draw.
A new architecture and manufacturing process are going to have a much greater effect on the power required. This is obvious. There is also the issue of different cards running more or fewer compute units. Say a Polaris 11 chip has half the compute units of a Polaris 10 chip; even if they were running at the same clock, the chip with fewer cores is going to consume less power.
The primary point being that if you are comparing two completely different chips, you can’t just automatically assume the one clocked 50% higher is going to consume more power.
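(The usual rule of thumb backs this up: dynamic power scales roughly as P ≈ C · V² · f, i.e. linearly with clock frequency f, with the square of voltage V, and with the switched capacitance C, which tracks die size and how many units are active. A process or design that allows lower voltage at a given clock, or that powers fewer units, can cut power far more than a clock bump adds.)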
NamelessTed. I’m sorry I
NamelessTed, I’m sorry; I guess it could be clocked at 1266MHz. Maybe the 850MHz was a laptop chip or an early Neo chip. 1500 still seems like a stretch given that it can only draw 150 watts max from its connector. From the rumors, these cards run hot. Hope there is enough heat dissipation for overclocking headroom. Also, rumor has it board partners weren’t happy with the $199 retail price. They might skimp wherever they can to make a few dollars more.
The GTX 1080 only has a
The GTX 1080 only has a single 8-pin, has a 180W TDP, and people have them running over 2000MHz. Shit, the 1070 has a 150W TDP and ships at 1500MHz with automatic boosts going over 1600MHz.
In regards to the 1500MHz, that rumor was also tied to a speculative $300 price tag, and presumably a custom board.
I realize that the AMD cards are different, but if nVidia is releasing cards with that kind of power efficiency, why is it so insane to think that AMD can at least come within 50% of that?
Yes. I see now there were
Yes, I see now there were good improvements to AMD’s architecture. Polaris now has boost clocks and memory compression, just like Nvidia cards do. It’s just that everyone else comes off as combative here. You’re pretty reasonable, Ted; not sure if you’re an AMD guy or not, but it doesn’t matter. AMD cards are superior to Nvidia cards in some ways, but I prefer Nvidia because I like their features and efficiency. I’ve owned ATI cards in the past but was hooked when I bought my 9800 GTX+. I’m in the market for a new Nvidia card. I hope AMD provides some competition so that I can replace my EVGA FTW dual-BIOS 4GB GTX 760 at a decent price and drive my 4K monitor better.
So you’re basing your
So you’re basing your anti-AMD argument on rumors? Are rumors about the RX 480 only acceptable when the rumor looks bad for AMD? Because further down the thread you refused to accept a rumor that looked good for AMD.
Oh, wait, that’s right. You’re a hypocrite. So of COURSE rumors are okay when the rumor confirms your bias.
So, you’re making up your own
So, you’re making up your own rumor, to combat a rumor that you don’t like, while simultaneously saying, “It’s a rumor, you can’t believe it.”
Seems legit.
Again if we were following
Again, if we were following rumors, Pascal wouldn’t even be out yet. According to AMD fanboys, it was supposed to be out at the end of this year. I guess you all drink the crazy juice. Expecting two $239 RX 480s to beat a $700-800 enthusiast-level 1080? That’s rich. Only in AMD-sponsored benchmarks and games will that anomaly even exist. In Nvidia games and vendor-agnostic games, your pathetic 480s will be left in the dust. The Ashes bench they used was apparently CPU-locked and worthless, on top of nobody knowing exactly what settings were used, etc. So buy your RX 480 with no reviews. That’s lunacy. I know most AMD fanboys are looking to upgrade their HD 4000 series cards. This would be perfect for them.
“Again if we were following
“Again, if we were following rumors, Pascal wouldn’t even be out yet. According to AMD fanboys, it was supposed to be out at the end of this year.”
You’re absolutely right – before Nvidia started their PR blitz hinting at the Pascal reveal, all the rumors suggested that it would release late this year.
There’s a simple explanation for this, one that also would explain the abysmal supply of cards and the ridiculously high prices that result – an explanation that Nvidia will never, ever admit to, and that most rational people in the tech world acknowledge as a genuine possibility (and, thus, I expect, you will dismiss offhand.)
Nvidia pushed the launch of Pascal forward – drastically – to get out ahead of AMD’s Polaris.
They were nowhere near ready to supply the retail outlets with cards to meet the demand. The CEO, called upon to get up on stage and hype the thing, didn’t even know enough about his own product to give accurate information. (“And it only runs at 67°C!”) The whole presentation was a gigantic ball of awkward-and-unprepared. It was cringe-worthy.
They weren’t ready. But they pushed the launch up so they could get to market first, and had juuuuuuust enough cards out to retail outlets to be able to say, “See, it’s not a paper launch!”
Can I cite a source to prove it? No. Nobody wants to talk about that. But it does offer a more-than-plausible explanation for everything about the Pascal launch up to today.
you have a college degree in
You have a college degree in stupidity. I haven’t read so much BS in my life. Your ignorance, combined with your fanboyism, is better than comedy.
You’ll see I’m right on the
You’ll see I’m right on the 29th, AMD a**clown.
AMD fanboys use all the
AMD fanboys use all the excuses in the world. I’ve read so much BS from them, such as “I still get 60fps with my ancient HD (insert model number of your choice), no need to upgrade yet.” Well, you fanboys are the reason AMD is in such dire straits. Buy a product once in a while instead of bragging. “I got a 3% performance bump from the latest AMD drivers.” Seriously. I understand some people can only afford what they can. Usually AMD is the better choice when it comes to price, but usually that means you are losing out in other areas or some concessions had to be made. Expecting a boatload of performance out of a mainstream card is just going to be a big letdown for you.
Pity your Nvidia card can
Pity your Nvidia card can only increase your e-penis and not your IQ. You could be enjoying a double-digit IQ if your Nvidia card could help. Unfortunately, your e-penis inches are more than your single-digit IQ points.
LOL if I was as dumb as you
LOL, if I were as dumb as you imply (single digit), I wouldn’t be aware of anything that is going on, much less able to type a coherent response to your pathetic Nvidia smear attempts. You, sir, are the one with low intelligence who has to resort to insults. I thus prove I am smarter than you for buying Nvidia and Intel processors and not AMD. They are the better overall value and the performance leaders. Price doesn’t enter into the equation; less money wasted is still money wasted. An insult would have been to set my IQ at 60, which is a step above intellectually challenged, which is what I now think you may be. Mentioning penis size: who does that, really? Apparently ignoramuses do. LOL
“I thus prove I am smarter
“I thus prove I am smarter than you for buying Nvidia and Intel processors and not AMD.”
Sorry to inform you that the Intel processor, like the Nvidia card, does not have any positive impact on your IQ.
“AMD fanboys use all the
“AMD fanboys use all the excuses in the world. I’ve read so much BS from them, such as ‘I still get 60fps with my ancient HD (insert model number of your choice), no need to upgrade yet.’ Well, you fanboys are the reason AMD is in such dire straits. Buy a product once in a while instead of bragging.”
This coming from someone trying to power a 4k monitor with a GTX 760.
Wow.
“Point to one false thing I
“Point to one false thing I posted.”
Do you still want to stand by this? Because I’ll point to a bunch.
I feel like you are just
I feel like you are just cherry-picking two specific recent issues. PCPer ran a horrible rumor about Polaris only hitting 850MHz; I don’t know what the deal with that was. But in general, I just don’t see the same trend as you.
PCPer tested nVidia power draw issues on high-refresh-rate monitors. They also put up a post about a new driver fix for some screen flickering with the 1080. Ryan even criticized Tom Petersen when he was on recently talking about the 1080, in regards to the Founders Edition and pricing and availability.
The great thing about the internet is that you can get news from as many different sources as you want. No one site can cover every single detail about everything.
If you take things at their
If you take things at face value, then what JohnGR said is valid. However, I don’t think that is what has happened.
I think the real reason things are done the way they are is not that any particular hardware/software vendor is paying them off, despite the masses accusing them of being shills for x, y, or z company, but that they are generating traffic by feeding the masses what they want to see: controversy and negativity. It’s actually quite smart. By having people like you, JohnGR, chizow, hitman actua (what a knucklehead), arbiter, etc., the usual suspects who stir up the comment section on articles, people come back to see more of it: entertainment in two forms, the article and the comments. It’s the same reason they allow completely idiotic spam to be posted and left up, and why some offensive and inappropriate comments are left, etc… because it generates traffic.
Seriously, why do you think they have resorted to Patreon? It’s not for a lack of quality content, because they can and do produce some really good work. But it hasn’t been enough. So, how else can they get paid? Jeremy himself has said that the kinds of comments I referenced mean views, and more views mean more money.
I’m not mad, or really even that surprised once I thought about it.
Let me show you something
Let me show you something about me, because you are putting me in a category that insults me:
https://pcper.com/news/Graphics-Cards/Video-Perspective-Gaming-Overclocked-AMD-A10-7850K-APU#new
chizow, arbiter, and others would never have asked PCPer to stop promoting AMD products if they were in my place.
I want balanced information, not one-sided. No matter which company is being promoted, I am NOT going to like it.
You’d better think again if you believe you know the others. You know much less than you think.
PS: Don’t be someone who reads the… avatar and not the text.
Whether or not you feel
Whether or not you feel insulted is all on you; it was an observation about what you and others have posted.
That set of people I listed? These are people who are well known for saying things to stir up the comment section. Whether or not your objectives are the same as the others’ is irrelevant; you all still say things to stir up the comment section. I posted an observation about what you and others have done many times.
And this ties in with the point I was making to the other guy, which is what PCPer allows to go on in its comment sections: spam, borderline-NSFW posts, members attacking one another, off-topic posts, etc. All that stuff generates traffic/views, including the comments left by people who stir up the comment section, which includes you!
Even as an observer you are
Even as an observer, you are wrong.
I only post on about 1 out of every 5 or 10 articles here, and usually only 1, 2, maybe 3 posts. On rare occasions I will engage in a dialogue or whatever, and that will lead me to make more posts. But having an avatar makes you notice my posts and gives you the wrong impression that I am a frequent poster. Just check the last 30 articles here and try to find how many of them I have posted in, and in how many I have made more than 2 posts.
As for why PCPer allows posts that annoy you in the comment section? Well, I wonder why it lets you post in the comment section. We don’t have to read your arrogant analysis of others.
Has AMD got their corrupted
Has AMD got their corrupted desktop textures fixed yet? It only took them what, a year or two. LOL
Hey, I bet you can guess what
Hey, I bet you can guess what I’m about to ask you to do….
…cite your source, please.
Sorry the typo is supposed to
Sorry, the typo is supposed to be Raja Koduri, not Kodiak. I hate stupid Google autocorrect; it makes a lot of mistakes if you don’t check. I never make fun of what people type, for this very reason, and it’s childish to be a grammar cop. As long as you get the point, everything’s OK. I don’t know if there is an edit function here because I’m new. If there is, I’d appreciate it if someone could point me in the right direction.
If you’re registered and
If you’re registered and verified, you can edit.
Then again, I’m pretty sure if you’re registered and verified, you can’t change your name to something else and avoid being called out on your lies. So, you’ll probably avoid that option.