Early in April ASUS and AMD announced that the MG279Q display, first shown at CES in January, would be brought into the world of FreeSync and officially adopt AMD's branding. The original post from the AMD Twitter account clearly mentions the display would support 144 Hz refresh rates, an increase from the 120 Hz that ASUS claimed during CES.
Now, however, we have a complication to deal with. According to a FAQ posted on the ASUS.com website, FreeSync variable refresh rates will only be supported in a range of 35–90 Hz.
Enable FreeSync™ in the MG279’s OSD setting, choose PC’s refresh rate timing between 35-90Hz (DP/miniDP only)
On the positive side, that 35 Hz lower limit would be the best we have seen on any FreeSync monitor to date. And while the 90 Hz upper limit isn't awful (considering we have seen both 75 Hz and 144 Hz limits on current monitors), it does beg the question as to why it would be LOWER than the display's quoted 144 Hz maximum refresh rate.
The ASUS MG279Q is an IPS-style display, so the quality of the screen should be top notch, but that alone doesn't answer why the upper FreeSync limit and the maximum refresh rate would not match. We already have the Acer Predator XB270HU G-Sync display in-house that operates at a variable refresh rate as high as 144 Hz with a similar quality IPS panel. I've inquired with both AMD and ASUS about the reasoning for this 90 Hz limit, and we'll see if either side cares to comment prior to the display's release.
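To make the quoted numbers concrete, here is a minimal Python sketch of how a 35–90 Hz FreeSync window maps game frame rates to display behavior. The window bounds and the 144 Hz panel maximum come from the figures above; the out-of-window fallback behavior described is the generic VRR model, not ASUS's published spec.

```python
# Sketch of how a FreeSync window maps game frame rates to display behavior.
# The 35-90 Hz range and 144 Hz panel maximum come from the article; the
# fallback behavior outside the window is the generic VRR model, not ASUS's spec.

FREESYNC_MIN_HZ = 35   # lower bound of the variable refresh window (ASUS FAQ)
FREESYNC_MAX_HZ = 90   # upper bound quoted in the ASUS FAQ
PANEL_MAX_HZ = 144     # maximum fixed refresh rate of the panel

def display_behavior(fps: float) -> str:
    """Classify what the display does for a given game frame rate."""
    if FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ:
        return f"VRR: panel refreshes at {fps:.0f} Hz, matching the game"
    if fps < FREESYNC_MIN_HZ:
        return "below window: fixed refresh, judder/stutter (no frame doubling)"
    return "above window: fixed refresh, tearing or v-sync latency"

for fps in (30, 60, 90, 120):
    print(f"{fps:>3} fps -> {display_behavior(fps)}")
```

The oddity the article points out is visible in the last case: at 120 fps the panel could physically refresh that fast, but the FAQ's 90 Hz cap would push it outside the variable window anyway.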
AMD defense force out in full force doing damage control for this half-baked garbage, really pathetic.
Holy fucking shit, PCper has turned into nothing but a huge ass fucking bitch fest…
I think every comment here should just be deleted. Fanboys should just go fucking kill themselves, since they do nothing but hurt tech companies as a whole.
And that's my final comment for this website. I'll never come back to this bitch fest of a place again. So fuck you, PCPer, and the fanboys around the world.
I'd say just don't read comments on AMD / Nvidia articles 🙂
You guys must have a pretty sweet deal going! Paid by *both* Nvidia and AMD! When do you think one of these companies will catch on to this? 😛
if you have such a problem why are you even here
Hopefully never!!
Lol
I’m onto you.
The entire technology/enthusiast web site sphere is under the influence of its advertisers, as well as the companies that provide review samples. Just try to find a site that only accepts ads for products unrelated to the products that the site reviews. You take every review with a grain of salt, and maybe, just maybe, in the future someone will create a review site that only accepts ads from products unrelated to what the website reviews. You can bet, though, that such a site would very likely have to charge for its content, because the review samples would have to be randomly purchased from a third-party retail source! Suppliers of products expect some back scratching for the review samples, lest the reviewer find the samples arriving a little later than they do at the sites that have posted more favorable reviews in the past; ad revenue works the same way.
In reality, only a review site that accepted no ad revenue EVER from any of the makers of the products it reviewed, and that purchased its review samples at random from the open market, to avoid manufacturer cherry picking or conflicts of interest from accepting review samples from manufacturers, would be above all suspicion. The site would have to charge for the reviews, and only accept unrelated advertising permanently, so no ads from any makers of the products reviewed, EVER. It can be done, but it's going to be costly unless enough people subscribe to the online publication; those laundry soap ads would not produce enough revenue, and technology products are costly.
Haha, and I’m onto you! Please don’t steal him, we need him out there doing the dirty work other sites don’t seem to be interested in doing anymore…
fk u dumb fk for not understanding the issue
I never got why people get so hell bent when people talk down about a company. Are you that fucking insecure?
It has nothing to do with fanboy-ism. The problem is your fav reviewer covering up Nvidia's sh*t and then making special videos dedicated to AMD's issues.
AMD lied, pixels died!
DP 1.2a can't melt steel beams!
MG279Q was an inside job!
DP 1.2a is a VESA standard. Upon having the VESA standards organization adopt "freesync" into the standard, AMD's marketing should have been told by VESA to step away from the double branding, and from any certification responsibilities. The branding authority, as well as any DP 1.2a certification responsibility, rests with the VESA standards organization. VESA should not have allowed the freesync branding the nanosecond the DP 1.2a standard was ratified.
Shame on you VESA, as an industry standards organization.
And eternal shame on the marketing "profession", including AMD's marketing department. Nvidia, your marketing department is just as dirty, as are the entire world's marketing departments. Marketing traces its roots to the snake oil salesman, and to the original lie, whenever it was first uttered. Anyone who believes a single word any marketing entity ever utters is the dictionary definition of a monobrowed, high-foreheaded slackjaw.
Good grief, I really don't care about the NVIDIA/AMD ass hattery. I can't even find out if anyone knows why the cap is 90 Hz, because there are 99 comments about nothing.
It isn't a secret that Ryan and Allyn are biased towards Nvidia. The truth is they still produce great podcasts and great articles. PCPer spearheaded FreeSync coverage, and we had info about it from them first, regardless of whether it was weighted green.
Can’t we just all get along?
Really any time there are two competing technologies we are going to appear biased towards the one we see as the superior tech. If AMD had (IMO) the better hardware / technology, the situation would be reversed.
All we want is for you to sweep this under the rug as well as you did the Nvidia 3.5 GB memory gate. Just say games today don't really go over 90 fps and all will be well.
Hey Allyn. Remember how you covered up the ROG Swift low-fps issue? Because you own that panel? Do the same for this panel! Just say if you blink over 90 fps while it's tearing you won't be able to notice it.
By 'covering up' you mean 'the only site to look into the issue and report on it'? What an amazing job I did of hiding that issue by writing about it so everyone else knew about it.
No, I mean make a separate 20-minute video on it with an oscilloscope and sh*t.
We used a digital scope in that other post. Also, there was no need for a 20 minute video on that issue because there was not nearly as much misinformation and FUD being spread by commenters on that issue.
ok…fair enough.
But please try and see my side of things. I have more than enough reasons to say that you are bashing AMD and covering up for Nvidia, and that's not a good situation for you to be in. And that, coming from a PCPer fan, should mean a lot to you. I like getting my tech news and reviews from PCPer, and I would like it to stay that way.
To the PCPer crew:
None of these comments have anything to do with AMD or Nvidia. These are all PCPer fanboys who love your content and hate the biased approach you are taking now.
Thanks
Okay, this one got me to chuckle :).
Be careful not to choke on it
Like how you are choking now? How about becoming a legitimate member and verifying your account. That’s what I thought…anonymous…pfft.
So… wait… You are saying I should make sure my next mower has a Honda engine?
It is amazing how much trolling and FUD there is in any thread concerning AMD. It is obvious to me that AMD's implementation has issues, and they cannot all be blamed on the display or scaler makers. FreeSync does not currently implement any frame multiplication the way G-Sync does, so the lower bound of the VRR window is equal to the panel's minimum refresh rate. This makes a FreeSync panel more difficult to get working properly, since to support lower refresh rates you have to push the minimum refresh rate of the panel down. This probably isn't easy to do without causing issues; I don't know if this 90 Hz issue is caused by it. Does pushing the minimum down cause trade-offs that limit the maximum?
With G-Sync, you can just set a minimum and depend on the G-Sync module to keep the panel refresh in range by frame multiplication, independent of how low the video-card-supplied frame rate drops. Going below 24 fps from the video card is somewhat ridiculous though, since it will turn into a slide show; you will lose any illusion of smooth motion. I don't know if supporting a low panel refresh rate of ~20 Hz is doable while keeping the maximum high with how FreeSync is currently implemented. They are going to need to implement some kind of frame multiplication, whether on the display side, like G-Sync, or on the GPU side (send the same frame multiple times).
While G-Sync seems to be the better solution currently, this may change. I am in favor of an open standard rather than a proprietary solution that ties you to a specific graphics card vendor. I keep my display a lot longer than I keep a video card. My general advice when buying computer hardware is to get a good display, since it is the component you will generally keep the longest, and it is what you look at all the time. The rest of the components go out of date quickly, or they did in the past. Bottom line: refresh rate, whether variable or otherwise, should be part of the display standard. Although, I also don't see anything stopping display makers from implementing their own versions of VRR on top of FreeSync; couldn't they just add some memory to their scaler and implement frame multiplication themselves?
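The frame-multiplication idea this comment describes can be sketched in a few lines: when the GPU's frame rate falls below the panel's minimum refresh, show each frame a whole number of times so the effective refresh rate lands back inside the VRR window. This Python sketch is a hypothetical illustration of the technique, not actual driver or scaler module code; the function name and window bounds are invented for the example.

```python
# Sketch of frame multiplication: choose a repeat count so that the
# effective refresh rate (fps * repeats) lands inside the panel's VRR window.
# The window bounds are hypothetical example values, not any shipping spec.

import math

def frame_multiplier(fps: float, vrr_min: float = 35.0, vrr_max: float = 144.0) -> int:
    """Smallest whole number of times to display each frame so that
    fps * repeats falls within [vrr_min, vrr_max]."""
    if fps >= vrr_min:
        return 1  # already inside the window, no multiplication needed
    repeats = math.ceil(vrr_min / fps)
    # Only usable if the multiplied rate still fits under the window's top;
    # this is why a wide max/min ratio matters for this trick.
    if fps * repeats > vrr_max:
        raise ValueError("window too narrow for frame multiplication")
    return repeats

for fps in (20, 25, 30, 60):
    reps = frame_multiplier(fps)
    print(f"{fps} fps -> show each frame {reps}x -> panel refreshes at {fps * reps} Hz")
```

Note that the same trick would work within the MG279Q's quoted 35–90 Hz window (20 fps doubled is 40 Hz, well under 90), which is why the commenter's point about doing it display-side or GPU-side is plausible either way.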
I agree, and that's the biggest reason I bother posting in these threads: the trolling and FUD is directly attributable to misleading statements AMD has made along the way in an effort to disparage, undermine, and slow adoption of the competition's solution, G-Sync.
Simply put, if AMD did not make so many misleading statements regarding G-Sync and their own FreeSync implementation before it was actually ready, you wouldn’t see all of this antagonistic back and forth and I wouldn’t even care to post, because frankly VRR is great tech that I do feel every gamer should have a chance to enjoy.
Now, it is a simple matter of holding AMD accountable and punishing them for making claims they clearly have not delivered upon. What is most interesting is that AMD fans and users who would directly benefit from any positive changes to FreeSync aren’t actually interested in acknowledging the problems, they seem to be more interested in sweeping them under the rug and attacking the credibility of the ones who identified the problems!
What was AMD supposed to do when Nvidia tried to push a proprietary standard in an attempt at creating vendor lock-in? If you buy an expensive G-Sync panel, are you going to buy a different panel to switch vendors on the next GPU upgrade cycle? I wouldn't buy into G-Sync now because there is some possibility that the next generation of AMD parts with HBM will leave the competition far behind. It may be possible to fix the low-end refresh rate of FreeSync in software, but if you have a brand new R9 390X (or whatever they call it), you may not be too concerned with frame rates under 35.
They should’ve just kept their mouths shut until their own solution was ready, and they certainly shouldn’t have made all the misleading and downright dishonest statements they made along the way about FreeSync. But this is typical of AMD and they will continue to do so as long as their fans let them get away with it.
As for the rest, you can hope all you like that AMD blows Nvidia away with their next chip and HBM, in the meantime the overwhelming majority of the market has moved on to 970/980/Titan X Maxwell parts. Even if AMD reaches parity with Nvidia’s Maxwell parts in the upcoming months, those users will be looking onward to Pascal. Who wants to be a hope merchant 6-9 months behind the curve when they can just buy a solution that works, today, from Nvidia?
I just want to say “thank you” for the continued coverage of adaptive sync tech. I don’t see articles like this as bashing AMD, but as letting early adopters know the possible draw backs of G-Sync vs Free Sync. If you have an AMD GPU, why wouldn’t you want to know the limitations of the current monitors that use Free Sync? The reason people visit a site like this is because they provide the most in depth coverage of these topics without glossing over issues.
Somebody summarize this comments section for me…
Fanboys from either camp be fanboyin’
More like PCPer subscribers telling PCPer to be fair in their reviews and not cover for Nvidia when they fk up and then throw AMD under the bus when they fk up.
Allyn and Ryan are like the Buddhist Monks of Tech Journalism. Patience is strong with these two =P
You don't need patience when you're counting that Nvidia ad money.
I thought they were paid AMD shills?
We are only paid AMD shills when the reviews / news favors AMD. 🙂
Well G-sync certainly had its share of growing pains. People tend to forget that it wasn’t the perfect solution that it is now when it launched.
That said, I'm willing to bet that this has more to do with the panel and scaler that ASUS is using than with the technology itself. But until it gets tested, it remains to be seen. Even at 35–90 Hz, I'm still planning to buy one of these.
What growing pains are you referring to? I actually didn’t pay much attention to G-Sync until they had the non-DIY models for sale, but even still, everything I saw and read from users and reviewers was that even the DIY version on the Asus 24″ did everything it said it would from day 1 when Nvidia launched it at one of their gaming events.
Not really sure how that can be compared to the growing pains we are seeing with FreeSync, where the results and implementation are frankly all over the place.
There was at least one issue, which we pointed out. Funny, nobody called me an AMD fanboy for that…
Hahah, yeah, incredibly ironic. Thanks for the link. I wasn't paying that close attention with the Asus 24″ and DIY kit because I knew I wouldn't go that route (a downgrade in many respects coming from a VG278H), but having owned a Swift, it looks like they fixed that 35-40 Hz band issue at some point prior to the Swift's launch, because it's a problem I haven't really noticed.
In any case, it is definitely not as big of a growing pain as we have seen with FreeSync, so far.
I'd personally consider the DIY Kit a sort of 'G-Sync Beta'. The final product worked as intended, and Nvidia spent the time to make it right. It explains why the ROG Swift and other monitors arrived later than Nvidia initially promised.
Much rather wait a while and see everything working properly.
Allyn, why waste a second on these clown fanboys? You did your job right. You provided an unbiased, full hardware review. There will always be those who just don't want to believe. Even if they had the monitors sitting in front of them showing the exact same issues you outlined in your review, they would still not believe. It's called denial and can be attributed to mental illness in some cases. Regardless, keep up the great work!
Ugh, of course this drops something. Too good to be true from the start.
Good old AMD,
Add this to the list:
Broken Enduro cover-up.
Crossfire Frame pacing issues they didn’t know about.
Buying an ad-laden copy of GeForce Experience.
Freesync crossfire “Coming Soon™”.
Did I miss anything else?
TrueAudio??? Where is that? Is it even used yet LOL
LMAO…”Coming Soon” with trademark from AMD = awesome!
lol at this mess of a comment section. Grow up, buy Nvidia, their products work and you get what you pay for.
Damn straight. AMD/ATI was good about 10 yrs ago. Move on to better things dumb asses!
Holy cow, that's a lot of anger in here.
Allow me to let you in on a secret… these are technology companies that are trying to sell you products. Very rarely are these products perfect. FreeSync is still really new, and they are still working out bugs and exact implementations.
These companies are not your friends. Ryan has taken a pretty even-handed stance on these products and reports these findings to you, the readers, for free. G-Sync is more polished and has more features, but costs a whole lot more money. FreeSync does not add much cost to a monitor, came out later, and hasn't been worked on nearly as long as NVIDIA's hardware implementation. It has limitations. That doesn't mean it is worthless, but this is a case of "buyer beware".
AMD has had successes (original Athlon 64, the original Athlon, a wide variety of excellent video cards) and some seriously mediocre products (original Bulldozer, Radeon HD 2900 XT). Just so happens that AMD is being hampered by their CPU offerings and it has a detrimental effect on R&D for everything else. But we have to admit… the old HD 7970/R9 280X have been stalwarts for the past 3 years.
Anyway, relax, and take this information for what it is worth. It is a limitation in the implementation, not some strange bias from Ryan who delivered the information.
As the better tech, I like G-Sync. Will I buy into it? Hahahaha… not with the premium they want for it. I still WANT AMD to surpass Nvidia in the capability of FreeSync. When that happens, it'll push Nvidia to then surpass AMD. Competition is good, and anyone should be ashamed if they truly want one of the big two to completely fail. Why would you wish stagnation upon technology?
Still, I’ll live happily with my good ‘ol overclocked 120hz IPS display. Both variable refresh technologies have my interest piqued, but I don’t like how either one is being done yet. When they are, I’ll buy into it. Until then they can just keep trying and I’ll enjoy watching the show.
“When they are, I’ll buy into it.”
Really? So this means you have no idea what you are talking about. Those who actually own a FreeSync or G-Sync monitor will tell you first hand that the current dynamic refresh rate technology really does work and really is a game changer. Those are the FACTS, and they are not based on some guess, an assumption, or a 5-minute test drive at a local computer store. Seeing is believing. Sometimes you have to take the chance for the greater good and act. Talk the talk AND walk the walk.
There is no objectively better choice between FreeSync and G-Sync. One can only decide subjectively between visible ghosting and inputs limited to a single DisplayPort. The same is true for choosing between an open standard and better handling of the low range of refresh rates.
With that in mind I would like to see more best of each class monitor comparisons. Thanks.
Sorry, that’s not really true.
All of these VRR panels carry a hefty premium, contrary to what AMD said; even their own FreeSync panels cost hundreds more than a non-VRR panel of similar characteristics.
So now that we have established the primary benefit and cost driver of this panel premium is in fact VRR, the primary comparisons should be made in VRR mode. That is not to say the additional inputs are worthless, but again, if you wanted additional inputs, there are a number of other choices for much less money, and most likely you already own a few displays with multiple inputs that can't do VRR.
If you are strictly looking at behavior and performance under VRR, one can objectively say G-Sync is the better of the two. All the other considerations you've brought up (open standards, additional inputs) are subjective and based on preference, and really have no bearing on actual performance and the ability to accomplish the goals of a VRR panel.
Wait, you mean that we don’t get free monitors with Freesync? What a hack.
What’s funny is AMD did at one point say FreeSync could be firmware flashed, for essentially free, in part of the BS FreeSync run-up where they wondered why Nvidia was charging so much for the G-Sync module, and why G-Sync capable panels were so expensive.
Indeed, this led many AMD fanboys to interpret this FUD/misinformation to mean all monitors would be FreeSync compatible one day. Only later did AMD retract/recant their misleading statements to say it is just "royalty" free and that panels will in fact cost more due to more expensive components, like scalers, VRR-capable panels, backlighting, etc.
I’m sure it wouldn’t take too much effort to find some posts by “remon” stating similar nonsense, if you post under that handle frequently.
AMD also bragged about 11 new FreeSync monitors that will be available by March 2015:
http://wccftech.com/amd-11-freesync-monitors-march/
Hmmmm, let’s check the real facts. It’s now May 2015 and a whopping TWO FreeSync monitors have been released and they both have only TN panels – no IPS or AHVA panels to be seen. The BenQ XL2730Z and the just released Acer XG270HU. And the soon-to-be released Asus MG279Q is still missing in action. So, what happened to the other 9 models? I hear crickets in the AMD camp…LMAO.
To me VRR is just one of many monitor features. I do not even consider it the most important. I buy what I need and I will not follow blindly some marketing push from GPU vendors when choosing my next monitor.
As for the price, it is not that important. I saved some by not buying a gold watch from Apple 😉