Introduction, Specifications, and Packaging
We finally have a FreeSync monitor that we can recommend! The new ASUS MG279Q offers the best VRR range yet and combines that with an IPS screen.
AMD fans have been patiently waiting for a proper FreeSync display. The first round of displays using the Adaptive Sync variable refresh rate technology arrived with an ineffective or otherwise disabled overdrive feature, resulting in less than optimal pixel response times and overall visual quality, especially when operating in variable refresh rate modes. Meanwhile, G-Sync users had properly functioning overdrive, as well as a recently introduced 1440p IPS panel from Acer. The FreeSync camp was overdue for a 1440p IPS display superior to that first round of releases, hopefully with those overdrive issues corrected. Well, it appears that ASUS, maker of the ROG Swift, has rectified that situation with a panel we can finally recommend to AMD users:
Before we get into the full review, here is a sampling of our recent display reviews from both sides of the camp:
- ASUS PG278Q 27in TN 1440p 144Hz G-Sync
- Acer XB270H 27in TN 1080p 144Hz G-Sync
- Acer XB280HK 28in TN 4K 60Hz G-Sync
- Acer XB270HU 27in IPS 1440p 144Hz G-Sync
- LG 34UM67 34in IPS 2560×1080 21:9 48-75Hz FreeSync
- BenQ XL2730Z 27in TN 1440p 40-144Hz FreeSync
- Acer XG270HU 27in TN 1440p 40-144Hz FreeSync
- ASUS MG279Q 27in IPS 1440p 144Hz FreeSync (35-90Hz) < You are here
The reason the G-Sync panels above carry no minimum rating is explained in our article 'Dissecting G-Sync and FreeSync – How the Technologies Differ', though the short version is that G-Sync can effectively remain in VRR down to <1 FPS regardless of the hardware minimum of the display panel itself, because the module simply redraws the last frame as many times as needed to keep the panel refreshing within its window.
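To make that mechanism concrete, here is a minimal Python sketch of the frame-redraw idea, assuming a hypothetical panel with a 30Hz hardware floor (the constant and function are illustrative only, not NVIDIA's actual firmware logic):

```python
import math

PANEL_MIN_HZ = 30  # assumed hardware floor for this illustration

def redraw_count(fps):
    """How many times a VRR module could scan out each frame so the
    panel never refreshes below its hardware minimum (a sketch of the
    idea only, not NVIDIA's actual implementation)."""
    if fps >= PANEL_MIN_HZ:
        return 1  # native VRR: one scan-out per rendered frame
    return math.ceil(PANEL_MIN_HZ / fps)

for fps in (100, 24, 10, 0.9):
    n = redraw_count(fps)
    print(f"{fps:>5} FPS -> each frame scanned {n}x -> panel at {fps * n:g} Hz")
```

Even at 0.9 FPS the panel itself keeps refreshing above 30Hz; the game's frames just get scanned out repeatedly until a new one arrives.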
Specifications:
Display
- Panel Size (diagonal): 27" (68.5 cm), widescreen (16:9)
- Display Viewing Area (H×V): 596.74 × 335.66 mm
- Panel Backlight / Type: WLED / In-Plane Switching (IPS)
- Display Surface: Non-glare
- Color Saturation: 100% sRGB
- True Resolution: 2560×1440 up to 144Hz (DisplayPort 1.2); 1920×1080 up to 120Hz (HDMI 1.4)
- Pixel Pitch: 0.233 mm (109 PPI)
- Brightness: 350 cd/m² (typical)
- Contrast Ratio (max.): 1000:1
- ASUS Smart Contrast Ratio (ASCR): 100,000,000:1
- Viewing Angle (CR≧10): 178° (H) / 178° (V)
- Display Colors: 16.7 million
- Response Time: 4ms (gray-to-gray)
- ASUS EyeCare: Yes

Video Features
- Trace Free Technology: Yes
- GameVisual: Yes (FPS, RTS/RPG, Racing, sRGB, Cinema, and Scenery modes)
- Skin-Tone Selection: 3 modes
- Speakers: 2W × 2, RMS
- Color Temperature Selection: 4 modes
- GamePlus / Blue Light Filter: Yes

Gaming Hotkeys
- 5-way Navigation OSD Joystick: Yes
- Crosshair / Timer: Yes (GamePlus hotkey)
- GameVisual: 6 modes (Scenery, FPS, RTS/RPG, sRGB, Racing, Cinema)

Input / Output
- DisplayPort 1.2, Mini DisplayPort 1.2, HDMI/MHL 2.0 × 2, USB 3.0 (1 upstream, 2 downstream), earphone jack

Signal Frequency
- Digital: 51.2-221.97kHz (H) / 35-144Hz (V)

Power
- Power Consumption: <38.7W (Energy Star 6.0)
- Voltage: 100-240V, 50/60Hz

Mechanical Design
- Chassis Colors: Matte black
- Tilt: +20° to -5°
- Swivel: +60° to -60°
- Height Adjustment: 0-150 mm
- VESA Wall Mount: 100 × 100 mm
- Security: Kensington lock

Dimensions
- With stand (W×H×D): 625 × 559 × 238 mm
- Box (W×H×D): 625 × 368 × 63 mm
- Net Weight: 7.3 kg

Accessories
- Power cord, DisplayPort-to-Mini DisplayPort cable, USB 3.0 cable (optional), MHL cable (optional), DisplayPort cable (optional), HDMI cable (optional), warranty card, quick-start guide, support DVD

Regulation Approvals
- Energy Star 6.0, UL/cUL, CB, CE, ErP, FCC, CCC, BSMI, CU (EAC), C-Tick, VCCI, J-MOSS, PSE, RoHS, WEEE, Windows 7 / 8.1 WHQL
Packaging:
The MG279Q came well packaged with all necessary cords. One oddity noted was that the DisplayPort cable was actually a DP to Mini-DP cable.
Along with the power, HDMI, and USB cables was an instruction manual and a mystery red clip. More on that and the Mini-DP cable on the next page.
Comments:

So in summary, the whole time you were testing this you were probably thinking:
“Sheesh, for $200 more than the $1250 I spent, I could’ve gotten a 980Ti and a better G-Sync panel.”
Haha, do you ever provide anything constructive that is not related to trolling?
What if you are like me and have a general dislike for Nvidia's business practices regardless of the quality of the products they produce? Or what if you like GPUs that are the best performance/$ winners (my R9 290)? Or what if I was trying to justify to my wife a $600 monitor versus an $800 one?
AMD has an exciting future with DX12 and their very affordable 6- and 8-core CPUs that may benefit from the better GPU performance. And even if they don't, I still enjoy supporting them.
Then you get inferior AMD products, as you deserve. Sure, you save a buck or two, but you also get a 2nd-rate product.
Haha, I was about to say the same to him. But diehard AMD fans keep proving dumb antics over and over again, so what's new?
Great tech is not cheap, and I'd rather spend that much on the better product, thank you very much!
Yeah, I mean he just outlined something I simply wouldn't be happy with. If you're already spending that much money, what's the point in not buying parts that are only marginally more expensive but clearly better when it comes to both performance and features?
8350 vs. i7?
Fury X vs. 980Ti?
FreeSync vs. G-Sync?
You're looking at maybe $300 difference in price, but the more cheap concessions are made, the more performance and the end-user experience degrade.
Well, what if someone like me didn't feel like wasting $200 for specs on a monitor outside of what I will EVER play on? I wish the floor on this was 30Hz, but I will NEVER play a game below that, and will never play over 90Hz with modern games at max settings at 1440p… Don't you realize there are a ton of people in my exact situation, which is what ASUS knows and priced accordingly for?
How do you know you'll never play outside of those specs, as if you have control over frame rate drops, especially on some of these panels that are 1440p and above? Do you even know what kind of GPU power you're going to need to run at least 30 FPS on the panels being discussed?
1440p is going to need a 290 *MINIMUM* to drive 30+ FPS, and you'll still have to turn things down.
2160p/4K is going to need a Fury X at *MINIMUM* to drive 30+ FPS, and same thing, you're going to have to turn things down.
One of the main goals and problems solved by VRR when Nvidia invented it was that you WOULD NOT have to turn down settings to get a perfect, tear and input lag-free experience even when your FPS dropped below native refresh.
And while it's great that you don't THINK you will ever fall outside of those bands, that doesn't change the fact there are COMPETING PRODUCTS THAT NEVER SETTLE for these kinds of limitations, that are priced slightly higher and do a BETTER JOB of handling VRR at both higher and lower refresh rates, with less ghosting and less input lag as well.
So yes, for someone with low standards, like yourself, this panel will be good enough, but for someone who wants and demands more, $200 will be a small premium to pay for superior results!
Hey everyone look at this mentally feeble green teamer who has nothing better to do with his life than submit to the nvidia marketing department and throw extra money at them. “Slightly” lol.
As if a gamer would accept framerates below 30, disgusting. Seems like you have the low standards with your casual games.
Calling names makes you look bad as well, so maybe let’s just keep things constructive?
And the point about 30FPS is very valid as we often get drops we have no control of. This is why we have a FRAME TIME analysis now in addition to frame rate, and also why we look at the MINIMUM FPS score in addition to the average.
If you want to avoid a LOW frame rate score or get more solid frame times you have to drop the settings or buy better hardware which then defeats some of the purpose of buying an asynchronous monitor.
Nothing was constructive there; that post was all the response he deserved. Quit feigning civility where it wasn't asked for.
Dude. Don’t you know? The AMD 8350 totally competes with the i7 in gaming experiences.
Maybe an i7 850 at like 2GHz, lol. My i7 4790k DESTROYED THEM
I have other hobbies that cost far more money than PC gaming at my level. Even with your 5% better gaming experience, I am saving 25% of my money on the monitor alone (the Acer G-Sync panel vs. this one). I also have an R9 290 that was $315 back when I got it almost 2 years ago, and it is still able to play all of the main titles totally fine at decent framerates with mainly max settings. AMD cards age far better than Nvidia… Just look at how the 780 Ti plays today when it was $700+ just over a year ago.
I know Nvidia products generally perform "better", but the premium for me is not worth it at all. It seems like most people who have Nvidia assume EVERYONE with AMD is sad, gets choppy framerates, and doesn't enjoy themselves with their "inferior" product, when NV only recently pulled ahead.
Are you playing on SLI right now with your G-Sync experience? If not, then why do you need it? Same thing goes for me. I only ever want to play with one GPU and avoid all the hassle of SLI/CF profiles.
Then there is all the future talk of DX12, with AMD supporting the more important lower-level functions etc., but that is a whole other topic that we don't know much about yet.
And that's great! The halo performance segment has always carried a hefty premium, and that will never change. If you want to stay below that 10-20% range and settle for 2nd best, that's fine and good, but don't sit here and say there is no difference between the products or that one isn't worth it, because just because YOU don't see the benefit doesn't mean it isn't important to someone else.
I do use SLI with my ROG Swift, actually, because a single Titan X wasn't enough; I still found myself turning stuff down when I didn't really want to. With 2x Titan X, it's no compromises and it just works!
As for the 780Ti vs. 290/X again, I buy cards for how they will perform until the next time I can upgrade. As much as AMD fans love to buy a card as if it is a marriage, I know my next upgrade is just 24-36 months away, so no I don’t put a whole lot of stock on how a card will perform in 2 years because chances are it will be too slow regardless and I will want to upgrade again. I buy cards for CURRENT and near future performance, not how well they aged when I’m ready to get rid of them.
See…MUCH better response…I can dig it, no trolling but I never mentioned there is no difference between the products.
Clearly you are a no compromise gamer. That is fine. I am a “put $10,000 into my STi” type guy when I will likely never actually compete.
I just want people to realize there is a value where my PC currently stands and it makes little performance compromises for big cost savings.
But of course I'm going to throw some snarky jabs in there to tease all the AMD fanboys; look at all the nonsense they spout in every single one of these reviews. What's the point of saying I told you so without having some fun with it? 😀
The problem is there is never an acknowledgement that a premium is justified; the first thing AMD fanboys say is "G-Sync is not worth the premium", but in reality, that's just them projecting their penny-pinching, budget-conscious value system onto others.
More discerning individuals like myself understand what G-Sync and VRR set out to fix, and once both technologies were featured and reviewed, many of us saw the potential. We relied on INDEPENDENT REVIEWERS to give us an honest account before we made the decision to buy into the tech or not.
We also quickly saw the red flags when AMD claimed they had not only come up with a suitable alternative, but one that was better than G-Sync in numerous ways. Except we could see that wasn't the case, and FreeSync still has many flaws today that mean it FUNDAMENTALLY fails to address many of the key points of VRR related to V-Sync and double/triple-buffering.
I’ll outline them here for your reference:
1) Screen tearing
2) Input lag
3) Framerate halving below max/native refresh rate.
I just got this monitor last night, and I can say without a doubt, coming from my older ASUS TN panel, that this has… NO input lag that I can notice and no screen tearing. Now, below 35 FPS you can slightly see the lack of smoothness, but it is no different than running my previous monitor at that FPS.
I believe he’s referring to the input lag coming from when you’re forced to game outside of the “freesync” window of 35-90Hz. If you game inside the window, you are basically golden.
If outside, the animation’s smoothness difference can be so jarring that input lag may come into play.
Your use of the reflexive in the first sentence of the third paragraph is incorrect. Don’t worry; I blame the system.
You're saving $200 on the monitor alone; in the EU you are saving ~190€.
Add that to the fact that you don't need a 980 Ti or Fury X to run this; you can use a cheap R9 290 and have a good experience.
You do realize that if you fall back to old cards and backwards compatibility, FreeSync is going to lose every time to Nvidia, right? Every single Kepler card is supported for G-Sync, while AMD limits support to GCN 1.1 and newer. Also, still no CF support for AMD, while any Kepler config in SLI is still a very real option for Nvidia users.
No, instead of getting a slower & louder GTX 980 + G-Sync monitor,
you can get the faster Fury X + FreeSync monitor AND still save about $100.
Less money, faster, quieter… it's a no-brainer.
Also, FreeSync is the new industry standard… good luck with your G-Sync monitor in the future. You lock yourself in, and Nvidia might not support it a few years from now.
Why are you comparing to the 980 and not the 980 Ti, which is (in most games) faster than the Fury X?
I do agree that the Fury X is quieter under load, but that pump whine is certainly louder than the 980 Ti at idle.
Good thing games aren't played at idle, right?
Well the thing is that we tend to not notice GPU noise (regardless of card) when gaming as you're listening to the game sounds / music, but then when you're done playing and everything quiets down, you are left with that pump whine, which tends to wear on you (especially in Crossfire).
Well, because AMD fanboys.
As if you aren’t an nvidia fanboy?
I’m a fan of the best and I’m completely comfortable in my own skin in that regard. Nvidia provides the best PC gaming experience today, if and when that changes, I’ll buy something else, but I doubt that will happen anytime soon!
What’s the excuse of all these AMD fanboys? Oh right, bargain, cheap, almost as good etc.
Because you compare things at the same price, Allyn.
a 980ti and a gsync monitor cost $200 more, so to offset that cost you would have to buy a cheaper GPU.
If money is no object when comparing products then what is the point.
listen I love this site and look forward to the podcast every week but your constant fanboyism for nvidia is tiresome.
Is that coil whine that bad at idle in a good enclosed case under a desk?
look I currently run nvidia products but I do not think it adds to AMD fans wanting to read your articles when you do not take into account apples to apples comparisons.
He's the worst Nvidia fanboy out of the whole group. It seems Josh Walrath is the only one that even understands anything about the architectures AMD and Nvidia employ. Unfortunately we only ever get to hear from the least knowledgeable of the group most of the time.
You should see the comments when Ryan from Anandtech rips Chizow a new one.
It was hilarious.
That was indeed hilarious and a welcome move from the more balanced Ryan. I have to agree 100% with FKR that this site's Nvidia bias is tiresome at best, all the more so when they attempt to "appear" impartial. As for Chizow, his pro-Nvidia troll posts are universally ridiculed at every tech site I have visited, and rightly so. The guy has issues…
Haha, what's funny is something I've noticed over the years: Nvidia fans, when encountering a problem, will raise hell and draw attention to it, usually directly to Nvidia on their forums, until the problem is fixed.
Meanwhile, when a problem is brought to light with AMD products, the only thought that goes through their fanboys' minds is how quickly they can damage control and sweep the problem under the rug.
Apparently you AMD fanboys don't understand that YOU are the only ones who are negatively impacted by AMD being lazy with their solutions, but more likely, it's because you're not even users of the actual products; you're just sideline cheerleaders of a product, just because. True fanboys, by definition.
We've seen it 2x recently, and your stupid anon handle is a good indication of your position on both.
FCAT/runt frames: Nope, not a problem, "PCPerversion" is lying, obviously a shill site for Nvidia. Lo and behold, they are the driving force in getting your junk CF solution fixed for you.
FreeSync: all kinds of problems with ghosting, broken OD, stutters/odd behavior at the low end of the VRR window. "PCPerversion" is lying, it's not true because AMD said so, obvious Nvidia shill site. Lo and behold, after more testing and confirmation, AMD admits the problem and says they are working with panel makers to fix it.
See a pattern here? Yep. AMD fanboys are ignorant and would rather sweep a problem under the rug to their detriment rather than demand better from AMD.
I fully expect a nonsensical “hueer dueerr you’re a fanboy” reply in response to my cogent explanation of why I think we as an enthusiast tech community are better off without the tech bottomfeeders that overwhelmingly prefer AMD.
You are what you hate… Don't you see that, you dumb monkey?! You enjoy trying to piss off anyone who chooses AMD and generalize all of them into one category. Sure, there are some AMD dudes that are self-absorbed high-cost-regret-insecurity assholes like yourself, but you almost tip the scales. There are shitty AMD fans and shitty Nvidia fans, but it doesn't mean everyone is.
I'm an AMD fan that loves PCPer. They get some kickbacks from Nvidia, but they still report on facts.
Not only is this the best monitor AMD has worked with, but PCPer highly recommends it, and even a 980 Ti owner bought it because it was $200 cheaper and he didn't care about G-Sync (on another forum). Clearly you don't care about forward progress and enjoy employing asshattery all day on forums. Feel sorry for you, bud.
I don't think I will ever understand idiots who blame an entire community for the actions of one, in the comments section of a review about a PC gaming monitor… hahaha
LMAO, are you serious right now? Look at all the AMD fanboys blaming Allyn/Ryan for shilling for Nvidia because they critically reviewed a product and backed it up with science, as any GOOD REVIEWER would do as a SERVICE TO THEIR READERS, to allow them to come to an INFORMED DECISION based on the MERITS OF THE ACTUAL PRODUCTS.
It's truly amazing; even you claim they are getting kickbacks to trash AMD products, as if you hope to get inferior products with all of these flaws and deficiencies relative to the competition.
You can claim all you like that it is a generalization, but READ the comments from the AMD fanboys and proponents, it's all the same: ad hominem attacks on the messenger. Rarely if ever do they actually tackle the technical issues, much less address them or voice their concern for them.
And we wonder why AMD has gotten away with this broken model of support and overall lackluster strategy of throwing out some half-assed, poorly supported standard, while taking zero accountability for problems when they occur. No thanks, the sooner we are rid of this tech bottomfeeder business model, the better for all of us.
Not kickbacks to trash AMD! I spoke with Allyn and he squashed my misguided understanding of Ryan's relationship with Nvidia. I'll own it right now.
I get your point about people bashing PCPer; that's not what I am saying. I was trying to get you to realize that you post 10 straight-up hateful trolling comments before you actually post the really valid points… Your method of delivery, I guess, is the weak link, not your points.
Nah, as you can see in this comment thread and others, it really doesn't matter how I deliver the message; at least the trolling gets some laughs, and I certainly get a kick out of contriving them. 🙂
AMD fanboys have this warped perception that any negative or critical press for AMD is a huge detriment to AMD and, by proxy, themselves, when in reality it only impacts the product they take home and use.
As for Allyn and Ryan, I can see their cynicism and ambivalence towards AMD stems from the simple fact people don't like to be lied to or misled, or treated as if they are stupid. AMD and their reps will LIE TO YOUR FACE and think nothing of it. It's the same reason I think so little of AMD and their brand at this point.
It's literally cringeworthy watching AMD reps make some of these claims; go back just a few weeks later and ask yourself, did any of this happen?
-Did HBM allow them to make the world’s fastest GPU?
-4GB isn’t a problem for 4K optimized gaming?
-Fury X is convincingly faster than 980Ti across the board as their internal benches indicated?
-Is Fury X an Overclocker’s Dream?
-Is the liquid cooler on Fury X really a benefit that enables cool and quiet operation when many of the pumps are shipping with an annoying whine?
Just some questions for critical thinkers to evaluate in regard to AMD. 🙂
How are we being partial, exactly? Are we making up benchmark results? Are we making up the differences between G-Sync and FreeSync? …or is it that you just don't like hearing facts?
I don't know, Allyn. We never heard how G-Sync had input lag, yet here was Tom Petersen on your live stream saying exactly that.
https://pcper.com/news/Graphics-Cards/PCPer-Live-GeForce-GTX-980-Ti-G-Sync-Live-Stream-and-Giveaway
What we did hear from you and the PC Perspective collective was a verbatim recital of Nvidia's talking points that were contradictory to what Tom Petersen voluntarily said without being questioned.
Either Nvidia and Tom Petersen were playing us for fools all this time with marketing, or you never bothered to question it given the contradicting evidence from BlurBusters on the matter.
Sorry, while Allyn and Ryan are more than capable of defending themselves, your statements are simply inaccurate.
Ryan has covered input lag at max refresh numerous times with Tom Petersen, and he has never once denied that at max refresh G-Sync behaves very similar to V-Sync On.
https://www.youtube.com/watch?v=Z_FzXxGVNi4#t=44m20s
However, Ryan, to his credit as usual, has mentioned that some gamers might actually prefer to have V-Sync OFF at the top end, above max refresh, to avoid any potential input lag, given tearing is generally less noticeable at high frame rates. He also notes in later videos/articles/interviews that this is something FreeSync is capable of and one of the only real deficiencies he felt G-Sync had relative to FreeSync.
Petersen said their goal for G-Sync was to NEVER have any tearing but he did see the merits of that viewpoint and as always, he was open to the idea of implementing this into G-Sync as they are always looking to improve it. Anyone who knows Tom Petersen’s plausible deniability methods knows this means there’s a good chance this gets implemented at some point and they are already looking into it.
And lo and behold, the fruits of open, honest, and constructive feedback! But for those that prefer Nvidia, this is a big reason why we prefer Nvidia. So yes, Ryan can take a bow for this positive change to G-Sync and as always, Tom Petersen is the best in the business.
http://www.geforce.com/whats-new/articles/g-sync-gets-even-better
“For enthusiasts, we’ve included a new advanced control option that enables G-SYNC to be disabled when the frame rate of a game exceeds the maximum refresh rate of the G-SYNC monitor. For instance, if your frame rate can reach 250 on a 144Hz monitor, the new option will disable G-SYNC once you exceed 144 frames per second. Doing so will disable G-SYNCs goodness and reintroduce tearing, which G-SYNC eliminates, but it will improve input latency ever so slightly in games that require lighting fast reactions. ”
Amazing how easy and painless change can be when a company is open and forthright about their products. Why does the dialogue for change never go this well with AMD and their fanboys? It's always resist, deny, finger-pointing, name-calling, etc., and as a result the products they use end up being the ones that suffer.
We were never able to reproduce any additional input lag in measurements. If there is any, it is immeasurable / negligible. Also, TFTCentral has measured the Swift and Acer Predator (both G-Sync) as the lowest-latency panels they have tested. AMD was passing out some PR guidance to reviewers showing supposed lower frame rates with G-Sync vs. FreeSync (implying additional lag), but this was not reproducible on our end. Some other sites just repeated those AMD slides, but I have yet to see any independent testing confirm it.
Jajaja
You don't take Tom Petersen's word for it.
What B.S., Allyn.
TFT Central measures fixed latency.
The AMD slides came out pre-FreeSync-monitor release.
BlurBusters and users pounded the issues from day 1 of the G-Sync kit. That's 1+ year to market of ignorance from PC Perspective, and now you're deflecting the issue when Nvidia's Tom Petersen not only mentions it but repeats it. Not only that, he also points to differences in his own products.
He/she/it is trolling on so many different tech sites, it's hard to believe he/she/it has time to go out and make money to pay for ISP and electricity, unless that is how he/she/it makes money? lol. I had heard about that article before but never looked it up; you got a link for that comment section?
You AMD fanclowns can't even keep your comments straight. I guess you are referring to Jarred Walton's comments in reply to me ripping his FreeSync slide-deck republish, where he foolishly declared the two technologies equivalent while writing an Engadget review.
Then when pressed about the differences, many of which were brought to light with advanced techniques Allyn and Ryan used here at PCPer, he used the "dog don't have an oscilloscope" excuse when it came to actual scientific testing. I simply called him out for writing a piece that was a huge disservice to the tradition of AT; sorry it rubbed you and other AMD fanboys the wrong way! 🙂
Hilarious, it's funny how he hasn't written anything at AT since then; maybe he decided to actually go write for Engadget instead.
The problem here is that you are one of the absolute worst at spreading misinformation and downright trolling, and you do it all over various tech review site forums, not just PCPer.
I read those comments on that AnandTech article, and you are wrong and Jarred is right; it's that simple.
Do yourself a favor (and the rest of us too): take your whole anti-AMD bit and turn it down several notches (if not off all the way). It makes you look desperate and childish. (Note: this last sentence is in reference to how you present yourself and the way you behave.)
On topic, this latest offering looks very compelling, but I still think they should have left this particular monitor in the development labs awhile longer until better scalers are ready.
What misinformation? Please feel free to outline it here, as I'm more than happy to point to references, including the work done by Allyn here at PCPer that clearly and definitively shows FreeSync is NOT equivalent to G-Sync.
Indeed, if AMD and their fanboys weren’t so desperately trying to misinform everyone into thinking FreeSync is an equal alternative to G-Sync, I wouldn’t have to say a thing!
On topic: So you can admit that FreeSync just isn't as good as G-Sync and thus G-Sync is fully deserving of any premium. That is certainly a start; you can begin by thanking PCPer for making that clear to you and others, because Jarred and AnandTech certainly were not up to the task.
I like how these guys accuse me and Ryan of being Nvidia fanboys in the comments of a review of a FreeSync panel that we are actually recommending.
Well, you know you did try to fix their broken ass products on several occasions by bringing them to light, surely that makes you an enemy and fanboy in their book.
Again, fkr, you don't pay a premium when products are better? Who cares if something costs more when it clearly carries additional benefits?
980Ti vs. Fury X, same price, 980Ti carries numerous benefits starting with being faster, more overclockable, and more VRAM to run the games and resolutions you would expect from a flagship card. And that’s before you get into the ecosystem and support benefits of one vendor over the other.
Then you get into G-Sync vs. FreeSync, again, if your primary purpose is to buy a panel for VRR and keep it for a few years, why settle on a FreeSync panel that still has worse ghosting/overdrive, has minimum VRR windows (outside of which VRR breaks), has lower max refresh rates, and also has worse input lag within your VRR window?
G-Sync costs more because everything behind it is BETTER.
I am not just a gamer, though. I like to do Folding@home and also have to decode a lot of video. I think the Fury X may work out really well for those tasks.
As to who cares about things costing more: I do. I have kids and college funds. Swimming lessons, horse lessons, fcking ninja lessons, son.
It's not 'coil whine', it's the pump. The sound certainly gets to some people from the 'sounds' of it. We are working on a short piece comparing the sound profiles of the different cards, but trying to present something so subjective in a precise manner is taking some time.
Yep, the same pump whine at least 3-4 other reviewers commented on, but AMD assured us it didn't make it to retail parts, yet we see numerous reports from users on YouTube and forums stating this is a problem. Now AMD is saying it will be fixed in FUTURE revisions… fool me once, shame on me… fool me twice…
But yeah, I think that's a key thing dB readings don't pick up: the pitch of a sound and the annoying nature of low-frequency noise, like buzzing from a low-Hz pump, for example.
LMAO @ Chizow!
Reading your posts is a RIOT EVERY SINGLE TIME DUDE. Thanks!
But I fear for your health… you need to relax bro, take a chill pill or something.
Getting so fired up can’t be good for you despite the fact that it’s so amusing to watch.
Try repeating over and over to yourself, “It’s just a graphics card, not the most important thing in my life.”
If that doesn't help, perhaps Allyn or Ryan could recommend a good psychiatrist?
Hey look, more nonsensical rubbish from some anon AMD fanboy posting under a troll pseudonym.
I know it's easy to laugh at all the BS AMD spouts and their fanboys lap up, every last bit of it. Glad you're enjoying the show!
Allyn your nVidia bias is now confirmed. The pump whine was only in early batch / reviewer products and you know this and/or you don’t know for sure that pump whine exists in newer Fury X parts. Using that argument against AMD is appalling.
LOL, seriously? This is exactly why I can't stand AMD and their fanboys; it's like you all WANT cards with pump whine.
https://www.youtube.com/watch?v=dK255ofJPhk
https://www.youtube.com/watch?v=yLW7cZPW2fA
https://www.youtube.com/watch?v=xR3TZxu9vJI
Philosophical question:
If a Fury X pump or coil whines, and no AMD fanboy admits to hearing it, does it make a sound? 😀
My argument is backed up by the additional pair of Fury X's sitting in our office – both of which are *louder* than our sample. We got those retail units to confirm what AMD had told us (that it was fixed in retail). Apparently it wasn't. What you are interpreting as Nvidia bias is actually AMD repeatedly having to go back on what they tell reviewers.
OK point taken, pity it had to be pried out of you. Next time explain the facts in full the first time and you may sway people to your inner green leanings.
Another option would be that you not jump to wild-ass conclusions and name-calling when you don't have ANY facts at all.
Instead of taking the time to back up each and every little point being made in comments, we were busy taking sound measurements and collecting other data for the upcoming post about it. A necessary sacrifice for the greater good, I suppose.
Edit: Here are those results: https://pcper.com/reviews/Graphics-Cards/Retail-AMD-Fury-X-Sound-Testing-Pump-Whine-Investigation
Who needs luck when Nvidia has done a better job of supporting a superior technology that they invented, out of the gate?
Is AMD doing a good job blaming monitor partners and vendors when they design a half-baked spec and it ends up being broken, requiring firmware and driver fixes to get to a point where it's still not as good as G-Sync?
Is AMD doing a good job when they STILL haven't delivered a CF FreeSync driver?
And as Allyn already called you out on, why are you comparing to a GTX 980 when you can choose a 980 Ti + G-Sync monitor and NEVER SETTLE for inferior AMD products? Sure it costs more, but it's also better.
FreeSync isn't an industry standard at all, and based on the lackluster first wave of panels compared to the 2nd generation of outstanding panels from G-Sync partners at Computex, it is obvious that G-Sync will continue to be a premium option.
Oh, and also: the ONLY option when it comes to mobile/laptop VRR gaming.
Please stop trying to align yourself with every author of every article you comment on. Anyone with any respect for themselves and the rest of the reader base will ignore you and anything you have to say. You've been an Nvidia troll on almost every tech website I've ever visited. You have a financial stake in Nvidia, or so you've mentioned in the past; therefore you're no use to anyone and your opinions cannot be trusted.
Who said I aligned myself with every author, idiot? Do you not have the ability to critically think, engage and question what you read? I just happen to highly agree with Allyn and Ryan’s take on this issue among many others and their degree of skepticism and critical reviews of FreeSync are largely responsible for educating the masses on the topic and getting AMD to address their FreeSync solution’s deficiencies.
The fact you and every other AMD fanboy just wants to dismiss and sweep these problems under the rug is what is laughable, but unsurprisingly, you keep getting the same 2nd-rate half-baked solutions from AMD because they know quite well the kind of tech bottomfeeders they cater to in the marketplace.
ROFLMAO @ Chiz this is just so good…
And this is what the typical AMD fanboy falls back on when cornered. That is the most amusing to me XD
lolol yep.
He already explained why he was comparing the 980 with the Fury X, moron.
Yes, and he stupidly proved he's an AMD fanboy, just as you have for backing him.
The buying decision should hinge on the GPU first and foremost, so obviously any non-idiot is going to make that decision first, and comparing a 980 to a Fury X is something only an AMD fanboy would do when anyone who wasn't a devout AMD fanboy would just pick the much better 980 Ti for the SAME MONEY.
Then once you've "locked in" that decision, the monitor decision becomes clear: why not spend a small premium on a much better VRR solution to take advantage of your faster GPU? If the difference is only $200 on a $1200+ expenditure, why not go for the better solution in all respects? In that light, the choice makes a lot more sense.
I did just that. Putting that much money up front for this kind of setup is a bit more than I'd planned, but if it gets me the better product and experience, it's worth every penny, and I'm sooooo happy I did put up that little bit more for that G-Sync monitor, which makes my PC gaming that much more enjoyable.
There is still no FreeSync equivalent out there, period, plus the ghosting and overdrive issues mentioned all over the net, with proof, were a big no-no for me and others.
Exactly. Why settle/compromise and wait/hope for improvements that address FreeSync deficiencies when you can just buy G-Sync panels that don't have any of these deficiencies, and have not since day 1?
It's great to see PCPer's work in publicizing the issues has paid off and allowed those actually in the market to make INFORMED decisions.
At 1440p I would not get a Fury X. That card seems to really underperform at everything below 4K.
Still trolling?
Didn't you learn anything when Ryan Smith from AnandTech put you in your place?
Haha, yeah, and then Ryan was conveniently sick so he didn't join in on the parade of sites raining on Fury X's review.
It’s all fail for AMD fanboys, just like I said.
Rebrandeon 300 series happened.
Fury X failed to impress, despite AMD’s bogus benchmarks it is slower than 980Ti
4GB is a problem
It most certainly is NOT an Overclocker’s dream.
The list of rubbish goes on, but AMD only gets away with it because their target audience just isn't smart enough to call them on it.
Nice double post, quit using IE.
Really that is your best counter…..lol IE? No comeback Nvidia banter? XD
To play GameWorks titles at 30Hz? No thanks!
Exactly….
Nvidia, even if only current GPUs are capable, please start supporting FreeSync now.
If they supported frame multiplication, then it seems like they could set the frame limits to the higher range. For a 144Hz display, it seems like they could set the minimum at 72 and frame-multiply anything below that to stay in the 72-144Hz range. Attempting to support 35Hz directly at the panel seems to be difficult, but we know it is doable with a smart enough scaler combined with specific panels. Nvidia seems to limit the possible panels to use with G-Sync. I assume that it is too difficult to get the overdrive to work properly with some panels. This may be due to inconsistencies across the panel: not every pixel will have the exact same response curve, especially across a wide range of frequencies. They may pick G-Sync panels based on consistency of this response.
It doesn't seem like a bad idea to support frame multiplication at the video card, since it should have more information about when the next frame is ready than a scaler that is guessing based on limited statistics. There really isn't much stopping scaler makers from implementing technology similar to what G-Sync does (frame multiplication plus variable overdrive), though, and a sufficiently smart scaler should be able to guess quite well. I think AMD may believe that doing frame multiplication would just be a temporary solution until scaler makers update their hardware to properly support VRR interfaces. This would lead them to not want to implement it, because the scaler makers would then not have a reason to add it to their scalers. I am leaning towards not adding it to the video card: interfaces exist to hide such implementation details. Ideally, the video card would support a wide VRR window and the display would do whatever it has to (even frame multiplication) to make this look its best, depending on the specific characteristics of the panel. The video card maker should not have to concern themselves with certifying panels; it should just work with whatever displays support the proper interface.
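To sketch the 72-144Hz idea from the paragraph above: as long as the top of the window is at least double the bottom, repeating each frame some whole number of times can map any rate below the floor back into the window. A hypothetical Python illustration (the function and constants are mine, not any vendor's implementation):

```python
import math

def into_window(fps, lo=72.0, hi=144.0):
    """Map a frame rate into the [lo, hi) window by repeating frames.
    For fps < lo, the smallest n = ceil(lo/fps) satisfies
    lo <= fps*n < lo + fps < 2*lo <= hi, so the result always fits
    as long as hi >= 2*lo (illustrative sketch only)."""
    if fps >= lo:
        return fps, 1          # already inside the window
    n = math.ceil(lo / fps)    # minimum repeats to clear the floor
    return fps * n, n

for fps in (90, 71, 35, 20):
    rate, n = into_window(fps)
    print(f"{fps} FPS x{n} -> {rate:g} Hz")
```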
I don't think Nvidia's proprietary solution is good for the market. If you want to make a G-Sync display, you have to use Nvidia's scaler; the G-Sync module is really just a smarter scaler. Display makers probably do not want vendor lock-in either. This should set off some competition among scaler/controller makers, but it will take some time for them to catch up. There is nothing stopping them from doing frame multiplication in the scaler with FreeSync, just like G-Sync does. It will be a more expensive scaler, but probably not much more once they start making actual ASICs. I would expect Nvidia to get rid of the FPGA eventually as well, once production volumes are high enough.
It still seems like Nvidia offers the better solution currently, if you are okay with matching a G-Sync panel with an Nvidia card. Although this FreeSync panel shows more ghosting, how noticeable is this during actual gameplay? I personally already have a very nice fixed-refresh display, so I would wait until this settles a bit.
Nvidia doesn't make the G-Sync scaler; Altera does.
Intel recently bought Altera, which might pose a problem going forward depending on what Intel plans to do with Altera.
Altera makes FPGAs (field-programmable gate arrays). These are just generic chips that can be programmed to perform the function that Nvidia wants. Nvidia is fabless, so they technically don't "make" much. Nvidia almost certainly creates the programming for the FPGA. An FPGA is a good solution for low-volume parts where it may not be worth it to make an ASIC. Also, FPGAs allow the functionality to be updated almost completely after the product is shipped; you would just write a different program into the ROM which stores the programming for the device, which would be loaded on startup.
There are obviously limitations on the functionality of the FPGA. You can only emulate a design up to a certain size depending on how large the FPGA is. It also has a limited number of off-chip connections, with limits on the kinds of signals they can produce. I would expect Nvidia to make an actual ASIC eventually; that is, they would take the design and actually make a chip implementing the functionality rather than programming the design into an FPGA. Nvidia isn't really limited to Altera for FPGAs; they could probably port their design to a different FPGA relatively easily.
Altera just makes an FPGA which can be programmed to do the work. I would be curious to know which FPGA it is. I just noticed someone reporting it as an Altera Arria V GX FPGA. These appear to be $200 to $500 in small quantities. Nvidia would get them a lot cheaper in large quantities, but it will still be quite expensive. In large enough quantities it will be much cheaper to make an ASIC, but the initial cost is large.
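As a back-of-the-envelope illustration of that FPGA-vs-ASIC trade-off (every number below is a made-up assumption, chosen only to show the shape of the calculation):

```python
# All figures are hypothetical assumptions for illustration.
FPGA_UNIT = 150.0        # assumed bulk price per FPGA, USD
ASIC_NRE = 5_000_000.0   # assumed one-time design/mask cost, USD
ASIC_UNIT = 15.0         # assumed per-chip cost in production, USD

# The ASIC pays off once its NRE is amortized by the per-unit saving.
break_even_units = ASIC_NRE / (FPGA_UNIT - ASIC_UNIT)
print(f"ASIC becomes cheaper beyond ~{break_even_units:,.0f} units")
```

Under those assumptions the crossover sits around 37,000 units, which is why an FPGA makes sense early on and an ASIC only once volumes are assured.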
Yes, frame multiplication could be done in the driver. Nvidia is doing it for Mobile G-sync, so it's definitely possible.
I would expect to see price listed under the "PROS:".
Anyway, thanks for the review.
So, um, WTH is Color Saturation, and why is sRGB listed next to it? sRGB is saturation-free. 🙂
The term you are looking for is Color Space, and the percentage is the coverage. This is important in the new television standards that are coming out: they won't be using sRGB, they will be using BT.2020 coverage.
Saturation is not referenced in sRGB, other than its relation to the white point it uses.
Also, it might be time to start listing monitor bit depth again, and whether it is using frame dithering to achieve it. The new standards require 10-bit color, with the new Blu-ray standard also supporting 12-bit color.
As these displays become more common, they will trickle down into the PC market, if only because they will offer the best picture you can get.
I wouldn't be surprised to see games remastered in Ultra HD or HDR in a few years' time.
That's an awful lot of analysis about what is likely just a bad translation of a table.
Agreed on the bit depth, but these gaming panels are going to be 24 bit (3×8 bit) for a while longer.
It took me less than 60 secs to analyze the faults of the table, and I had to translate it for it to be useful. So, yes, it was an awful lot of analysis that I shouldn't have had to do.
Now, will you make it better next time? If yes, then it wasn’t wasted, and I enjoy helping make the quality of the site better, in my own little way.
1000:1 contrast…
VRR will be relevant once it’s on something other than LCD.
Bought this monitor and I have been very pleased with it. AMD 290 driving it at 40-75 FPS in games. FreeSync makes gameplay amazingly smooth. There is light bleed in the lower right corner, though even in darker games I do not notice it; in fact, even if I look for it I can't see a color difference in anything other than pure black. Took brightness down to 70 and lowered green and blue down to 96 for calibration (red still at 100).
If you have an AMD card and want FreeSync, this is the monitor to get.
I'll add that I upgraded from an HP Z24, which has been great; but the tearing…
FreeSync alone was worth the price. Everything else is a bonus.
The notion of ‘VRR windows’ still seems silly to me. :-/
Well, a bunch of the manufacturers don't see it as a money-making spec right now… Still seems silly though.
It is what it is. Current "FreeSync" technology, with its standards integration, can't compete with G-Sync's superior ability to avoid dealing with these windows.
Finally, a monitor that's worth getting.
FreeSync or not, it is a 144Hz IPS for non-FreeSync-supporting cards. Compare that to my current 60Hz IPS; hmm… dreamy.
For FreeSync cards, 35 to 90. Ohh, I want. Though I hope that with time the sub-35Hz side gets additional GPU support, like frame doubling, to allow the screen to still run in FreeSync. From what I read on various sites, that is what Nvidia's G-Sync does (correct me if I'm incorrect on that).
Still, I'd gladly upgrade my non-adaptive-sync 60Hz IPS to an adaptive-sync 35-90Hz (or 144Hz for non-adaptive-sync GPUs) IPS.
If I got used to static 60Hz, I don't see any problems in getting used to an additional 30Hz at peak 😉
Now to save up them złoty (I live in Poland :P)
I want a 34 inch version.
35Hz is good, but not good enough yet for me personally.
I need a panel that would be able to go to 28~30Hz AT THE VERY LEAST while also being an IPS/PLS screen type and having either thin-bezel or outright bezel-less design akin to that of the Dell’s U2414H/U2515H/U2715H monitors. Such FreeSync monitor would be simply just perfect for me personally, even if it will be just 1080p 24″/27″. This here MG279Q is not quite there yet, is what I’m saying. It’s close, but not there yet. Maybe two more years, or so.
Even the people with AMD graphics cards that I know (7950, 7970, R9 290) still haven't bought a FreeSync monitor. So far one said there isn't a good one out yet that he wants, and this one did not meet his requirements: the VRR window is still too small, and he also wants up to 144Hz VRR.
Not to mention the ghosting & overdrive issues still present….
The truth is that both G-Sync and FreeSync are still immature technologies. But this monitor shows how far they have come in a year.
I want my gaming monitor to support the highest possible frame rate with the least possible ghosting and overdrive issues.
I will start building my new gaming system at Xmas. I have been a green team member for years, but I still have an old AMD system somewhere.
My one complaint is that we now have to choose to match monitors to GPUs. That sort of lock-in (Nvidia!) is not acceptable.
I have been using the Acer Predator XB270HU and it's been fantastic. No immature tech here.
I have been playing even more since I have had this thing. Not a single flaw in gaming or general PC use. Best experience since 120-144Hz monitors were first introduced!
Honestly, I don't care so much about the VRR window size, but look at page 3; that ghosting is not acceptable for me. Have you ever tried a ULMB display?
I tend to change monitors less often than graphics cards; that's why I waited some time before switching from CRT to LCD. The new monitor needs to be overall better than the one I'm replacing, and not half better and half worse.
It's nice to see AMD improving its solution, but I cannot overlook that NVIDIA found a solution and solved those issues before commercializing G-Sync.
Definitely agree with you!
Does FreeSync support borderless windowed mode gaming, or is that only supported by G-Sync for now?
That's G-Sync only (and very recent at that).
Very petty of me, but I wish it had the anodized aluminum headset holder like the… Acer, was it? Thought that was a really cool feature, because my $300 headphones should not be on the desk (and should have come with a stand).
That was the BenQ.
Great review guys.
When will we get the AOC G2460PF review?
Got this monitor last night. Newegg shipped my 4-7 day shipping item in 1 day.
So far it is blowing my freakin' mind!! I had the ASUS VG236H 23-inch 1080p TN 3D monitor from 2010, went to this, and I couldn't go to sleep last night. Played Witcher 3 at 1440p max settings with my R9 290, hovered around 37-45 FPS the whole time, and I get the same smoothness now with HairWorks on that I got before with HairWorks off (which put me at 60 FPS with it off at 1080p).
Only thing I wish I could do is switch FreeSync on and off with the push of one button, or have it go back to 144Hz when I exit a fullscreen game or something.
Can't you just use FRTC (Frame Rate Target Control) in the AMD drivers and set it to just under 90, so that you never go over 90 and therefore remain within the FreeSync range of this monitor?
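For what it's worth, FRTC is just a driver-level slider, but the underlying idea is an ordinary frame-time cap. A toy Python sketch of the concept (the numbers and names are illustrative, not AMD's implementation):

```python
import time

TARGET_FPS = 88                  # a little under the 90Hz FreeSync ceiling
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def capped_loop(render_frame, frames=1000):
    """Toy frame-rate cap: sleep off whatever is left of the frame
    budget so output never exceeds TARGET_FPS, keeping the monitor
    inside its 35-90Hz FreeSync window."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```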
These damn prices. Looks like I'm skipping FreeSync.