Gaming Impressions, Pricing, Conclusions
Though I find myself being repetitive in these sections of our monitor reviews, some discussion of gaming on a gaming-specific monitor is clearly necessary. The ASUS MG278Q is based on AMD's FreeSync technology, part of the DisplayPort 1.2a+ specification known as Adaptive Sync, and in my testing the technology works very well.
I was able to play all of our normal PC gaming titles at the MG278Q's native 2560×1440 resolution without issue using an R9 390X as the rendering source. If you are a stalwart 1080p user considering an upgrade to a higher resolution monitor, keep in mind that you will likely need more GPU horsepower to continue playing your games at the image quality settings you are used to. Anyone with an R9 380X or higher should be able to use the MG278Q without issue, and even if you do hit lower than expected frame rates in certain games, FreeSync keeps the experience from becoming a poor one.
Not only does FreeSync in the MG278Q support a 42-144Hz variable refresh range out of the box, but thanks to the latest AMD software update it now supports frame doubling, allowing for smooth, variable refresh rates below 42 FPS as well. This is a big change for AMD and for its monitor partners, as it means that FreeSync is now very close to matching the quality and experience that you get from G-Sync.
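A rough sketch of the idea described above — this is my own illustration of frame multiplication, not AMD's actual driver logic; the `lfc_refresh` helper and the simple integer-multiplier approach are assumptions for demonstration:

```python
# Illustrative sketch: how frame multiplication can keep the effective
# refresh rate inside a monitor's variable refresh window by repeating
# each frame an integer number of times when the game FPS drops too low.

def lfc_refresh(fps, vrr_min=42.0, vrr_max=144.0):
    """Refresh rate the panel would run at for a given game frame rate."""
    if fps >= vrr_min:
        # Inside the native window: refresh simply tracks the frame rate.
        return min(fps, vrr_max)
    # Below the window: repeat each frame until we are back in range.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

print(lfc_refresh(30))   # -> 60 (each frame shown twice)
print(lfc_refresh(20))   # -> 60 (each frame shown three times)
print(lfc_refresh(100))  # -> 100 (no multiplication needed)
```

The new-content rate is still the game's frame rate; the panel simply refreshes with repeated frames so it never falls out of its 42-144Hz window.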
For non-gaming usage, the MG278Q is an adequate solution. The 144Hz refresh rate in Windows means that mouse movement, video animation, etc. are incredibly smooth, with essentially zero blurring. However, the TN panel does come with the caveat of narrow viewing angles and color shift from off-axis viewing, as we showed you on the previous page. If you are a gamer who uses your computer for school work and web browsing when Steam is closed, the MG278Q will still get the job done. If you are looking for a color-accurate display for anything beyond that, you'll want to look into the various IPS-based options, like the ASUS MG279Q.
Pricing and Availability
As of this writing, the ASUS MG278Q sells on Newegg.com for $399 and is out of stock on Amazon.com, but sells through third parties for $475 or so. At $399, the ASUS MG278Q is nearly $200 less expensive than the MG279Q, the IPS-based 144Hz FreeSync panel also from ASUS that shares similar build quality and features. The ROG Swift PG278Q is still in the $500+ range when you can find it as well, and it represents the closest competition from ASUS on the G-Sync side of the debate.
For the price, there are other options that compete with the package that the ASUS MG278Q provides. The Acer equivalent display is about $70 more expensive on Newegg but doesn't have the polish that the latest ASUS monitors include. If you are just after FreeSync support you can get displays for less money, but you'll be going back to a 1920×1080 resolution – and I think that once you move to a 2560×1440 display there is no turning back.
Closing Thoughts
ASUS has pretty much covered all of its bases for variable refresh rate monitors. If you want a low-cost FreeSync display with a TN screen, they have it with the MG278Q. If you are willing to pay more for the upgrade to an IPS screen with slightly slower response times, they have the MG279Q. And if you are a GeForce user rather than a Radeon user, the ROG Swift line has you covered in both of those areas as well. I expect we'll even see these options expanded upon with CES just around the corner in January of 2016. (Though, to be fair, even if we see new monitors at CES, their availability could be much further out…)
If you are, or plan to become, an AMD Radeon GPU owner in the near future, then there are few monitors that truly meet all of the qualifications the ASUS MG278Q does. For under $400 you can move into the world of variable refresh rate gaming, one that I think offers more advantages than nearly any other shift in the gaming marketplace in the past few years. AMD and the Radeon Technologies Group seem more dedicated to FreeSync than ever after our talks with them last month, and the addition of Low Frame Rate Compensation makes the MG278Q an even better product. With a large 27-inch screen and a 2560×1440 resolution, you have all the makings of a great PC gaming experience. The only drawback is the TN screen; that is the trade-off for the price advantage the MG278Q provides.
I know Nvidia has 70+% market share in the discrete GPU market. But they will have to come up with something really great if they want to keep a $200 premium on their monitors. Or (and this would be ideal) cut the G-Sync crap and let's all work on Adaptive-Sync implementations (will never happen, I know).
They have 80%+, and the more they sh1t on their customers, the more people buy their products. So don't expect them to cut the crap. Expect them to get worse.
They sell partially on having a perception as a high-end brand, so even if AMD offers much better value in the buyer's price range, many of them will still buy Nvidia. They can sell G-Sync based on the same type of marketing even though FreeSync is almost exactly equivalent now.
NVidia doesn’t want a premium on their monitors. They don’t sell monitors, only the GSync module which is currently expensive.
NVidia wants to see monitors sold to drive GPU adoption so ideally they would want GSync monitors to be cheaper than Freesync.
Since the GSync module replaces the "guts" (scaler, etc.) of the monitor, they can likely do things that Freesync cannot.
At some point monitor manufacturers will have to add newer features like light strobing while asynchronous mode is active. If GSync ends up supporting that natively but Freesync requires the manufacturer to implement the feature themselves, we could see Freesync costing more than GSync, or simply not offering certain features.
The G-Sync parts are not worth shit; they charge royalty fees to manufacturers to use them.
Another asus monitor with aggressive matte coating.
Have the balls to release a glossy panel.
Do you really want that? If so, why?
I’m not the person you’re quoting, but I prefer glossy because it looks better in a good (dark) environment and doesn’t have the blurring that many aggressive coatings have. Matte may look slightly less atrocious under direct light, but every display looks so terrible in those conditions that anyone who cares in the slightest about image quality will make sure to avoid direct light anyway.
That was my comment.
Colors look ten times better without.
I can tolerate the lighter IPS matte that Acer/ASUS use on their IPS G-Sync monitors, but the film on these TN panels is too heavy. TN colors look bad enough; they don't need an aggressive matte on top of that.
Also, there's not a single glossy display for gaming; it would be nice not to be all one-sided.
We have control over the lights in the room; we have no control when they slap that ugly coating on the panel.
Even tablets/phones and TVs don't use it anymore, for obvious reasons.
edit:
*Also not a single glossy display*
Not a single glossy G-Sync/FreeSync display
I was under the impression that most TVs have anti-reflective coatings, at a minimum.
There are glossy monitors with anti-reflective coatings. The terminology can get confusing though.
“thanks to the latest AMD software update it can now support frame doubling to allow for smooth, variable refresh rates under 42 FPS as well”
This is great news for Adaptive Sync. Now we just need to get NVIDIA to support Adaptive Sync monitors too. Adaptive Sync monitors outnumber G-Sync about 3:1 in terms of number of models available worldwide. What did people think would happen with a free vs. expensive implementation?
G-Sync, it’s been nice knowing you.
^ and I mean that sincerely as an ASUS VG24QE owner.
AMD’s frame doubling still requires a 2.5X ratio minimum (i.e. 30Hz to 75Hz, but NOT 30Hz to 60Hz). This is due to the frame creation being driver (software) driven.
People keep saying NVidia’s GSync will disappear, but keep in mind that the cost of the GSync module will continue to drop, and the fact that they are working on newer versions that will do things that Freesync simply can’t (like light strobing whilst asynchronous mode is active).
Sure, AMD Freesync monitors can have features added like I just mentioned but that would be done by the monitor manufacturer which will cost them money thus be reflected in more expensive monitors.
Conversely, monitor manufacturers will be able to incorporate features that GSync already supports. So the cost of the GSync module will shrink but the features offered will increase.
So it’s not quite as simple as “free vs expensive” if you understand the monitor market.
(We wouldn’t even have Freesync if NVidia had not created GSync… and NVidia had no incentive to do that without making it proprietary so why hate on NVidia? It may not be ideal, but at least we have asynchronous monitors for smooth gaming and that’s awesome.)
Hello, it's not the parts; it's a fee the manufacturers pay to Nvidia.
Hey Ryan/PCPer
Why not test the VRR range for every VRR monitor, not just FreeSync?
It would be interesting to know how low the PG279Q panel can be driven, as well as the other x-Sync monitors, to see whether the module gives an advantage or not. AFAIK, you only did that for the original (TN) ROG Swift and then stopped. 🙁
If the frame rate multiplication works as it should, then there isn’t much to report. I would hope that they actually verified that it is working correctly.
It wouldn't matter. G-Sync, FreeSync, Adaptive Sync or whatever Sync: if you are at 10-15 fps, the motion will NOT be fluid.
It may not matter to you. It surely matters to Ryan, since he tested it and chose to mention it in the review.
And I’m not speaking about 15 fps/Hz like you tried to insinuate but rather the range down to 30 fps/Hz, or even 23.976 fps/Hz.
It's simply a matter of curiosity. Does the G-Sync module allow the OEM/Nvidia to control the VRR range better or not?
Is the module needed to drive those VRR ranges beyond 144Hz to 165Hz and 200Hz, or not?
Does the panel behave differently when controlled by the G-Sync module versus a normal scaler with Adaptive Sync capabilities?
Are there differences between G-Sync and FreeSync monitors using the same or a similar panel, beyond software/hardware features by the IHVs?
If you are getting at least a minimum of 24fps from your card, then G-Sync, or the LFC mode of FreeSync, will double the frame rate, and you only need the monitor to support a minimum limit of 48Hz adaptive sync. Almost every monitor does that today. The only problem is with monitors whose upper limit isn't at least double the lower limit, which is needed to support LFC. With those monitors the best option is to lower the graphics settings and get a good frame rate.
So I wasn't trying to insinuate anything, but only guessing that wanting to go as low as possible would only be meaningful if you were thinking of frame drops down to 10-15fps. In that case I don't think Adaptive Sync of any kind would manage to magically create fluid motion.
Now, comparing G-Sync and FreeSync when those two techs come with a $100-$200 difference in price was never, in my opinion, valid. The difference in pricing is huge. Even without LFC, FreeSync was a better option because those $200, in the case of the more expensive monitors, can buy you a better card. You can go from a 390X to a Fury X. Even those $100, in the case of cheaper monitors, can move you from a 380 to an 8GB 390. So, even if G-Sync gets more points, the fact is that it's very bad value for money and an option only if you already paid for a 980Ti or a Titan.
In the end, every reader in PCPer could be curious for a number of things. Even a 50 page review would leave something out.
You don't have a clear understanding of how frame doubling via AMD's software implementation works. There's a good article at PCPer relating to the Crimson drivers that you can find here.
First, to work properly you need a range of at least 2.5X (i.e. 30Hz to 75Hz).
Secondly, the AMD driver is simply telling the monitor to repeat the same frame so that it stays in its supported asynchronous range. If we talk FPS (not frame times) then if the GPU outputs 16FPS normally then it’s told to REPEAT each frame so we get 32FPS (32 refreshes) on the physical monitor.
So it’s still effectively 16FPS in terms of new content on the screen, but it is a lot smoother than normal 16FPS since we are again staying in asynchronous mode.
(I don’t know where you came up with the “48Hz” value either. If it was a 30Hz to 60Hz monitor for example then any FPS the GPU is generating within that range keeps you in asynchronous mode. Frame doubling is ONLY required when your GPU can’t output at least 30FPS.)
I assume we need “2.5X” not 2X due to some latency in the driver software.
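The 2.5× rule the comment describes can be expressed as a simple check. A sketch only — the 2.5 factor is the figure cited above by the commenter, not something confirmed from an official specification:

```python
# Sketch of the claimed LFC eligibility rule: the panel's max refresh
# must be at least ~2.5x its minimum for frame doubling to engage.

def supports_lfc(vrr_min, vrr_max, ratio=2.5):
    return vrr_max >= vrr_min * ratio

print(supports_lfc(30, 75))   # True  (75 >= 30 * 2.5)
print(supports_lfc(30, 60))   # False (the 30-60Hz 4K case mentioned above)
print(supports_lfc(42, 144))  # True  (the MG278Q's 42-144Hz range)
```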
If not clear, it’s a doubling on the computer side such as 16×2=32FPS (output of GPU). So, AMD handles this in its AMD driver software (which works pretty well), however NVidia has a lookaside buffer in the GSYNC module to handle this.
(it’s actually not a big deal, but perhaps AMD can stop talking about NVidia’s supposed latency due to the GSync module since AMD’s solution requires a driver fix which is obviously not instantaneous.. in fact it’s why the range of the monitor needs to be “2.5X” such as 30Hz to 75Hz which is a big problem for monitors like 4K, 30Hz to 60Hz asynch range).
Okay, I see where you got “48Hz” from. Maybe you understand clearly, or maybe not but note the “2.5X” range for proper support.
Again though, as per my other comment note that many games drop to very low FPS values for short periods of time so staying in asynchronous mode is especially important here.
It's not about "magically" creating fluid motion but rather getting the smoothest motion possible. If we normally got about 15FPS but were in VSYNC mode (to avoid tearing), the screen stuttering would likely be horrible. Low FPS + stutter + buffer latency is arguably the worst-case scenario for gaming.
On the other hand 15FPS in asynchronous mode is much, much better.
(I think you might be surprised how often the average game drops below 30FPS. In some cases you need to crank up the average FPS by at least 4X to avoid these drops completely which then obviously has a big visual trade-off. Very few people seem to understand how much these drops affect perceived smoothness.)
Sudden DROPS to a low FPS range is where asynchronous support makes the biggest difference.
You wouldn’t want to game around 15FPS all the time, but it’s still very common to have sudden drops so anything that improves that is important.
Even if the ratio of FreeSync to G-Sync goes up, which I suspect it will just on ease of implementation, what value does Nvidia get from supporting FreeSync? By supporting their own VRR standard they get licensing revenue and are able to co-market their video cards with monitors. The only way I see Nvidia getting on board is if FreeSync out-evolves G-Sync, thus making G-Sync a marketing liability.
G-Sync is like PhysX: Nvidia will ride this all the way to obscurity. They will not drop it; I think the market will mostly shift away from it due to the ease of implementation of Adaptive Sync/FreeSync.
Nvidia seems to have put research into how to implement variable refresh without changing the standard, but the proper way to do it was to just change the standard. The only advantage right now seems to be slightly better overdrive, but it is unclear how much of this is better overdrive calculations and how much is choice of panels. I don't think the slightly increased ghosting on FreeSync panels is going to be noticeable under normal use, so the price premium on a G-Sync panel is not worth it. For the Nvidia fans, I guess they just have to pay the price premium. Nvidia will probably continue to deny FreeSync support for a while and then, as quietly as possible, start supporting FreeSync; maybe they will try to come up with some new feature on top of FreeSync to differentiate.
Variable Refresh was already a part of VESA Embedded Displayport 1.3 (eDP) in form of the Panel Self Refresh (PSR) spec. Nvidia is a member of VESA. They could have easily suggested VRR to be implemented in the DisplayPort Spec. AMD did just that. They took the PSR spec and suggested it to be enhanced and implemented as Adaptive Sync. Now it’s part of both Specs and Nvidia actually benefits from it by using it to enable Mobile G-Sync on Laptops through the eDP Standard.
Without eDP and PSR there would be no Mobile G-Sync as we know it. And OEMs have no desire to put a huge FPGA inside their ever so smaller Laptops.
A happy owner of an MG278Q for two months here. I have it connected to an R9 390.
AMD has implemented Frame Rate Target Control (FRTC), which prevents the GPU from exceeding a frame rate of your choice. So, if you wish, you can play at moderate fps (reducing noise and power consumption) by activating FRTC.
On the other hand, variable refresh comes in handy when you prefer to "unchain" the R9 390. This card, by the way, is a real gem at 2560×1440, especially when you combine it with a FreeSync monitor.
On other aspects: good enough colors, excellent resolution (much better than FHD without running into the high-DPI problems I've suffered on 2160p screens), and better blacks than I expected.
Viewing angles on this PC monitor are really not a concern (though I wouldn't tolerate them at all on a TV).
The major disadvantage: it isn’t cheap.
>End of 2015
>TN
INTO THE TRASH!
Who are you quoting?
My thoughts and feels.
TN panels have continued to improve and top-end TN panels have superior response times to minimize ghosting which is important to some people.
Ignoring price, IPS (or similar) has to be superior in all aspects to be a complete drop-in replacement for TN.
Back to price, not everybody can afford IPS especially for the cheaper monitors. Technology like this is always slowly phased out.
I cannot in good conscience pay a $200 premium for variable refresh rate from team green. Seems as though AMD is on the right track. If AMD's next GPU cycle is impressive, I know which direction I'll be leaning.
Nice review. I was hoping you'd pull out the old oscilloscope for below-VRR-window testing because the last video with it was quite informative and fascinating.
I would like to see that also. It would be interesting to see how both implementations handle the transition from VRR within the display's window to crossing the boundary into frame multiplication. Although just using it for a while and seeing if they notice anything is probably the most interesting test. If they played at the boundary, they would have noticed if crossing it was causing any artifacts like stuttering, judder, or tearing. Tearing really should not occur, though.
It's like everything Nvidia does: they create a closed standard on top of an open-standard system, then charge through the roof for it.
I purchased this exact monitor, the ASUS MG278Q, and I can't get it to display 144Hz. 120Hz is the highest it will let me go. I have used the supplied DisplayPort cable and the DVI cable and still can't get it to display 144Hz; even in Battlefield 4 my highest option is 2560×1440 @ 120Hz. Please help. I have an ASRock Fatal1ty Z97 gaming motherboard, an i7-4790K, 16GB of DDR3, and two GTX 660 SC cards in SLI.
GTX 6xx GPUs don't support that range over DisplayPort; you are locked to 120Hz. Not sure, but you should get 144Hz via the DVI-D port.
DisplayPort 1.2 supports 144Hz.
FPS mode is wrong! It's Racing Mode!
I just bought this monitor. The picture quality is stunning. I will try out this recommended Calibration settings. And yes its default is Racing mode, not FPS mode.
Going from a 1080p FreeSync 75Hz to a 1440p FreeSync 144Hz only has one word. WOW
I also have this monitor and am also stuck at 120Hz with a GTX 770. So far a lot of people, and even NVIDIA, have been saying that the GTX 770 doesn't support 144Hz; on the other hand, there are people using a GTX 770 who say they can get 144Hz on a different 144Hz monitor, so I'm still confused.
On my own journey I have found another "possibility" for why I can't get 144Hz: I'm using DisplayPort 1.1. I always thought I was using the "1.2" one, since the OSD setting allows me to choose which DP stream to use and it was on 1.2. ASUS support told me that the DP cable supplied with the monitor is a 1.1 cable, therefore I can't get 144Hz.
More Googling, and a lot of people have been saying that DP versions don't matter, but I think there's more to this, right? Reading more about DP, it's true in some cases that versions don't matter, but there are differences like bandwidth (HBR = 10.80 Gbit/s, HBR2 = 21.60 Gbit/s), and as far as I know 2560×1440 @ 144Hz needs HBR2 bandwidth, but I really don't know.
Ryan/Pcper & owners of the MG278Q, are you using the DP cable supplied with the monitor and manage to get 144Hz? What GPU are you using?
Thanks, I may get some 1.2 cables and test it out, who knows maybe the real reason is GTX770…
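The bandwidth reasoning above can be sanity-checked with rough arithmetic. A sketch only, assuming 24 bits per pixel, roughly 10% blanking overhead (actual timings vary per monitor), and the usual 8b/10b line coding on DP 1.x links:

```python
# Rough DisplayPort bandwidth check for 2560x1440 @ 144Hz.
# Assumptions: 24bpp, ~10% blanking overhead, 8b/10b line coding
# (so about 80% of the raw link rate carries pixel data).

def required_gbps(width, height, hz, bpp=24, blanking=1.10):
    """Approximate data rate a video mode needs, in Gbit/s."""
    return width * height * hz * bpp * blanking / 1e9

HBR_EFFECTIVE  = 10.80 * 0.8   # ~8.64 Gbit/s usable
HBR2_EFFECTIVE = 21.60 * 0.8   # ~17.28 Gbit/s usable

needed = required_gbps(2560, 1440, 144)
print(round(needed, 2))          # ~14 Gbit/s
print(needed <= HBR_EFFECTIVE)   # False: HBR-class links can't carry it
print(needed <= HBR2_EFFECTIVE)  # True: HBR2 has the headroom
```

Under these assumptions, the mode does exceed what HBR-rate links can carry, which lines up with the commenter's suspicion that 2560×1440 @ 144Hz needs HBR2-class bandwidth.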
Could you consider doing the viewing angle tests a lot less extreme – like recording a video from center (0 degrees) and going out to a maximum of 15 degrees, as that matches the experience of sitting in front of the monitor? I would like to know if I can perceive any changes just moving my head normally – the extreme angles you show are not use-case based, and we all know they will be horrible.