Calibration and Viewing Angles
Most monitor vendors today advertise the color accuracy of their displays to some degree in marketing material, and the MG278Q is no different. On the product's landing page on the ASUS website, the company calls out "75% color saturation" along with viewing angles and contrast ratios. More often than not, though, we find that the out-of-box experience with monitors can be quite poor, favoring heavy blues and oversaturated colors rather than actual color accuracy.
The screenshot above shows the output report from dispcalGUI on an uncalibrated MG278Q. The only change we made to the monitor before running the calibration was switching the color preset from the default to sRGB, which should, in theory, get the monitor pretty close to what ASUS considers the panel's calibrated state. But a look at the uncalibrated report shows that this monitor was FAR from producing accurate color: the grays were off significantly, as were the greens and blues.
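Reports like this quantify "off" as Delta E, the distance between a measured color and its reference. As a rough illustration, here is a minimal sketch of the simple CIE76 form of the metric in Python; the function name and sample values are ours for illustration, not numbers from our report:

```python
import math

def delta_e_76(lab_ref, lab_measured):
    """CIE76 color difference: Euclidean distance in L*a*b* space.
    A dE around 1 is roughly a just-noticeable difference; calibrated
    displays typically aim for an average dE below 2-3."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_measured)))

# Illustrative values only: a neutral gray target vs. a blue-tinted reading.
print(delta_e_76((50.0, 0.0, 0.0), (49.0, -2.0, -6.5)))  # ~6.9, clearly visible
```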
After running through our hours-long calibration process, the monitor was able to reach a very accurate state. Only a couple of hues remained outside the acceptable range (light purple and dark purple, for example), but you would be hard-pressed to find another TN panel running at 120Hz+ that exhibits such impressive calibration results.
This diagram shows the result on the typical CIE xy chromaticity chart: the color triangle represents the pre-calibration result, while the dashed line represents (close to) the final result.
Calibration Profile Download
The Windows color profile management interface is a bit of a mess, requiring you to select and enable a profile in multiple layers of the interface. The best guide for loading and enabling a profile can be found over at TFTCentral. We used the following tools to generate our own calibration profile:
- Datacolor Spyder 4
- ArgyllCMS (calibration software suite)
- dispcalGUI (graphical interface for ArgyllCMS)
- HCFR (for additional verification and output graphs)
Our calibration profile was created using the lowest calibration speed in a dimly lit room. Here are the required settings if you wish to use our profile:
- FPS mode
- Brightness: 25
- Red: 89
- Green: 97
- Blue: 100
- Profile download: (HERE)
The above profile was created specifically for a color temperature target of 6500K at a luminance of 120 cd/m2 (nits), with a gamma of 2.2. Remember that the only way to get a correct calibration on your specific panel is by using a colorimeter on that very panel. The above settings and profile will only get *your* display to a perfect calibration if it has the exact same properties as our test sample. A perfect match is unlikely, but this should get you far closer to calibrated than just running with the defaults.
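For reference, here is a minimal sketch of how those same targets could be fed to dispcal, the ArgyllCMS calibration tool we used, driven from Python. The output basename "mg278q" is our own choice, and we assume dispcal is installed and on your PATH:

```python
import subprocess

# dispcal (ArgyllCMS): -t 6500 targets a 6500K white point, -b 120 targets
# 120 cd/m2 of brightness, -g 2.2 targets gamma 2.2, and -q h selects the
# slowest, highest-quality pass (matching the "lowest calibration speed"
# we used for our profile). "mg278q" is just a basename for the output files.
subprocess.run(
    ["dispcal", "-t", "6500", "-b", "120", "-g", "2.2", "-q", "h", "mg278q"],
    check=True,
)
```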
Viewing Angles
The ASUS MG278Q is a TN panel, a good quality one, but it still exhibits the properties we see with nearly all TN panels when it comes to viewing angles. Straight on, the screen looks great, but as we rotate or move around the screen we see some pretty dramatic color shifting. Notice that from the bottom view (looking up at the display) we get nearly 100% inversion.
Center
From Bottom
From Top
From Left
You can see that even from the middle of the screen you get some moderate color shift with the MG278Q as you look at the top or bottom of the 27-in panel. Gamers will likely not notice a big change in games, though designers and anyone doing color-critical work (Photoshop, etc.) will have trouble positioning their eyes to see the image as it was intended.
I know Nvidia has 70+% market share in the discrete GPU market. But they will have to come up with something really great if they want to keep a $200 premium on their monitors. Or (and this would be ideal) cut the G-Sync crap and let's all work on Adaptive-Sync implementations (will never happen, I know).
They have 80%+, and the more they sh1t on their customers, the more people buy their products. So don't expect them to cut the crap. Expect them to get worse.
They sell partially on having a perception as a high-end brand, so even if AMD has a much better value in the buyer's price range, many of them will still buy Nvidia. They can sell G-Sync based on the same type of marketing, even though FreeSync is almost exactly equivalent now.
NVidia doesn't want a premium on their monitors. They don't sell monitors, only the GSync module, which is currently expensive.
NVidia wants to see monitors sold to drive GPU adoption, so ideally they would want GSync monitors to be cheaper than Freesync.
Since the GSync module replaces the "guts" (scaler, etc.) of the monitor, they can likely do things that Freesync cannot.
At some point monitor manufacturers will have to add newer features, like light strobing while asynchronous mode is active. If GSync ends up supporting that natively but Freesync requires the manufacturers themselves to implement the feature, we could see Freesync costing more than GSync, or simply not offering certain features.
The G-Sync parts are not worth shit; they charge royalty fees to manufacturers to use them.
Another ASUS monitor with aggressive matte coating.
Have the balls to release a glossy panel.
Do you really want that? If so, why?
I’m not the person you’re quoting, but I prefer glossy because it looks better in a good (dark) environment and doesn’t have the blurring that many aggressive coatings have. Matte may look slightly less atrocious under direct light, but every display looks so terrible in those conditions that anyone who cares in the slightest about image quality will make sure to avoid direct light anyway.
That was my comment.
Colors look ten times better without it.
I can tolerate the lighter IPS matte that Acer/ASUS use on their IPS G-Sync panels, but the film on these TN panels is too heavy; TN colors look bad enough without aggressive matte on top of that.
Also, there is not a single glossy display for gaming; it would be nice not to be all one-sided.
We have control over the lights in the room; we have no control when they slap that ugly coating on the panel.
Even tablets/phones and TVs don't use it anymore, for obvious reasons.
Edit: where I wrote *also not a single glossy display*, I meant not a single glossy G-Sync/FreeSync display.
I was under the impression that most TVs have anti-reflective coatings, at a minimum.
There are glossy monitors with anti-reflective coatings. The terminology can get confusing though.
“thanks to the latest AMD software update it can now support frame doubling to allow for smooth, variable refresh rates under 42 FPS as well”
This is great news for Adaptive Sync. Now we just need to get NVIDIA to support Adaptive Sync monitors too. Adaptive Sync monitors outnumber G-Sync about 3:1 in terms of number of models available worldwide. What did people think would happen with a free vs. expensive implementation?
G-Sync, it’s been nice knowing you.
^ and I mean that sincerely as an ASUS VG24QE owner.
AMD’s frame doubling still requires a 2.5X ratio minimum (i.e. 30Hz to 75Hz, but NOT 30Hz to 60Hz). This is due to the frame creation being driver (software) driven.
People keep saying NVidia's GSync will disappear, but keep in mind that the cost of the GSync module will continue to drop, and that they are working on newer versions that will do things Freesync simply can't (like light strobing whilst asynchronous mode is active).
Sure, AMD Freesync monitors can have features like the one I just mentioned added, but that would be done by the monitor manufacturer, which will cost them money and thus be reflected in more expensive monitors.
Conversely, monitor manufacturers will be able to incorporate features that GSync already supports. So the cost of the GSync module will shrink but the features offered will increase.
So it’s not quite as simple as “free vs expensive” if you understand the monitor market.
(We wouldn’t even have Freesync if NVidia had not created GSync… and NVidia had no incentive to do that without making it proprietary so why hate on NVidia? It may not be ideal, but at least we have asynchronous monitors for smooth gaming and that’s awesome.)
Hello, it's not the parts; it's a fee the manufacturers pay to Nvidia.
Hey Ryan/PCPer
Why not test the range for every VRR monitor, not just FreeSync?
It would be interesting to know how low the PG279Q panel can be driven, as well as the other x-Sync monitors, to see whether the module gives an advantage or not. AFAIK you only did that for the original (TN) ROG Swift and then stopped. 🙁
If the frame rate multiplication works as it should, then there isn’t much to report. I would hope that they actually verified that it is working correctly.
It wouldn't matter. G-Sync, FreeSync, Adaptive Sync or whatever Sync: if you are at 10-15 fps, the motion will NOT be fluid.
It may not matter to you. It surely matters to Ryan, since he tested it and chose to mention it in the review.
And I’m not speaking about 15 fps/Hz like you tried to insinuate but rather the range down to 30 fps/Hz, or even 23.976 fps/Hz.
It's simply a matter of curiosity: does the G-Sync module allow the OEM/Nvidia to control the VRR range better or not?
Is the module needed to drive those VRR ranges beyond 144Hz, to 165Hz and 200Hz, or not?
Does the panel behave differently when controlled by the G-Sync module versus a normal scaler with Adaptive Sync capabilities?
Are there differences between G-Sync and FreeSync monitors using the same or a similar panel, beyond software/hardware features from the IHVs?
If you are getting at least 24fps from your card, then GSync, or the LFC mode of FreeSync, will double the frame rate, and you only need the monitor to support a 48Hz minimum adaptive sync limit. Almost every monitor does that today. The only problem is with monitors whose upper limit is not at least double the lower limit, which is required to support LFC. With those monitors the best option is to lower the graphics settings and get a good frame rate.
So I wasn't trying to insinuate anything, only guessing that wanting to go as low as possible would only have meaning if you were thinking of frame drops down to 10-15fps. In that case I don't think Adaptive Sync of any kind would manage to magically create fluid motion.
Now, comparing GSync and FreeSync when those two techs come with a $100-$200 difference in price was never, in my opinion, valid. The difference in pricing is huge. Even without LFC, FreeSync was the better option because those $200, in the case of the more expensive monitors, can buy you a better card. You can go from a 390X to a Fury X. Even those $100, in the case of cheaper monitors, can move you from a 380 to an 8GB 390. So even if GSync scores more points, the fact is that it's very bad value for money and an option only if you already paid for a 980Ti or a Titan.
In the end, every reader in PCPer could be curious for a number of things. Even a 50 page review would leave something out.
You don't have a clear understanding of how frame doubling via AMD's software implementation works. There's a good article here at PCPer relating to the Crimson drivers.
First, to work properly you need a range of at least 2.5X (i.e. 30Hz to 75Hz).
Secondly, the AMD driver is simply telling the monitor to repeat the same frame so that it stays in its supported asynchronous range. If we talk FPS (not frame times) then if the GPU outputs 16FPS normally then it’s told to REPEAT each frame so we get 32FPS (32 refreshes) on the physical monitor.
So it’s still effectively 16FPS in terms of new content on the screen, but it is a lot smoother than normal 16FPS since we are again staying in asynchronous mode.
(I don’t know where you came up with the “48Hz” value either. If it was a 30Hz to 60Hz monitor for example then any FPS the GPU is generating within that range keeps you in asynchronous mode. Frame doubling is ONLY required when your GPU can’t output at least 30FPS.)
I assume we need “2.5X” not 2X due to some latency in the driver software.
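As a rough illustration of that logic (a sketch of the general idea, not AMD's actual driver code), the multiplier selection could look like this:

```python
def lfc_multiplier(fps: float, vrr_min: float, vrr_max: float) -> int:
    """Smallest repeat count that lifts the effective refresh rate back
    inside the monitor's variable refresh window."""
    if fps >= vrr_min:
        return 1  # already inside the window, no repetition needed
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    # The multiplied rate must also stay under the window's ceiling, which
    # is why a wide range (e.g. 30Hz to 75Hz) matters; narrower ratios can fail.
    if fps * multiplier > vrr_max:
        raise ValueError("VRR range too narrow for frame multiplication")
    return multiplier

# Example: 16FPS on a 30Hz-75Hz panel means each frame is shown twice (32Hz).
print(lfc_multiplier(16, 30, 75))  # 2
```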
If not clear, it's a doubling on the computer side, such as 16×2 = 32FPS (output of the GPU). So AMD handles this in its driver software (which works pretty well), whereas NVidia has a lookaside buffer in the GSYNC module to handle it.
(It's actually not a big deal, but perhaps AMD can stop talking about NVidia's supposed latency due to the GSync module, since AMD's solution requires a driver fix which is obviously not instantaneous. In fact, that is why the range of the monitor needs to be "2.5X", such as 30Hz to 75Hz, which is a big problem for monitors like 4K with a 30Hz to 60Hz async range.)
Okay, I see where you got "48Hz" from. Maybe you understand clearly, or maybe not, but note the "2.5X" range needed for proper support.
Again though, as per my other comment, note that many games drop to very low FPS values for short periods of time, so staying in asynchronous mode is especially important here.
It's not about "magically" creating fluid motion but rather getting the smoothest motion possible. If we normally got about 15FPS but were in VSYNC mode (to avoid tearing), the screen stuttering would likely be horrible. Low FPS + stutter + buffer latency is arguably the worst-case scenario for gaming.
On the other hand 15FPS in asynchronous mode is much, much better.
(I think you might be surprised how often the average game drops below 30FPS. In some cases you need to crank up the average FPS by at least 4X to avoid these drops completely which then obviously has a big visual trade-off. Very few people seem to understand how much these drops affect perceived smoothness.)
Sudden DROPS to a low FPS range are where asynchronous support makes the biggest difference.
You wouldn't want to game at around 15FPS all the time, but it's still very common to have sudden drops, so anything that improves that is important.
Even if the ratio of FreeSync to G-Sync goes up, which I suspect it will just on ease of implementation, what value does NVidia get from supporting FreeSync? By supporting their own VRR standard they get licensing revenue and are able to co-market their video cards with monitors. The only way I see NVidia getting on board is if FreeSync out-evolves G-Sync, thus making G-Sync a marketing liability.
G-Sync is like PhysX: Nvidia will ride this all the way to obscurity. They will not drop it; I think the market will mostly shift away from it due to the ease of implementing Adaptive Sync/FreeSync.
Nvidia seems to have put research into how to implement variable refresh without changing the standard, but the proper way to do it was to just change the standard. The only advantage right now seems to be slightly better overdrive, but it is unclear how much of this is better overdrive calculations and how much is choice of panels. I don't think the slightly increased ghosting on FreeSync panels is going to be noticeable under normal use, so the price premium on a G-Sync panel is not worth it. For the Nvidia fans, I guess they just have to pay the premium. Nvidia will probably continue to deny FreeSync support for a while and then, as quietly as possible, start supporting FreeSync; maybe they will try to come up with some new feature on top of FreeSync to differentiate.
Variable refresh was already part of VESA Embedded DisplayPort 1.3 (eDP) in the form of the Panel Self Refresh (PSR) spec. Nvidia is a member of VESA. They could have easily suggested VRR be implemented in the DisplayPort spec. AMD did just that: they took the PSR spec and suggested it be enhanced and implemented as Adaptive Sync. Now it's part of both specs, and Nvidia actually benefits from it by using it to enable Mobile G-Sync on laptops through the eDP standard.
Without eDP and PSR there would be no Mobile G-Sync as we know it. And OEMs have no desire to put a huge FPGA inside their ever-smaller laptops.
Happy owner of an MG278Q here, for two months now. I have it connected to an R9 390.
AMD has implemented Frame Rate Target Control (FRTC), which prevents the GPU from exceeding a frame rate of your choice. So, if you wish, you can play at moderate fps (reducing noise and power consumption) by activating FRTC.
On the other hand, variable refresh comes in handy when you prefer to "unchain" the R9 390. This card, by the way, is a true gem for 2560×1440, especially when you combine it with a FreeSync monitor.
On other aspects: good enough colors, excellent resolution (much better than FHD without running into the high-DPI problems I've suffered on 2160p screens) and better blacks than I expected.
Viewing angles on this PC monitor are really not a concern (though I wouldn't tolerate TN at all in a TV).
The major disadvantage: it isn’t cheap.
>End of 2015
>TN
INTO THE TRASH!
Who are you quoting?
My thoughts and feels.
TN panels have continued to improve, and top-end TN panels have superior response times to minimize ghosting, which is important to some people.
Ignoring price, IPS (or similar) has to be superior in all aspects to be a complete drop-in replacement for TN.
Back to price: not everybody can afford IPS, especially among the cheaper monitors. Technology like this is always phased out slowly.
I cannot in good conscience pay 200 bucks for variable refresh rate from team green. Seems as though AMD is on the right track. If AMD's next GPU cycle is impressive, I know which direction I'll be leaning.
Nice review. I was hoping you'd pull out the old oscilloscope for below-VRR-window testing, because the last video with it was quite informative and fascinating.
I would like to see that also. It would be interesting to see how both implementations handle the transition from VRR within the display's window to crossing the boundary into frame multiplication. Although just using it for a while and seeing if they notice anything is probably the most interesting test. If they played at the boundary, they would have noticed if crossing it was causing any artifacts like stuttering, judder, or tearing. Tearing really should not occur, though.
It's like everything Nvidia does: they create a closed standard on top of an open-standard system, then charge through the roof for it.
I purchased this exact monitor, the ASUS MG278Q, and I can't get it to display 144Hz; 120Hz is the highest it will let me go. I have used the supplied DisplayPort cable and the DVI cable and still can't get it; even in Battlefield 4 my highest option is 2560×1440 @ 120Hz. So please help. I have the ASRock Fatal1ty Z97 gaming motherboard, an i7-4790K, 16GB of DDR3 and two GTX 660 SCs in SLI.
A GTX 6xx GPU doesn't support that range over DisplayPort; you are locked to 120. Not sure, but you should be able to get 144Hz via the DVI-D port.
DisplayPort v1.2 supports 144Hz.
FPS mode is wrong! It's Racing Mode!
I just bought this monitor. The picture quality is stunning. I will try out these recommended calibration settings. And yes, its default is Racing mode, not FPS mode.
Going from a 1080p FreeSync 75Hz to a 1440p FreeSync 144Hz can be summed up in one word: WOW
I also have this monitor and am also stuck at 120Hz with a GTX 770. A lot of people, and even NVIDIA, have said that the GTX 770 doesn't support 144Hz; on the other hand, there are people using a GTX 770 who say they can get 144Hz on a different 144Hz monitor, so I'm still confused.
On my own journey I found another possible reason why I can't get 144Hz: I'm using DisplayPort 1.1. I always thought I was on 1.2, since the OSD lets me set which DP stream to use and it was on 1.2. ASUS support told me that the DP cable supplied with the monitor is a 1.1 cable, and therefore I can't get 144Hz.
More Googling, and a lot of people say DP versions don't matter, but I think there's more to this, right? Reading more about DP, it's true in some respects that versions don't matter, but there is a bandwidth difference (HBR = 10.80 Gbit/s, HBR2 = 21.60 Gbit/s), and as far as I know 2560×1440 @ 144Hz needs HBR2 bandwidth, but I really don't know.
Ryan/PCPer and owners of the MG278Q: are you using the DP cable supplied with the monitor and managing to get 144Hz? What GPU are you using?
Thanks. I may get some 1.2 cables and test it out; who knows, maybe the real reason is the GTX 770…
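For what it's worth, a quick back-of-the-envelope check (our own arithmetic, not an official spec calculation) suggests why 2560×1440 @ 144Hz lands in HBR2 territory:

```python
# Raw pixel data for 2560x1440 @ 144Hz at 24 bits per pixel (8 bits/channel):
h, v, hz, bpp = 2560, 1440, 144, 24
active_gbps = h * v * hz * bpp / 1e9
print(f"active video: {active_gbps:.1f} Gbit/s")  # ~12.7 Gbit/s, before blanking

# DP links use 8b/10b encoding, so usable payload is 80% of the raw link rate:
hbr_payload = 10.80 * 0.8   # ~8.64 Gbit/s: not enough
hbr2_payload = 21.60 * 0.8  # ~17.28 Gbit/s: enough, with room for blanking
print(f"HBR payload:  {hbr_payload:.2f} Gbit/s")
print(f"HBR2 payload: {hbr2_payload:.2f} Gbit/s")
```

So the link, including the cable, does need to run at HBR2 (DP 1.2) rates end to end.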
Could you consider doing the viewing angles a lot less extreme? For example, record a video from center (0 degrees) out to a max of 15 degrees, as that is the experience of sitting in front of it. I would like to know if I can perceive any changes just moving my head normally. The extreme angles you show are not use-case based, and we all know they will be horrible.