Color Accuracy and Brightness
Color Accuracy
For the most part, PC displays have been stuck in the sRGB color space for a very long time. While there have been some outliers supporting wider gamut standards like AdobeRGB, those have traditionally been targeted at professionals and are likewise very expensive. Testing for adherence to the sRGB color space is a well-understood methodology, achievable at reasonable prices with commodity hardware like the X-Rite i1 Display Pro.
However, with the advent of HDR, wide color gamut displays are becoming much more common. This is a great thing for consumers, who will now get to experience the benefits of these displays. Unfortunately, these displays are much more difficult to test, for a multitude of reasons.
In preparation for the release of HDR displays, we got our hands on a SpectraCal C6 HDR2000 colorimeter. For those of you unfamiliar with SpectraCal, they are the makers of CalMAN, one of the industry-standard software packages for display analysis and calibration. With support for luminance accuracy up to 2000 nits, as well as NIST-certified accuracy, it seemed like this would be the hardware for the job, at just $800.
However, after spending some time with CalMAN on the PG27UQ, we've learned that it's not quite that easy. After getting results of around 75% DCI-P3 coverage (as opposed to the claimed 97%), we decided to contact both ASUS and NVIDIA for feedback on our testing methodologies.
Technical marketing representatives from NVIDIA shared data that they had gathered with one of these displays across a range of different colorimeters and spectroradiometers. The conclusion they came to was that the only device capable of measuring the color accuracy of these displays to the same degree as their factory calibration is the Klein K-10A colorimeter, an almost $7000 device.
While we are willing to invest in testing equipment to accurately evaluate the claims of manufacturers, $7000 is beyond reason for this purpose.
We're still talking to NVIDIA about different solutions that could help us provide an accurate picture of color representation, potentially by combining both a colorimeter and a spectrophotometer. For now, we are not comfortable enough in our equipment to validate the claims made by ASUS and NVIDIA about the color accuracy of this display in either direction. We hope to have more data on this in the future.
Backlight Testing
While our SpectraCal C6 HDR2000 might not be able to provide a complete picture of the color accuracy of the PG27UQ, it can still be used to evaluate the luminance and contrast of the display, two important specifications for HDR monitors.
Here we can see the backlight's luminance while different percentages of the screen are sent a "pure white" signal in HDR mode. Everything but the 100% screen coverage pattern reaches over 1000 nits. Remember, DisplayHDR 1000 means the display is capable of "flashing" the full screen white to a level of 1000 nits. Our test measured a sample of 1 second, which is a longer period than a "flash".
Backlight consistency is great, with a 10% patch reaching an average of 1200 nits, above the 1000-nit rating, even after a few seconds of sustained operation.
One of the concerns with a FALD backlight array is the potential for a "halo" around bright content on an otherwise black screen. This does happen to some extent on the PG27UQ, but it was fairly rare in our testing.
The most reproducible scenario, in fact, was OSD elements displayed on a black background while the monitor was booting or showing a message. In desktop use and while gaming, we didn't notice the halo effect unless a completely white element was sitting on top of a completely black screen.
It's not even real 10-bit. I believe it is an 8+2 (8-bit + FRC) panel, which is plain stupid.
As soon as I read "active fan", I switched off. $2000 for an actively cooled monitor? No thanks very much.
Same here… not a big fan of the "fan"… no pun intended.
Curious as to how the desktop at SDR is handled when displaying content in HDR in a window. The rest of the desktop getting dimmer would seem to be the expected result. SDR is supposed to specify a range of 0 to 100 nits while HDR can specify a range of 1000 nits, 4000 nits, or more. When you switch the display to some HDR mode, anything that is SDR will have to be converted to HDR in some manner. If you just convert it into 0 to 100 of a 1000 nit range, it will look very dim. Is there a shift in the whole screen when you start an HDR video? You can’t blow up the SDR content to the full HDR range since it would probably cause significant banding.
Windows has a slider that effectively sets the SDR content peak brightness level. By default it's at 0, which looks like 100 nits to my eyes.
However, it does not perform gamma correction, so SDR content is incorrectly displayed on the display's PQ gamma curve.
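For illustration, here is a minimal sketch of what gamma-correct SDR-in-HDR mapping looks like: linearize the 8-bit sRGB value with the sRGB EOTF, scale it to a chosen SDR peak (effectively what the Windows slider sets), then encode it with the SMPTE ST 2084 (PQ) curve. The function names and the 100-nit default peak are illustrative assumptions, not Windows' actual code path; the constants are the standard sRGB and ST 2084 values.

# Sketch of gamma-correct SDR-in-HDR mapping (illustrative, not Windows' actual pipeline).

def srgb_to_linear(c8):
    """sRGB EOTF: 8-bit code value (0-255) -> linear light (0.0-1.0)."""
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: absolute luminance in cd/m^2 -> PQ signal (0.0-1.0)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def sdr_in_hdr(c8, sdr_peak_nits=100.0):
    """Map an SDR code value onto the PQ curve at a chosen SDR peak level."""
    return pq_encode(srgb_to_linear(c8) * sdr_peak_nits)

# SDR white (255) at a 100-nit peak lands around 0.51 on the PQ signal scale,
# far below PQ's maximum of 1.0 (10,000 nits) -- which is why unmapped SDR looks dim.
print(sdr_in_hdr(255), sdr_in_hdr(128))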
Curious as to how the desktop at SDR is handled when displaying content in HDR in a window.
Monitors don’t work the way you think they do.
SDR doesn’t define any brightness range at all. The brightest 8-bit value (255,255,255) simply means “brightest”, whereas what that actually means in terms of luminance depends on the brightness setting of your monitor. Nothing more.
HDR values map to actual luminance levels, so the brightness setting on the monitor is typically deactivated/irrelevant. You only get HDR in full-screen mode. In a window on the Windows desktop you can only ever get an SDR image, not least because consumer-grade GPUs refuse to deliver a 10-bit signal unless forced into full-screen HDR mode.
So if we are fine with 98 Hz, is this then going to be able to deliver the colour accuracy and contrast as advertised? I want one, as I'm an artist who does gaming and wants the best of both without the need for separate displays.
Good first impressions. Now, please, send the monitor to Rtings for the review.
What’s the minimum refresh rate for GSync?
How quickly we forget. I remember buying the 24 inch SONY CRT monitor GDM-FW900 for $2 grand and it weighed a ton. Best CRT Trinitron monitor I ever had. Now I have the ASUS PG279Q monitor and I love it but it was also not cheap. I would get this monitor but even my EVGA GTX 1080 Ti FTW3 would not be able to do it justice. I suppose I should get a 4Kish TV first, heh.
Your comment about games not being able to push UHD / 4K at 144 Hz is far from correct. You are forgetting about older games. For instance, the original Far Cry is still a great game and runs at 180 fps at 4k.
However, it's clear that DisplayPort 1.4 has insufficient bandwidth, so it's probably worth waiting for HDMI 2.1, the refreshed monitor, and a new GPU to run it.
In what universe does a 27 inch LCD with bad backlighting compare to a 55 inch LG OLED? I have a 1080 Ti and I wouldn’t buy this with YOUR money.
This one? I have both of these things and there’s no comparison whatsoever. In fact, I was the first consumer in the US to purchase and review LG’s 4K OLED when they went on sale in 2015.
You need to treat a 55 inch LG OLED like a plasma, which means even in an ideal light controlled environment, the damn thing’s going to dim itself noticeably in a typical PC color scheme with white dominating everything.
So on DisplayPort 1.4 it’s 3840×2160+10bit+HDR+4:4:4 @98 Hz,
how high will the upcoming ultra-wide models go at 3440×1440+10bit+HDR+4:4:4?
Not sure if I did the math correctly but shouldn’t it be 165 Hz (panel’s max is 200 Hz according to announcements)?
You are correct that scaling is proportional to the pixel count assuming everything else is identical.
Also, I still haven't been given an answer about DSC, which allows visually lossless compression; if the monitor and GPU support it, you can actually achieve 4K@240Hz.
But the chart (about halfway down in link below) also shows 4K@120Hz using 4:4:4 without DSC so why is this monitor only at 98Hz?
Something doesn’t add up.
https://en.wikipedia.org/wiki/DisplayPort
Update: I’m an IDIOT. That’s 4:4:4 at 8-bit, not 10-bit.
These are the numbers using exact bandwidth calculations:
10-bit = 159 Hz
8-bit = 198 Hz
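As a rough cross-check of figures like these, here is a back-of-the-envelope sketch of the arithmetic, assuming DisplayPort 1.4 HBR3's ~25.92 Gbps effective payload rate, uncompressed RGB 4:4:4, and CVT-R2-style reduced blanking (80-pixel horizontal blank, 460 µs minimum vertical blank). The helper name and the blanking model are assumptions; different timing assumptions shift the results by a few hertz.

# Back-of-the-envelope DisplayPort 1.4 refresh-rate ceiling (assumptions noted above).
import math

LINK_PAYLOAD_BPS = 25.92e9   # HBR3 x4 lanes: 32.4 Gbps raw minus 8b/10b coding overhead
H_BLANK_PIXELS   = 80        # CVT-R2 reduced horizontal blanking
V_BLANK_SECONDS  = 460e-6    # CVT-R2 minimum vertical blanking interval

def max_refresh(h_active, v_active, bits_per_channel):
    bits_per_pixel = 3 * bits_per_channel            # RGB 4:4:4, no DSC
    pixel_clock = LINK_PAYLOAD_BPS / bits_per_pixel  # pixels per second
    h_total = h_active + H_BLANK_PIXELS
    line_time = h_total / pixel_clock
    v_total = v_active + math.ceil(V_BLANK_SECONDS / line_time)
    return pixel_clock / (h_total * v_total)

for label, w, h, bpc in [("3840x2160 10-bit", 3840, 2160, 10),
                         ("3440x1440 10-bit", 3440, 1440, 10),
                         ("3440x1440  8-bit", 3440, 1440, 8)]:
    print(f"{label}: ~{max_refresh(w, h, bpc):.0f} Hz")
# Gives roughly 97, 158, and 194 Hz -- in the same ballpark as the 98 / 159 / 198 Hz
# figures above; the small differences come down to the exact blanking model used.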
Coming from a 1080p TN monitor, should I buy a 4K 144 Hz IPS monitor?
I actually own a PG27UQ and it is BEAUTIFUL, BUT… I have noticed problems. I'm running a 3930K @ 4.8 GHz, 32 GB 1866 MHz CAS 9, and 2x Titan X (Pascal), with Windows 7 Pro 64 on a Samsung 840 (latest NVIDIA drivers) and Windows 10 Pro 64 on a Samsung 850 EVO (also latest drivers). Now with that out of the way: the monitor seems to cause tons of microstutter in games. Even with SLI disabled I get serious microstutter, even in titles like Destiny 2 that ran fine on my XB280HK 4K 60 Hz. I am getting 90-120 fps in Destiny, but sometimes it goes into stutter mode, even in the menus, and the only solution is to alt-tab or alt-enter repeatedly and hope, or close and reopen and hope. In games like Fallout 4 that only go to 60 fps, it becomes a horrible, unplayable, stuttery mess with momentary stalls followed by black screens before gameplay resumes, and there is no fixing it; even lowering the monitor to 60 Hz just makes it worse. Has anyone else experienced this?
Do you have GSYNC enabled or not?
If NO, turn it on; if YES, try disabling it temporarily… possibly try GSYNC OFF but force Adaptive VSync (Half Refresh), which should cap to 72 FPS with VSYNC ON but disable VSYNC if you can't output 72 FPS.
Can you force an FPS cap using NVInspector or some other tool?
I assume GSYNC is ON and you obviously don’t want it off but for now maybe try the different options so you can at least narrow down the problem.
Maybe it's an NVIDIA driver issue, but a quick Google doesn't show much, so maybe it's simply early-adopter blues that will get sorted out.
Also, maybe try shutting down and physically removing a video card. I know SLI is supposedly off in software but it’s all I can think of if the obvious isn’t helping.
Plus of course contacting the monitor manufacturer but I do think it’s a software issue.
I really hope we start seeing more size and resolution options; 1440p at 27 and 32 inches would be a great start. I’d also love to see a 35 or so inch 4k model.
Great review! Considering that it has an active fan, I am curious what the power consumption is under different settings/scenarios. Does the power consumption vary with refresh rate, GSYNC, HDR, etc.? The specs say max 180 W, which is a *lot* for a monitor. It would be great to update the review with this info. Thanks.