Gaming Impressions
(Hey everyone, Ryan stepping in here to talk through some gaming thoughts…)
Since the introduction of the first G-Sync monitor from NVIDIA and ASUS several years back, I’ve been a firm believer that variable refresh rates represent the best of PC gaming, and of gaming in general. Then, after seeing the first HDR TVs on display a few years back, I was smitten all over again. And thus began the quest to find the perfect monitor that blended BOTH technologies together in an impressive manner.
While Ken has done the vast majority of the testing and evaluation for the ASUS PG27UQ and the G-Sync HDR implementation, I wanted to spend some time just doing what the monitor was meant to do: game! So over the course of a few days, I played a handful of games and watched some videos to see what the overall experience was with this display.
Here’s the preview: it’s pretty amazing.
First up on the list was Far Cry 5, a modern game that was designed with HDR in mind. And even though it was an AMD-sponsored title, it worked flawlessly with the G-Sync HDR display.
(Taking screenshots of HDR games is…difficult. These are photos taken with an iPhone X.)
In the game, the effect and benefit of HDR is immediately apparent. It might be a bit overbearing at first: those bright lights are BRIGHT with the screen’s 1000-nit capability. The movement from inside a building out into the forest creates an (artist-created) blinding effect that serves an interesting in-game purpose and also makes for a flagship opportunity to demo HDR in action.
The 384-zone backlight on the ASUS PG27UQ makes it possible to have narrow areas of brightness without blowing out the rest of the screen. The best example of this is the red-dot sight on your pistol: it remains BRIGHT red even when placed in a dark room or in dim foliage, accurately recreating the effect that feature has in the real world.
I also played a bit of Destiny 2, one of the first games to implement HDR correctly. The fantastical setting of the game allows for interesting dynamic lighting on characters, weapon effects, landscapes, etc. The HDR setting itself is only presented as an on/off toggle, but the difference is immediately noticeable. The very opening of the game, where you are in a dark space with burning fires all around you, is an immediate “holy shit” moment for HDR.
One bad thing about the implementation in Destiny 2: alt-tabbing out of the game and back in killed the HDR setting, and only restarting the game brought it back.
I decided to boot up the latest installment of Hitman, a game that claimed HDR support very early, seemingly before there were monitors around to use it. It worked, though it required exclusive full-screen mode to enable HDR.
The interface to enable it uses different language and settings, and the net result is a much more subdued effect. It’s there, and you can tell the difference when switching between them, but it doesn’t have the “smack you in the face” impact that both FC5 and Destiny 2 had on me.
YouTube and Netflix HDR playback both worked as well, and watching some of the Digital Foundry HDR captures from other games was the best showcase of the in-browser integration. Chrome still has a bug in the latest public release where the rest of the window is dim (other than the HDR video, which displays correctly), but apparently that is already addressed for an upcoming release.
In the end, the gaming experience on the ASUS PG27UQ with G-Sync HDR is presently unmatched. And it should be, for the price! I played exclusively in the 98 Hz mode, and though I think getting 144 Hz with the HDR and 4K resolution would be an improvement, I never truly felt hampered by the reduced monitor refresh rate.
I would like to see both a 2560×1440 27-in version and, if we are going to keep that 4K resolution, a larger 32-in screen. I think we’ll get there, for both variants, but with the shortage of 1000-nit-capable panels and the high cost, it will take some time.
There is no getting around that $2000 price tag for this monitor, and I expect very few gamers will be willing to shell out that kind of cash. If you do, though, you’ll be greeted with what I think is the BEST desktop gaming experience, with the best HDR implementation on a monitor to date.
It’s not even real 10-bit; I believe it is 8-bit + 2-bit FRC, which is plain stupid.
As soon as I read “active fan,” I switched off. $2000 for an actively cooled monitor? No thanks very much.
Same here…not a big fan of the “Fan”….no pun intended
Curious as to how the desktop at SDR is handled when displaying content in HDR in a window. The rest of the desktop getting dimmer would seem to be the expected result. SDR is supposed to specify a range of 0 to 100 nits while HDR can specify a range of 1000 nits, 4000 nits, or more. When you switch the display to some HDR mode, anything that is SDR will have to be converted to HDR in some manner. If you just convert it into 0 to 100 of a 1000 nit range, it will look very dim. Is there a shift in the whole screen when you start an HDR video? You can’t blow up the SDR content to the full HDR range since it would probably cause significant banding.
Windows has a slider that effectively sets the SDR content peak brightness level. By default it’s at 0 which looks like 100 nits to my eyes.
However, it does not perform gamma correction, so the SDR content is incorrectly displayed on the display’s PQ gamma curve.
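A gamma-correct conversion would decode the SDR signal to linear light first and only then re-encode it onto the PQ curve. Here’s a minimal sketch of that idea (illustrative only, not Windows’ actual compositor code; the ST 2084 constants are the published ones, and pinning SDR white to 100 nits is an assumption matching the slider’s default):

```python
def srgb_to_linear(v):
    """sRGB transfer function: normalized code value (0..1) -> linear light (0..1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def pq_inverse_eotf(nits):
    """Absolute luminance in cd/m^2 -> PQ signal value (0..1), per SMPTE ST 2084."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def sdr_code_to_pq(code, sdr_white_nits=100.0):
    """Map an 8-bit SDR code value (0..255) into the PQ signal, pinning SDR
    peak white (255) to sdr_white_nits -- the level the Windows slider adjusts."""
    nits = srgb_to_linear(code / 255.0) * sdr_white_nits
    return pq_inverse_eotf(nits)
```

With this mapping, SDR white at 100 nits lands around PQ code value 0.508, i.e. roughly halfway up a 1000-nit-class display’s signal range, which is why unconverted SDR content looks dim and gamma-skewed when shown raw on the PQ curve.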
Curious as to how the desktop at SDR is handled when displaying content in HDR in a window.
Monitors don’t work the way you think they do.
SDR doesn’t define any brightness range at all. The brightest 8-bit value (255,255,255) simply means “brightest”, whereas what that actually means in terms of luminance depends on the brightness setting of your monitor. Nothing more.
HDR values map to actual luminance levels, so the brightness setting on the monitor is typically deactivated/irrelevant. You only get HDR in full-screen mode. In a window on the Windows desktop you can only ever get an SDR image, not least because consumer-grade GPUs refuse to deliver a 10-bit signal unless forced into full-screen HDR mode.
So if we are fine with 98 Hz, is this then going to be able to deliver the color accuracy and contrast as advertised? I want one, as I’m an artist who also games and wants the best of both without needing separate displays.
Good first impressions. Now, please, send the monitor to Rtings for the review.
What’s the minimum refresh rate for GSync?
How quickly we forget. I remember buying the 24 inch SONY CRT monitor GDM-FW900 for $2 grand and it weighed a ton. Best CRT Trinitron monitor I ever had. Now I have the ASUS PG279Q monitor and I love it but it was also not cheap. I would get this monitor but even my EVGA GTX 1080 Ti FTW3 would not be able to do it justice. I suppose I should get a 4Kish TV first, heh.
Your comment about games not being able to push UHD / 4K at 144 Hz is far from correct. You are forgetting about older games. For instance, the original Far Cry is still a great game and runs at 180 fps at 4k.
However, it’s clear that DisplayPort 1.4 has insufficient bandwidth, so it’s probably worth waiting for HDMI 2.1, which should have enough, along with a refreshed monitor and a new GPU to run it.
In what universe does a 27 inch LCD with bad backlighting compare to a 55 inch LG OLED? I have a 1080 Ti and I wouldn’t buy this with YOUR money.
This one? I have both of these things and there’s no comparison whatsoever. In fact, I was the first consumer in the US to purchase and review LG’s 4K OLED when they went on sale in 2015.
You need to treat a 55 inch LG OLED like a plasma, which means even in an ideal light controlled environment, the damn thing’s going to dim itself noticeably in a typical PC color scheme with white dominating everything.
So on DisplayPort 1.4 it’s 3840×2160+10bit+HDR+4:4:4 @98 Hz,
how high will the upcoming ultra-wide models go at 3440×1440+10bit+HDR+4:4:4?
Not sure if I did the math correctly but shouldn’t it be 165 Hz (panel’s max is 200 Hz according to announcements)?
You are correct that scaling is proportional to the pixel count assuming everything else is identical.
Also, I still haven’t been given an answer about DSC, which allows visually lossless compression; if the monitor and GPU support it, you can actually achieve 4K@240Hz.
But the chart (about halfway down in link below) also shows 4K@120Hz using 4:4:4 without DSC so why is this monitor only at 98Hz?
Something doesn’t add up.
https://en.wikipedia.org/wiki/DisplayPort
Update: I’m an IDIOT. That’s 4:4:4 at 8-bit, not 10-bit.
These are the numbers using exact bandwidth calculations:
10-bit = 159 Hz
8-bit = 198 Hz
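The link-budget arithmetic behind numbers like these can be sketched quickly. The ceiling figures below assume zero blanking, which is optimistic; real timings add horizontal and vertical blanking intervals that shave the limit down (which is presumably why the PG27UQ tops out at 98 Hz rather than the ~104 Hz raw ceiling, and why the commenter’s 159/198 Hz ultrawide figures, which appear to include a blanking assumption, come in under these):

```python
# DisplayPort 1.4 HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b line-code
# overhead, leaves roughly 25.92 Gbit/s of effective payload bandwidth.
DP14_EFFECTIVE_BPS = 25.92e9

def max_refresh_hz(h, v, bits_per_component, h_blank=0, v_blank=0):
    """Bandwidth-limited refresh rate for an uncompressed RGB 4:4:4 signal.
    Pass h_blank/v_blank to model real timings; defaults give the raw ceiling."""
    bits_per_frame = (h + h_blank) * (v + v_blank) * bits_per_component * 3
    return DP14_EFFECTIVE_BPS / bits_per_frame

print(round(max_refresh_hz(3840, 2160, 10)))  # ~104 Hz ceiling; 98 Hz with blanking
print(round(max_refresh_hz(3840, 2160, 8)))   # ~130 Hz
print(round(max_refresh_hz(3440, 1440, 10)))  # ~174 Hz for the ultrawide
```

DSC would multiply these ceilings by its compression ratio (up to roughly 3:1), which is where claims like 4K@240Hz over the same link come from, assuming both ends support it.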
Coming from a 1080p TN monitor, should I buy a 4K 144 Hz IPS monitor?
I actually own a PG27UQ and it is BEAUTIFUL, BUT… I have noticed problems. My setup: 3930K @ 4.8 GHz, 32 GB 1866 MHz CAS9, 2x Titan X (Pascal), Windows 7 Pro 64 on a Samsung 840 with the latest NVIDIA drivers, and Windows 10 Pro 64 on a Samsung 850 EVO, also with the latest drivers. With that out of the way: the monitor seems to cause tons of microstutter in games. Even with SLI disabled I get serious microstutter, even in titles like Destiny 2 that ran fine on my XB280HK 4K 60 Hz. I am getting 90–120 fps in Destiny, but sometimes it goes into stutter mode, even in the menus, and the only fix is to alt-tab, then mash enter and hope, or close and reopen the game and hope. In games capped at 60 fps, like Fallout 4, it becomes a horrible, unplayable stuttery mess with momentary stalls followed by black screens before gameplay resumes, and there is no fixing it; even dropping the monitor to 60 Hz just makes it worse. Has anyone else experienced this?
Do you have GSYNC enabled or not?
If NO turn it on and if YES try disabling it temporarily… possibly try GSYNC OFF but force Adaptive VSync Half Refresh (which should cap to 72FPS VSYNC ON but disable VSYNC if you can’t output 72FPS).
Can you force an FPS cap using NVInspector or some other tool?
I assume GSYNC is ON and you obviously don’t want it off but for now maybe try the different options so you can at least narrow down the problem.
Maybe it’s an NVIDIA driver issue, but a quick Google doesn’t show much, so maybe it’s simply early-adopter blues that will get sorted out.
Also, maybe try shutting down and physically removing a video card. I know SLI is supposedly off in software but it’s all I can think of if the obvious isn’t helping.
Plus of course contacting the monitor manufacturer but I do think it’s a software issue.
I really hope we start seeing more size and resolution options; 1440p at 27 and 32 inches would be a great start. I’d also love to see a 35 or so inch 4k model.
Great review! Considering that it has an active fan, I am curious what the power consumption is under different settings/scenarios. Does the power consumption vary with refresh rate, G-SYNC, HDR, etc.? The specs say 180 W max, which is a *lot* for a monitor. It would be great to update the review with this info. Thanks.