Teardown and Hardware Analysis
Once we had removed the stock monitor stand to investigate potential fan intake issues with the VESA mount, we couldn't help but keep going to see what makes this display tick.
I'm actually impressed by how easily the PG27UQ came apart compared to some other displays we've disassembled. After removing the stand, a metal spudger was used to release the plastic clips holding the two pieces of the enclosure together. After carefully disconnecting the cables for the LEDs and controls on the back panel, we were left with access to the innards of the panel.
From the general layout, it appears there are two modules on the left and right sides of the display that are likely driving the 384-zone FALD backlight. Both connect to the PCB in the middle, which carries the display outputs and is responsible for interfacing directly with the LCD panel.
The LCD panel is the M270QAN02.2 from AU Optronics. While datasheets are available for similar models from AUO, this panel does not yet appear in databases like Panelook.
After disconnecting the cables running to the PCB in the middle, and removing the bracket, we gained access to the electronics responsible for controlling the LCD panel itself.
A New G-SYNC Module
Now that we have a better view of the PCB, we can see exactly what the aforementioned blower fan and heatsink assembly are responsible for cooling: the all-new G-SYNC module.
Over the years, there has been a lot of speculation about if and when NVIDIA would move from an FPGA to a cheaper and smaller ASIC for controlling G-SYNC monitors. While extensible thanks to their programmability, FPGAs are generally significantly more expensive than ASICs and take up more physical space.
Removing the heatsink and thermal paste, we get our first peek at the G-SYNC module itself.
As it turns out, G-SYNC HDR, like its predecessor, is powered by an FPGA from Altera. In this case, NVIDIA is using an Intel Altera Arria 10 GX 480 FPGA. Thanks to the extensive documentation from Intel, including a model number decoder, we are able to get some more information about this particular FPGA.
A mid-range option in the Arria 10 lineup, the GX480 provides 480,000 reprogrammable logic elements, as well as twenty-four 17.4 Gbps transceivers for I/O. Importantly for this application, the GX480 also supports 222 pairs of LVDS I/O.
DRAM from Micron can also be spotted on this G-SYNC module. From the datasheet, we can confirm that this is, in fact, a total of 3 GB of DDR4-2400 memory. This memory is likely being used in the same lookaside buffer role as the 768 MB of memory on the original G-SYNC module, but it is both much larger and much faster.
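For a rough sense of scale, here is a back-of-the-envelope sketch (our own estimate, assuming uncompressed 10-bit 4:4:4 frames at the panel's native resolution, not anything NVIDIA has detailed) of how many frames each buffer size could hold:

```python
# Back-of-the-envelope estimate: how many uncompressed frames fit in the
# G-SYNC module's DRAM. Assumes 3840x2160, RGB 4:4:4 at 10 bits per channel.
width, height = 3840, 2160
bits_per_pixel = 3 * 10

frame_bytes = width * height * bits_per_pixel / 8
print(f"One frame: {frame_bytes / 2**20:.1f} MiB")      # ~29.7 MiB

for capacity_bytes, label in ((768 * 2**20, "768 MB (original module)"),
                              (3 * 2**30, "3 GB (G-SYNC HDR module)")):
    print(f"{label}: ~{capacity_bytes / frame_bytes:.0f} frames")
```

Even if only part of the DRAM is used as a frame buffer, the jump from roughly two dozen 4K frames to over a hundred leaves plenty of headroom for the module's lookaside duties at HDR bit depths.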
While there's not a whole lot we can glean from the specs of the FPGA itself, it does paint a clearer picture of the current G-SYNC HDR situation. While our original speculation about the $2,000 price point of the first G-SYNC HDR monitors was mostly based on potential LCD panel cost, it's now clear that the new G-SYNC module makes up a substantial portion of that cost.
It's an unstocked item without a large bulk-quantity price break, but you can actually find this exact FPGA available to buy on both Digikey and Mouser. NVIDIA clearly isn't paying the $2600 per FPGA that both sites are asking, but it shows that these are not cheap components in the least. I wouldn't be surprised if this FPGA alone accounts for $500 of the final price of these new displays, to say nothing of the costly DDR4 memory.
It’s not even real 10-bit; I believe it is an 8+2, which is plain stupid.
as soon as I read active fan, I switched off. $ 2000 for an active fan monitor? No thanks very much.
Same here…not a big fan of the “Fan”….no pun intended
Curious as to how the desktop at SDR is handled when displaying content in HDR in a window. The rest of the desktop getting dimmer would seem to be the expected result. SDR is supposed to specify a range of 0 to 100 nits while HDR can specify a range of 1000 nits, 4000 nits, or more. When you switch the display to some HDR mode, anything that is SDR will have to be converted to HDR in some manner. If you just convert it into 0 to 100 of a 1000 nit range, it will look very dim. Is there a shift in the whole screen when you start an HDR video? You can’t blow up the SDR content to the full HDR range since it would probably cause significant banding.
Windows has a slider that effectively sets the SDR content peak brightness level. By default it’s at 0 which looks like 100 nits to my eyes.
However, it does not perform gamma correction, so SDR content is incorrectly displayed on the display’s PQ gamma curve.
Curious as to how the desktop at SDR is handled when displaying content in HDR in a window.
Monitors don’t work the way you think they do.
SDR doesn’t define any brightness range at all. The brightest 8-bit value (255,255,255) simply means “brightest”, whereas what that actually means in terms of luminance depends on the brightness setting of your monitor. Nothing more.
HDR values map to actual luminance levels, so the brightness setting on the monitor is typically deactivated/irrelevant. You only get HDR in full-screen mode. In a window on the Windows desktop you can only ever get an SDR image, not least because consumer-grade GPUs refuse to deliver a 10-bit signal unless forced into full-screen HDR mode.
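To illustrate what "HDR values map to actual luminance levels" means, here is a small sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 signals use. The constants come from the standard; the point is that a given 10-bit code value corresponds to an absolute number of nits, independent of any monitor brightness knob:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: maps a normalized
# signal value (0..1) to absolute luminance in cd/m^2 (nits).
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal (e.g. 10-bit code / 1023) to nits."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

# A few example 10-bit code values and their absolute luminance targets:
for code in (0, 512, 769, 1023):
    print(code, f"{pq_eotf(code / 1023):.1f} nits")
```

SDR content has no such absolute mapping, which is why Windows needs a slider (or some other policy) to decide where to park it on the PQ curve.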
So if we are fine with 98Hz, is this then going to be able to deliver the colour accuracy and contrast as advertised? I want one, as I’m an artist that does gaming and wants the best of both without the need for separate displays.
Good first impressions. Now, please, send the monitor to Rtings for the review.
What’s the minimum refresh rate for GSync?
How quickly we forget. I remember buying the 24 inch SONY CRT monitor GDM-FW900 for $2 grand and it weighed a ton. Best CRT Trinitron monitor I ever had. Now I have the ASUS PG279Q monitor and I love it but it was also not cheap. I would get this monitor but even my EVGA GTX 1080 Ti FTW3 would not be able to do it justice. I suppose I should get a 4Kish TV first, heh.
Your comment about games not being able to push UHD / 4K at 144 Hz is far from correct. You are forgetting about older games. For instance, the original Far Cry is still a great game and runs at 180 fps at 4k.
However, it’s clear that DisplayPort 1.4 has insufficient bandwidth, so it’s probably worth waiting for HDMI 2.1, which should fix that, plus a refreshed monitor and a new GPU to run it.
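As a rough link-budget sketch (my own estimate, using assumed reduced-blanking timings rather than the monitor's actual ones), here is why DisplayPort 1.4 falls short of 4K 144Hz at 10-bit 4:4:4 while HDMI 2.1 would have headroom:

```python
# Rough link-budget comparison for 4K 144 Hz, 10-bit 4:4:4 (uncompressed).
# Blanking totals are assumptions in the spirit of CVT reduced blanking.
h_total, v_total = 3840 + 80, 2160 + 62
bits_per_pixel = 3 * 10

required = h_total * v_total * bits_per_pixel * 144 / 1e9   # ~37.6 Gbps
dp14 = 4 * 8.1 * 8 / 10      # HBR3 x4 lanes, 8b/10b coding -> ~25.9 Gbps
hdmi21 = 4 * 12.0 * 16 / 18  # FRL x4 lanes, 16b/18b coding -> ~42.7 Gbps

print(f"4K144 10-bit 4:4:4 needs ~{required:.1f} Gbps")
print(f"DisplayPort 1.4 delivers ~{dp14:.1f} Gbps")
print(f"HDMI 2.1 delivers ~{hdmi21:.1f} Gbps")
```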
In what universe does a 27 inch LCD with bad backlighting compare to a 55 inch LG OLED? I have a 1080 Ti and I wouldn’t buy this with YOUR money.
This one? I have both of these things and there’s no comparison whatsoever. In fact, I was the first consumer in the US to purchase and review LG’s 4K OLED when they went on sale in 2015.
You need to treat a 55 inch LG OLED like a plasma, which means even in an ideal light controlled environment, the damn thing’s going to dim itself noticeably in a typical PC color scheme with white dominating everything.
So on DisplayPort 1.4 it’s 3840×2160+10bit+HDR+4:4:4 @98 Hz,
how high will the upcoming ultra-wide models go at 3440×1440+10bit+HDR+4:4:4?
Not sure if I did the math correctly but shouldn’t it be 165 Hz (panel’s max is 200 Hz according to announcements)?
You are correct that scaling is proportional to the pixel count assuming everything else is identical.
Also, I still haven’t been given an answer about DSC, which allows visually lossless compression; if the monitor and GPU support it, you can actually achieve 4K@240Hz.
But the chart (about halfway down in link below) also shows 4K@120Hz using 4:4:4 without DSC so why is this monitor only at 98Hz?
Something doesn’t add up.
https://en.wikipedia.org/wiki/DisplayPort
Update: I’m an IDIOT. That’s 4:4:4 at 8-bit, not 10-bit.
These are the numbers using exact bandwidth calculations:
10-bit = 159 Hz
8-bit = 198 Hz
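Those figures line up with a simple bandwidth check. As a sketch (with assumed reduced-blanking timings, since the exact panel timings aren't published, the results land a few Hz away from the numbers quoted above):

```python
# Estimate the maximum refresh rate DisplayPort 1.4 can carry for a given
# uncompressed 4:4:4 mode. Assumes HBR3 x4 (32.4 Gbps raw, 8b/10b coding ->
# 25.92 Gbps effective) and reduced-blanking-style timings.
DP14_EFFECTIVE = 4 * 8.1e9 * 8 / 10     # bits per second after coding

def max_refresh(h_active, v_active, bits_per_channel, h_blank=80, v_blank=62):
    bits_per_pixel = 3 * bits_per_channel
    pixels_per_frame = (h_active + h_blank) * (v_active + v_blank)
    return DP14_EFFECTIVE / (pixels_per_frame * bits_per_pixel)

print(f"3840x2160 10-bit: {max_refresh(3840, 2160, 10):.0f} Hz")   # ~99 Hz
print(f"3440x1440 10-bit: {max_refresh(3440, 1440, 10):.0f} Hz")   # ~163 Hz
print(f"3440x1440  8-bit: {max_refresh(3440, 1440, 8):.0f} Hz")    # ~204 Hz
```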
Coming from 1080P TN Monitor, should I buy 4k 144hz IPS Monitor?
I actually own a pg27uq and it is BEAUTIFUL, BUT… I have noticed problems. Running a 3930k@4.8ghz, 32GB 1866mhz cas9, 2x Titan X (Pascal), Windows 7 Pro 64 on a Samsung 840 with the latest nvidia drivers, and Windows 10 Pro 64 on a Samsung 850 EVO, also latest drivers. Now with that out of the way: the monitor seems to cause tons of microstutter in games. Even with SLI disabled I get serious microstutter, even in titles like Destiny 2 that ran fine on my XB280HK 4K 60hz. I am getting 90-120 fps in Destiny, but sometimes it goes into stutter mode, even in the menus, and the only solution is to alt-tab or alt-enter repeatedly and hope, or close and reopen the game and hope. And in games like Fallout 4 that only go to 60 fps, it becomes a horrible, unplayable, stuttery mess with momentary stalls followed by black screens before gameplay resumes, and there is no fixing it; even lowering the monitor to 60hz just makes it worse. Has anyone else experienced this?
Do you have GSYNC enabled or not?
If NO turn it on and if YES try disabling it temporarily… possibly try GSYNC OFF but force Adaptive VSync Half Refresh (which should cap to 72FPS VSYNC ON but disable VSYNC if you can’t output 72FPS).
Can you force an FPS cap using NVInspector or some other tool?
I assume GSYNC is ON and you obviously don’t want it off but for now maybe try the different options so you can at least narrow down the problem.
Maybe it’s an NVidia driver issue, but a quick Google doesn’t show much, so maybe it’s simply early adopter blues that will get sorted out.
Also, maybe try shutting down and physically removing a video card. I know SLI is supposedly off in software but it’s all I can think of if the obvious isn’t helping.
Plus of course contacting the monitor manufacturer but I do think it’s a software issue.
I really hope we start seeing more size and resolution options; 1440p at 27 and 32 inches would be a great start. I’d also love to see a 35 or so inch 4k model.
Great review! Considering that it has an active fan, I am curious what the power consumption is under different settings/scenarios. Does the power consumption vary with refresh rate, GSYNC, HDR, etc.? The specs say max 180W, which is a *lot* for a monitor. It would be great to update the review with this info. Thanks.