A long time coming
We finally have the ASUS G-SYNC HDR monitor in our hands! Is it worth $2,000?
To say that the ASUS ROG Swift PG27UQ has been a long time coming is a bit of an understatement. In a computer hardware world where we are generally lucky to know about a product six months in advance, the PG27UQ has been around in some form or another for at least 18 months.
Originally demonstrated at CES 2017, the ASUS ROG Swift PG27UQ debuted alongside the Acer Predator X27 as the world's first G-SYNC displays supporting HDR. With promised brightness levels of 1000 nits, G-SYNC HDR was a surprising and aggressive announcement considering that HDR was just starting to pick up steam on TVs, and was unheard of for PC monitors. On top of the HDR support, these monitors were the first announced displays sporting a 144Hz refresh rate at 4K, due to their DisplayPort 1.4 connections.
However, delays led to the PG27UQ being displayed yet again at CES this year, with a promised release date of Q1 2018. Further slips in the release date lead us to today, where the ASUS PG27UQ is available for pre-order for a staggering $2,000 and set to ship at some point this month.
In some ways, the launch of the PG27UQ very much mirrors the launch of the original G-SYNC display, the ROG Swift PG278Q. Both displays represented the launch of a long-awaited technology in a 27" form factor, and both were seen as extremely expensive at the time of their release.
Finally, we have our hands on a production model of the ASUS PG27UQ, the first monitor to support G-SYNC HDR, as well as 144Hz refresh rate at 4K. Can a PC monitor really be worth a $2,000 price tag?
Specifications
There's a lot of ground to cover with the specifications of the ASUS PG27UQ, and many of them represent industry firsts.
While one other Korean monitor has supported 4K 144Hz, the ASUS PG27UQ is the first widely available display to take advantage of the 4K high refresh rate capability offered by DisplayPort 1.4.
On the HDR side, you have HDR10 decoding powered by a 384-zone Full Array Local Dimming (FALD) backlight, capable of reaching 1000 nits in certain scenarios. As per the DisplayHDR 1000 specification, the display must be capable of both flashing the entire screen at an instantaneous brightness of 1000 nits and sustaining a 10% patch in the middle of the display at 1000 nits indefinitely.
The typical brightness of 600 nits is still higher than the measured peak brightness of any other HDR monitor we have taken a look at so far, an impressive feat.
Of course, brightness isn't the only important aspect of HDR. ASUS claims the PG27UQ is capable of reproducing color to an accuracy of 99% of the AdobeRGB gamut, and 97% of DCI-P3. While these aren't the absolute highest color reproduction claims we've seen on a display, they still put the PG27UQ in the top echelon of PC monitors.
One caveat, though: with HDR enabled at 4K, the maximum refresh rate is limited to 98Hz. While this isn't a problem today, when graphics cards can barely hit 60 FPS at 4K, it is certainly something to be aware of when buying a $2,000 product that you would expect to last for quite a long time to come.
Edit: For clarification, the 98Hz limit represents the refresh rate at which the monitor switches from full 4:4:4 chroma to 4:2:2 chroma subsampling. While this shouldn't noticeably affect things such as games and movies, 4:2:2 can result in blurry or difficult-to-read text in certain scenarios. For more information on chroma subsampling, see this great article over at Rtings.
Here is the full capability of the PG27UQ across different refresh rates and modes:
SDR: 98 Hz 10-bit RGB, 120 Hz 8-bit RGB, 144 Hz (overclocked) 8-bit YCbCr 4:2:2
HDR: 98 Hz 10-bit RGB, 120 Hz 8-bit RGB (dithered, only with Windows 10 RS4), 120 Hz 10-bit YCbCr 4:2:2, 144 Hz (overclocked) 10-bit YCbCr 4:2:2
For the moment, given the lack of GPUs able to push games beyond 98Hz at 4K, we feel that keeping the monitor in its 98Hz mode is a reasonable compromise. For a display this expensive, however, it's a limitation that may make it age faster than expected.
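To see roughly where that 98Hz figure comes from, here is a back-of-the-envelope DisplayPort 1.4 bandwidth check. This is our own rough arithmetic rather than anything published by ASUS or NVIDIA, and it ignores blanking overhead, which is why the real cutoff lands at 98Hz instead of the slightly higher number the raw math suggests.

```python
# Back-of-the-envelope DisplayPort 1.4 bandwidth check. Blanking overhead is
# ignored, so real-world limits land a little lower than these raw numbers.

DP14_EFFECTIVE_GBPS = 4 * 8.1 * (8 / 10)  # 4 lanes x HBR3 (8.1 Gbps), minus 8b/10b coding ~= 25.92

WIDTH, HEIGHT = 3840, 2160

def required_gbps(refresh_hz: float, bits_per_pixel: int) -> float:
    """Raw pixel bandwidth for 3840x2160 at a given refresh rate and pixel depth."""
    return WIDTH * HEIGHT * bits_per_pixel * refresh_hz / 1e9

modes = [
    ("144 Hz 10-bit RGB (4:4:4)", 144, 30),
    ("120 Hz  8-bit RGB (4:4:4)", 120, 24),
    (" 98 Hz 10-bit RGB (4:4:4)",  98, 30),
    ("144 Hz 10-bit YCbCr 4:2:2", 144, 20),
]

for name, hz, bpp in modes:
    need = required_gbps(hz, bpp)
    verdict = "fits" if need <= DP14_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{name}: {need:5.1f} Gbps needed vs {DP14_EFFECTIVE_GBPS:.1f} Gbps available -> {verdict}")
```

Only the 144Hz full-chroma 10-bit mode blows past the roughly 25.9 Gbps that four lanes of HBR3 can carry, which lines up with the mode table above.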
One of the unique advantages that ASUS has with the PG27UQ over the similarly specced Acer Predator X27 is DisplayHDR 1000 certification. While it's unclear if this is due to ASUS getting access to slightly higher quality panels than Acer, or Acer just not going through the certification process, it still reflects the confidence that ASUS has in this product.
Design
The first surprising aspect of the PG27UQ is the form factor. While monitors have been trending towards thinner and thinner chassis and bezels due to advancements in LCD and LED backlighting technology, the PG27UQ goes in the opposite direction.
The bezels are thicker than we saw with even the original ROG Swift, the chassis is thick, and the monitor overall is quite heavy. It seems that some sacrifices had to be made in order to fit things like a Full Array Local Dimming backlight into a 27" display form factor.
Inclusions like ASUS Aura Sync-compatible RGB lighting help make this feel like a unique product if you are into that sort of thing.
However, for those of you who, like me, aren't into RGB, these lighting effects can be easily disabled in the monitor's OSD.
Like we've seen with a lot of G-SYNC displays, the input options are a bit barren on the PG27UQ. You get the primary DisplayPort 1.4 connector for all of your G-SYNC enabled content, an HDMI 2.0 port, a dual-port USB 3.0 hub, and a headphone jack.
We were pleasantly surprised to see that the HDMI 2.0 port supports full 4K60 with HDR enabled, allowing you to hook up a device other than your PC, such as a game console, to this premium product and still get the fullest possible experience. Keep in mind, though, that Dolby Vision HDR is not supported here, so your mileage may vary when playing back certain video content.
Early impressions of the similar Acer Predator X27 uncovered that it used active cooling for the electronics, which was annoyingly loud when used with Acer's VESA mounting kit.
While the ASUS PG27UQ also features a blower style fan with an intake in the VESA mount area, ASUS includes standoffs which provide plenty of clearance between the back of the monitor and the VESA mounting plate, solving the issue.
It's worth noting that despite the monitor having an always-on active cooling fan, it was only noticeable with no other ambient noise in the room, and in close proximity to the display. With any other noise in the room, such as our test PC turned on, the fan noise was drowned out.
Still, if you keep your PC setup in a quiet room and shut the PC down when not in use, the best course of action is to turn the monitor off completely as well.
Speaking of the fan, let's take a closer look at the internals of the ASUS PG27UQ with a teardown.
| Review Terms and Disclosure | All information as of the date of publication |
|---|---|
| How product was obtained: | The product is on loan from ASUS for the purpose of this review. |
| What happens to the product after review: | The product remains the property of ASUS but is on extended loan for future testing and product comparisons. |
| Company involvement: | ASUS had no control over the content of the review and was not consulted prior to publication. |
| PC Perspective Compensation: | Neither PC Perspective nor any of its staff were paid or compensated in any way by ASUS for this review. |
| Advertising Disclosure: | ASUS has purchased advertising at PC Perspective during the past twelve months. |
| Affiliate links: | This article contains affiliate links to online retailers. PC Perspective may receive compensation for purchases through those links. |
| Consulting Disclosure: | ASUS is not a current client of Shrout Research for products or services related to this review. |
3840 x 2160 is not 4K, what the hell mates! this is UHD
if you don’t know the difference, what the hell are you doing in the tech review media?
Everyone knows. Nobody cares. Shut up.
The term “4K” is generic, and refers to any resolution with a horizontal pixel count of approximately 4000 pixels. Several different 4K resolutions have been standardized by various organizations.
yea, doesn’t the marketing term “HD” refer to both 720 and 1080p? At least in the tech world?
No, the term for 720p is HD and 1080p is called FullHD/FHD.
It DOES mean 4K if that’s the accepted usage by enough people. In fact, dictionaries are constantly modified to change words to what people use whether wrong or not.
Try going into WalMart, stand in the TV section, and tell people that those aren’t 4K HDTV’s… no they are 3840×2160 so DAMN IT… Samsung should know better!!
You need to stop. You’re wrong. It’s a fact 3840×2160 is 4K. It’s widely accepted, except by people who think they’re smart when they’re not.
Accept you’re wrong and move on with your life.
TL;DR: 4K = 2160p, that’s the standard the industry adopted.
The default aspect ratio these days is 16:9, so 16:9 2160p = 3840×2160
There *ARE* 4096×2160 monitors, but it’s a wonky aspect ratio only used by the Digital Cinema Initiatives standards body so it’s almost exclusively used by movie projection to efficiently support 2.39:1 through 1.85:1 aspect ratio projections. This is indeed where the term “4K” started.
But 4096×2304 monitor panels aren’t made, never have been, and never will be; 3840×2160, on the other hand, has numerous benefits in relation to the existing 1080p/720p sizes because scaling is exact and precise (2x from 1080p, 3x from 720p).
So kindly do shut the heck up about “That isn’t 4K!” because every standards body has accepted that it is.
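For what it's worth, the scaling point in the comment above does check out; here is a trivial sketch of the arithmetic (our own, purely illustrative):

```python
# Quick check of the scaling claim above: integer factors from 720p/1080p up to
# UHD (3840x2160) versus DCI 4K (4096x2160).
for name, (w, h) in {"1280x720": (1280, 720), "1920x1080": (1920, 1080)}.items():
    print(f"{name} -> 3840x2160: x{3840 / w:g} wide, x{2160 / h:g} tall | "
          f"-> 4096x2160: x{4096 / w:g} wide")
# 720p scales to UHD by an exact 3x and 1080p by an exact 2x; neither divides
# 4096 evenly, so DCI 4K cannot integer-scale legacy HD content.
```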
Equally meaningless. You speak of DCI-4K as if it were the only thing referred to as 4K while being completely unaware that UHD is not, in fact, a single resolution.
UHD-8K exists, DCI-8K doesn’t. So, let’s continue this stupid discussion in the future when digital cinema catches up to UHD and you’ll swear up and down that movie theaters aren’t 8K.
You guys didn’t go into as much detail as YouTubers are providing.
Is it running 4:2:0 / 4:2:2 / 4:4:4 ?
Is the FALD noticeable during letterboxed in-game cinematics or media content (movies, online videos)?
Should be running 4:2:2 if you go above 98Hz. Really this is a 4K 98Hz monitor. The 144Hz stuff was sort of marketing BS. Chroma subsampling has never really been used on desktop monitors.
Actually, if you don’t allow sub-sampling, it can reach 120Hz for 4K SDR and 98Hz for 4K HDR. Anything above that requires subsampling (4:2:2). So I would say it’s a 4K 120Hz monitor.
It doesn’t look good.
http://www.guru3d.com/news-story/asus-and-acer-uhd-g-sync-hdr-monitors-forced-to-use-color-compression-at-120144-hz.html
PCPER can you find out about THIS?
I’m confused still:
https://en.wikipedia.org/wiki/DisplayPort
Go down roughly half way and there’s a chart (Bandwidth is based on uncompressed 4:4:4.) that shows 4K at 144Hz using 4:2:2 with DP1.3…
Then right next to it we see 4K at 144Hz using DP1.4 under DSC (lossless compression). Actually supports up to 240Hz via 4:2:0 or 4:4:4 via DSC.
My understanding is that the need to drop to 98Hz or drop to 4:2:2 is a problem that DSC resolves.
NVidia PASCAL does support DSC:
http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,2.html
(correct me if I’m wrong but if DSC is supported by v1.4 it’s required for a card stating “DP1.4” to have DSC?)
It’s inconceivable this monitor would not.
*So we should be able to get up to 4K 4:4:4 on this monitor?!!
DSC is “visually” lossless, not mathematically.
It’s not a requirement for cards with DP1.4 to support DSC. I believe that is an optional feature. Regardless, Nvidia’s Gsync module does not support DSC, even though Freesync scalers do.
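Assuming DSC really were available end to end, which per the comment above it apparently is not on the G-SYNC module side, a rough estimate of how much compression 4K 144Hz 10-bit RGB would actually need over DP 1.4 looks like this. The ~3:1 figure for DSC is the commonly quoted ballpark, not something we have measured:

```python
# If DSC were usable end to end, how much compression would 4K 144 Hz 10-bit RGB
# actually need over DP 1.4? (Raw numbers only; blanking and DSC's fixed
# bits-per-pixel targets are ignored.)
dp14_gbps = 4 * 8.1 * 0.8                  # ~25.92 Gbps effective, 4 lanes of HBR3
raw_gbps  = 3840 * 2160 * 30 * 144 / 1e9   # 10-bit RGB at 144 Hz ~= 35.8 Gbps

print(f"required compression ratio ~= {raw_gbps / dp14_gbps:.2f}:1")
# ~1.38:1, far below the ~3:1 usually quoted for visually lossless DSC, so the
# link budget itself would not be the obstacle; missing DSC support in the
# source and the G-Sync module is.
```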
Nice to know that $500 in the BOM goes to a module that was out of date before it launched. xD
1) Do you have any PROOF that GSync Modules lack DSC support? I find that highly unlikely considering how long the standard has been out AND its high relevance to the purpose of the GSync Module.
2) As for “visually lossless” above. While I appreciate the distinction there are two issues then:
a) First off, I thought that implied it was difficult if not impossible for the average person to tell the difference between true lossless and visually lossless, and
b) if DSC was available then why bother switching to 4:2:2 subsampling which apparently DOES have obvious artifacting and/or color differences?
It seems likely to me that if DSC was available it should be utilized.
So either one, some or all of the following lack support for DSC:
a) NVidia Pascal (and other) GPU’s,
b) the Asus monitor
c) NVidia drivers
d) Windows?
I thought DSC was already utilized for workstation setups, though I could be wrong, but either way I’m baffled why DSC apparently isn’t working here.
At this point my BEST GUESS is that it’s most GPU’s (even Pascal?) that lack support and that the MONITOR probably has it.
So, just curious, is HDR *only* able to be enabled at 4K? I know that using monitors at their non-native resolution is not ideal, but is there anything stopping a user from buying this and running it at 2560×1440 at 144 Hz with HDR enabled?
Thanks!
Here is a buyers guide – https://www.toppctech.com/best-144hz-monitor-reviews/. You may find your answer.
Think it’s possible to remove and repurpose the FPGA?
Best comment! 🙂
So true. The FPGA is over $2000 to buy separately, and here you pay $2000 and get not only the FPGA, but 3GB of RAM on the board and a very good LCD panel.
Best comment, really! XD
Really disappointed with this product. Chroma subsampling on a $2000 monitor? Are you kidding me? And on top of that we have to deal with shady marketing which doesn’t even mention that you have to run the monitor at 98Hz to avoid it. Not to mention the fan noise and other quality control issues. Asus should have spent another couple of months and gotten this thing right. For a $2000 product this seems to have a lot of compromises.
Mine arrived two days ago and I am very satisfied with it. Haven’t noticed any fan noise, HDR is really amazing, and it turned Far Cry 5 into a totally different game! Now I feel it’s really worth the price.
Glad Far Cry 5 looks great, but there may be times when the subsampling artifacts are noticeable according to at least one review.
Also, did the game run above 98FPS for you?
I assume you used GSYNC so unless you output more than 98FPS the monitor was under 98Hz and thus running full 4:4:4 and not reverting to subsampling.
To COMPARE I wonder if there’s a way to force 4:2:2 instead of 4:4:4 then switch back and forth (below 98FPS/98Hz) to compare the difference.
Pretty much all of the video you see, unless you happen to be a professional video producer, is going to be chroma sub-sampled in some manner. It would only be noticeable for small text really. If you are playing a game where you really think you need 144 Hz at 4K, then you probably are not playing a game where you are reading a lot of small text. You probably aren’t going to be able to achieve 144 Hz in most games at 4K with good quality anyway. If the FPS is really, really important to you, then you could always run it at 1080p with all of the bells and whistles and 144 Hz.
I am not a fan of g-sync at all. Nvidia went to a lot of trouble to try to keep it proprietary without changing the standards. This is exactly the type of thing that should be added to standards though. There should be no vendor lock-in between video cards and displays. It would be nice if they just made variable refresh a required part of the standard, then nvidia would essentially have to support it. The continued expense of using FPGAs is also a problem. It is going to be difficult to do HDR color accuracy and brightness accuracy with a variable refresh rate, but this still doesn’t require a g-sync TCON. It is just going to take a while for standard ASIC TCONs to catch up. It will require a lot of work to get the controller to achieve good color and brightness accuracy, but it is required for FreeSync 2. Designing an ASIC for the job will take longer, but it will be a lot cheaper in volume and probably will not require an active cooling solution, although with the bandwidth required, who knows. Hopefully we will get at least quantum dot enhanced computer displays; I don’t know if there are any available yet. That makes it easier to achieve the color accuracy and brightness for HDR. It will still require very precise control to make it work with varying frame times. OLED probably isn’t coming to computer displays due to burn in issues with static content, so the best solution seems to be quantum dot enhanced LCD right now.
The difficulties with 144 Hz aren’t really anything to do with variable refresh. If you wanted to run at a constant refresh of 144 Hz HDR, you would still have those limitations due to the higher bit rates required. This just doesn’t come up with TVs since video content generally never exceeds 60 FPS. It would generally be better to implement compression rather than sub-sampling, so that is something that could be done until we have the entire system capable of supporting the necessary bit rates. Running at the ridiculously high bit rates that is required for 4K HDR at high refresh rate is just going to be problematic. A lot of old and/or cheap cables will not be able to do it, which will cause all kinds of problems.
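To illustrate the small-text point made above, here is a toy 4:2:2 round trip in NumPy. It is a deliberately simplified model (BT.709-style matrix, no gamma handling, simple horizontal pair averaging), not how the monitor's scaler actually implements subsampling:

```python
import numpy as np

def subsample_422(rgb: np.ndarray) -> np.ndarray:
    """Rough 4:2:2 round trip: full-res luma, chroma halved horizontally then repeated.
    rgb is an (H, W, 3) float array in [0, 1]; BT.709-style matrix, no gamma handling."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    # Average each horizontal pair of chroma samples, then duplicate them back out.
    cb = np.repeat((cb[:, 0::2] + cb[:, 1::2]) / 2, 2, axis=1)
    cr = np.repeat((cr[:, 0::2] + cr[:, 1::2]) / 2, 2, axis=1)
    r2 = y + 1.5748 * cr
    b2 = y + 1.8556 * cb
    g2 = (y - 0.2126 * r2 - 0.0722 * b2) / 0.7152
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0, 1)

# One-pixel-wide red detail on white: after 4:2:2 the colour bleeds across pixel
# pairs, which is exactly why thin desktop text looks fringed while full-motion
# video and games generally do not.
img = np.ones((1, 8, 3))
img[0, 3] = [1, 0, 0]
print(subsample_422(img)[0, 2:5].round(2))
```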
I’m not sure I agree with all your points though they are valid concerns.
1) Good luck FORCING asynchronous to be part of a standard and being forced to implement it (do you have to support EVERYTHING in DP1.4 or DirectX12 on every GPU? No.) Yes, it would be nice but I don’t know how you could force that.
2) This monitor is Quantum Dot already.
What we still need is further improvement of the LCD panel itself, or preferably a complete switch to OLED once OLED manufacturing sorts out image retention (not burn-in) and reduces cost.
Multiple LED backlights are used to provide local dimming but that would not be necessary with better LCD panel control to avoid light leakage or OLED that produces its own light per sub-pixel.
3) How come AMD escapes people’s wrath anyway? They could have allowed NVidia cards to work with Freesync. NVidia probably would have put support in their drivers as it would sell more GPUs (which is probably more important than selling more GSync monitors).
NVidia worked to create asynchronous monitors which we likely would not even have without their investment. Somebody had to spend the cash to get this started. Hey, I’d prefer an open standard too instead of a walled garden but if we’re going to play the blame game then AMD should be in there too.
Hopefully Freesync 2 HDR (I believe it’s “2” and “HDR” both) HDTV’s and monitors for gaming consoles helps put pressure on NVidia at some point but so long as there’s no financial incentive good luck. That’s just how business works.
(XBox One X is experimenting with Freesync 2 HDR… good video at Digital Foundry. FYI, it’s still got some big issues so hopefully that can be mostly sorted out in driver support. For example, one game stuttered when it was in the asynchronous range. Why? Well, it’s a lot more complicated than you may think it should be.)
4) NVidia made a proprietary module because they apparently discovered the existing methods would be limited, such as the difficulty of getting overdrive to work with a variable refresh rate, so we agree on that. They discussed adding other features later on too.
You may question the use of an FPGA, but programmability is the REASON for that, plus it stands to reason they would find ways to reduce cost when it makes sense, because I’ve got to assume there’s little profit for NVidia in this and perhaps they are even LOSING money. In fact, they mentioned early on that they wanted to add support in later modules for V-by-One instead of LVDS to reduce cost, so cost has certainly been on their minds from the start; if they pay up to $500 per module, it’s not sustainable.
So certainly an ASIC is coming. I have no doubt a low-cost solution will show up in the near future (next couple years). Possibly they still need the flexibility of programming the FPGA via firmware until they get the bugs ironed out, including programming for each PANEL type (AHVA etc).
A custom G-Sync module might eventually make a monitor like this as cheap as, or even CHEAPER than, a similar Freesync design, as the (ASIC) module price should plummet and much of the work done by NVidia won’t need to be repeated by individual monitor manufacturers (as they are being forced to do now with Freesync 2 HDR, which adds to the cost).
So if the cost can drop to $50 or whatever for a drop-in solution (which already also eliminates the scaler), AND if it gives some benefit vs Freesync then it should be all worth it for NVidia. I have to assume that’s the long-term plan.
Probably iron out all the HARDWARE bugs to the point they can just load up individual panel specs from a separate chip into the ASIC.
5) Compression vs sub-sampling: not my area but if cost is negligible then we should simply have BOTH options then implement whichever makes the most sense such as DSC 4:4:4 should the GPU and monitor support it or 4:2:2 sub-sampling if not.
6) Cables: don’t disagree, but that’s progress. Not sure if a proper DP1.4 cable is included or not but for $2000USD I certainly hope so.
“3) How come AMD escapes people’s wrath anyway? They could have allowed NVidia cards to work with Freesync. NVidia probably would have put support in their drivers as it would sell more GPUs (which is probably more important than selling more GSync monitors).”
Your bias appears to be clouding your technical comprehension.
Any GPU can work with Freesync. It’s an open VESA standard, meaning any VESA member has access to the spec and can implement it.
AMD isn’t preventing anybody from supporting FreeSync. nVidia is deliberately not supporting FreeSync for obvious reasons.
This issue isn’t a “blame game”. It’s just facts. Like most companies, nVidia has an economic interest in tying customers to their “ecosystem” and making it more difficult for customers to switch to the competition. That’s part of what G-SYNC does. That’s not evil. It’s just business. But it’s clearly also not what is best for consumers.
AMD doesn’t do this. If the tables were turned and AMD was in the dominant position, AMD would behave no differently however. It’s business!
Ah, well, the original Freesync may be an open standard, but “Freesync 2 HDR,” which is what they are transitioning to, DOES require an AMD graphics card. I guess I’m half right, so I’m half wrong too.
https://www.tomshardware.com/news/amd-freesync-2-hdr-lfc,33248.html
“There’s a lot of coordination that needs to happen between game developers, AMD, and display vendors, so it remains to be seen how enthusiastically AMD’s partners embrace FreeSync 2, particularly because the technology is going to be proprietary for now.”
Actually, OLED burn-in is an issue that needs to be resolved. Even with LG’s panel aging nonsense that goes on when you turn their set off, my 4K OLED has the LG calibration menu permanently burnt-in.
Of course LG is stupid for using a white menu system while manipulating the TV’s built-in controls, but they at least did us all a favor and put to rest any doubt that OLED will burn in.
Panasonic or any respectable vendor would have DCC calibration. Probably better to just use a separate video processor until LG figures out what the hell they’re doing.
Burn-In is apparently a non-issue for newer HDTV’s. The issue now is Image Retention. The difference is that burn-in is permanent or at least can be difficult to get rid of whereas Image Retention is similar to ghosting in that a pixel can retain its color for too long if held at the same or similar frequency and/or brightness (I don’t know the exact details).
THIS sort of agrees AND conflicts with what I said though (you have to just read it):
https://www.cnet.com/news/oled-screen-burn-in-what-you-need-to-know/
It’s partly confusing because OLED screens vary in type and of course PHONES typically have a more static image.
*Anyway, some articles say that with a recent, quality OLED HDTV it is almost impossible to get BURN-IN, whereas Image Retention remains an issue. One is permanent, the other is not.
Either way it makes screens unsuitable as desktop monitors until this is sorted out.
Any panel uniformity issues, based on your naked eye? The 1440p 27in 144Hz G-Sync displays had garbage QC.
Reading this article on the monitor being reviewed, I got mine yesterday
lol, me too.
What’s it like? I’m wanting to get one as I’m a games art student; I want great colour accuracy but also high refresh so I can use it for work and games. Not out in the UK yet and prices are stupidly high, but I’m hoping it’s worth it?
Hopefully nobody buys it so nvidia will be forced to support freesync.
There are at least 2 people in the comments here claiming to have bought it already. I really wish people wouldn’t for the exact reason you’ve posted but some people don’t care and others need it for their epeen, so who knows.
Freesync 2 requires the use of a proprietary API by game developers to work with HDR.
Hmm, does G-Sync work over HDMI? Or was that old rumor a dud?
“For clarification, the 98Hz limit represents the refresh rate at which the monitor switches from 4:4:4 chroma subsampling to 4:2:2 subsampling. While this shouldn’t affect things such as games and movies to a noticeable extent, 4:2:2 can result in blurry or difficult to read text in certain scenarios. ”
You can go up to UHD @ 120Hz at 8 bits per channel; the 98Hz DisplayPort limit is for 10bpc. While you would be advised against using a wide gamut output at 8bpc (due to more noticeable banding on gradients), you can still enable FALD and use the larger dynamic range at 120Hz. This also means 120Hz sRGB desktop use is not an issue.
——
I’d also be willing to wager that with the 11xx/20xx series cards, 144Hz 4:4:4 may be an option with RAMDACs ‘overachieving’ over the DP 1.3/1.4 clock rate of 810MHz. That could be using the notional ‘DP 1.5’ pre-final standard (as Nvidia are part of VESA) or it could be a ‘just for Nvidia’ clock rate.
There are no RAMDACs on new graphics cards; no analog outputs, no digital-to-analog converter. I really don’t know if that DVI-like pixel overclock works through DisplayPort. But yeah, if that could work, the G-Sync module has to be able to take that clock too.
RAMDACs have been the norm for so long it just ends up as the colloquial term for “that PHY layer bit” even if it’s actually a TMDS modulator!
DisplayPort does not ‘overclock’ like DVI or HDMI does, as it is not a pixel-clock-based transport but a packetised transport. There are 4 valid symbol rates (162MHz, 270MHz, 540MHz, 810MHz) and if your resolution/refresh rate does not ‘use up’ all the bandwidth available at that symbol rate that bandwidth is just ‘wasted’.
For Nvidia to push more bandwidth via DP, they would either need to be using the notional DP 1.5 symbol rate (with no guarantee their cards will ever be DP 1.5 compliant when the standard is ratified, because there are more changes than that and all sorts of timing requirements to meet), or just an arbitrary rate that they can pick because they make the devices on both ends of the cable (the GPU, and the G-Sync panel controller).
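For reference, the four symbol rates mentioned above translate into effective link bandwidth roughly as follows (our own arithmetic, assuming 4 lanes and 8b/10b coding):

```python
# Effective DisplayPort bandwidth at the four valid symbol rates mentioned above
# (4 lanes, 8b/10b line coding; DP 2.0's different coding is out of scope here).
SYMBOL_RATES_MHZ = {"RBR": 162, "HBR": 270, "HBR2": 540, "HBR3": 810}

for name, mhz in SYMBOL_RATES_MHZ.items():
    lane_gbps = mhz * 10 / 1000          # each 10-bit symbol clock tick per lane
    effective = 4 * lane_gbps * 8 / 10   # 4 lanes, minus 8b/10b coding overhead
    print(f"{name:4s} {mhz} MHz -> {effective:5.2f} Gbps effective across 4 lanes")
# HBR3 tops out around 25.9 Gbps, which is why 4K 10-bit RGB stops just shy of 100 Hz.
```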
Rumor is ASUS has put a hold on stores selling these, and anyone who purchased should return theirs for a firmware update. These monitors are garbage for what you are paying, not even true 10-bit panels lol
https://old.reddit.com/r/nvidia/comments/8t406e/asus_recalls_their_4khdrgsync_monitors_from/
How does it compare with the Samsung CHG70?
1-99-9999! Ha!~
There is no need to run the monitor in 10-bit YCbCr422 120 Hz mode for HDR. 8-bit RGB 120 Hz is visually identical, with no banding. It’s only required that the application render to a 10-bit DirectX 11 surface. The NVIDIA driver automatically performs 10-bit to 8-bit dithering which eliminates banding, allowing you to run the game in 8-bit RGB 120 Hz mode and avoid chroma subsampling. The 10-bit YCbCr422 mode is only really needed for consoles or Blu-Ray players on the HDMI port.
In my previous observation of HDR panels, games running in 8-bit HDR certainly had noticeable banding in areas of sky / other areas with low contrast. Also, if there was no need for 10-bit, then it wouldn’t be part of the standard.
It’s still required that the game render to a 10-bit DirectX 11 surface to prevent banding. The GPU to monitor signal alone can be 8-bit RGB since the GPU performs dithering automatically.
Switching on HDR in a game should make it switch from a 8-bit surface to a 10-bit surface. If it doesn’t happen automatically then it’ll cause banding.
10-bit is part of the standard because the *source content* needs to be 10-bit to present small enough steps that don’t result in banding in high luminance scenes. Because it should be possible to transmit 10-bit YCbCr422 video from a Blu-ray player to the display without further processing (like dithering), HDR10 displays are required to accept a 10-bit signal.
However if the source can perform dithering, an 8-bit signal is sufficient to transmit 10-bit content without banding. In the case of a PC, 8-bit RGB is clearly preferable to 10-bit YCbCr422. If some games are reverting to 8-bit rendering in HDR mode, then it’s still a software issue that can be addressed.
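A small numerical sketch of the dithering argument above, using a synthetic gradient rather than real driver output; NVIDIA's actual dithering pattern isn't public, so plain random dither stands in for whatever the driver really does:

```python
import numpy as np

rng = np.random.default_rng(0)

# A shallow 10-bit gradient, the kind of sky ramp where banding is obvious.
ramp10 = np.linspace(200.0, 216.0, 4096)                 # 10-bit code values

trunc  = np.round(ramp10 / 4)                             # naive 10-bit -> 8-bit
dither = np.floor(ramp10 / 4 + rng.random(ramp10.size))   # add noise, then quantise

def block_error(codes8):
    """Distance between the locally averaged 8-bit signal and the true 10-bit ramp."""
    rec  = (codes8 * 4).reshape(-1, 64).mean(axis=1)      # average 64-sample neighbourhoods
    true = ramp10.reshape(-1, 64).mean(axis=1)
    return float(np.abs(rec - true).max())

print(f"worst local error: truncated {block_error(trunc):.2f}, dithered {block_error(dither):.2f}")
# The truncated ramp collapses into a few flat bands (local error approaching two
# 10-bit codes); the dithered one uses the same 8-bit codes, yet its local average
# tracks the 10-bit ramp closely, which is why an 8-bit RGB link fed by a
# dithering GPU can still look band-free.
```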
The NVIDIA driver creates banding in this scenario, actually. If allowed to run in fullscreen exclusive mode, any 10-bit RGBA surface on every G-Sync display I have tested gets a delightful red/green banding that resembles chromatic aberration.
The only workaround is to force the game into windowed mode, and engage flip model presentation rather than BitBlit. This allows G-Sync to work and prevents said artifacts.
I don’t have this issue with my LG OLED. I always set the display to 8-bit RGB, and a 10-bit DX11 surface is always dithered without banding in both borderless window and fullscreen exclusive mode.
True Blurring Arrives. :p
Who would really use 4K on a 27″ monitor? Way too small for that kind of resolution.
I have a 24″ 4K monitor. Also had a laptop with a 15″ 4K screen. Your comment is inaccurate.
Ha ha, Nvidia made too many GPUs thinking that the GPU coin mining craze was a stable market and now Nvidia has Pascal SKU stocks up to the rafters. So no new generation until the older generation’s inventories are moved.
Really, Nvidia, the only reason your stripped-of-compute Pascal SKUs were purchased by miners was that they could not get their hands on enough AMD Polaris/Vega units, owing to AMD getting burned the first time around with so much unsold inventory to write off.
Now that there are plenty of Polaris/Vega SKUs in stock, with their shader-heavy designs well suited to mining, the miners don’t need any Pascal (fewer shader cores than AMD’s SKUs) for mining. The price of coin has fallen and now Nvidia is having to put off introducing Pascal’s successor for gaming.
Old JHH over at Nvidia has taken to wearing a London Fog full-length trench coat while he stands in the back alley trying to sell some overstocked inventory.
True HDR? A “True HDR” display doesn’t have a *REAL* native contrast ratio of only 1000:1
It’s like the awful dynamic contrast ratio BS that has been pushed for years, only slightly less bad.
Real HDR displays don’t exist, and won’t exist until this industry makes real concrete improvements to the foundation of display technology instead of adding more issues on top of the existing ones.
I’d say you are PARTLY correct… don’t forget there is localized backlight control (384 zones) that works in tandem with the brightness of the pixels themselves.
For example if an area is supposed to be light grey then the backlight for that area drops the light a lot to minimize light leakage.
It’s not the same as individual pixel control, however it’s also far better than an identical monitor without individual LED zones to control.
It’s obviously still problematic to have pixels that are very BRIGHT in the same LED zone as pixels that are meant to be very DARK, since you need to keep the LED bright in that case, and thus the pixels that should be dark end up looking more GREY.
Not perfect but a big improvement. There are monitors coming with more than a thousand zones, plus OLED will eventually get sorted out and probably replace everything.
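A toy model of that zone-versus-pixel trade-off: the 16 x 24 zone grid below is an assumption chosen only because it multiplies out to 384, not a confirmed description of the PG27UQ's actual layout.

```python
import numpy as np

def fald_backlight(luma: np.ndarray, zones=(16, 24)) -> np.ndarray:
    """Toy per-zone backlight for a 384-zone FALD panel (16 rows x 24 columns assumed):
    each zone is driven to the brightest pixel it has to show, so a dark pixel that
    shares a zone with a bright one ends up lit by a raised backlight (the grey/halo
    effect described above). luma is an (H, W) array of target luminance in nits."""
    h, w = luma.shape
    zr, zc = zones
    blocks = luma[: h - h % zr, : w - w % zc].reshape(zr, h // zr, zc, w // zc)
    return blocks.max(axis=(1, 3))          # one backlight level per zone

# A mostly black frame with one small bright highlight:
frame = np.full((2160, 3840), 0.5)          # near-black background, 0.5 nits
frame[1000:1100, 2000:2100] = 1000.0        # small 1000-nit highlight

levels = fald_backlight(frame)
print("zones driven bright:", int((levels > 1).sum()), "of", levels.size)
# Only the handful of zones containing the highlight light up; everything else stays
# dim, which is how 384 zones approximate per-pixel contrast without OLED.
```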