Dmitry Novoselov of Hardware Canucks saw an NVIDIA SDR vs HDR demo, presumably at Computex based on the timing and the intro bumper, and noticed that the SDR monitor looked flat. According to his post in the YouTube comments, he asked NVIDIA for access to the monitor settings, and they let him… and he found that the brightness, contrast, and gamma settings were way off. He then performed a factory reset to test how the manufacturer defaults hold up in the comparison, and based his video on those results.
I should note that video footage of HDR monitors will not accurately convey what you would see in person. Not only is the camera not HDR, and thus not capable of capturing the full range of what the monitor is displaying, but who knows what the camera's (and later the video processing's) exposure and color grading will actually correspond to. That said, he was there and saw it in person, so his eyewitness testimony is definitely valid, though it may or may not focus on the qualities you care about.
Anywho, the test was Mass Effect: Andromeda, which has a native HDR profile. To his taste, he apparently prefers the SDR content in a lot of ways, particularly how the blown-out areas behave. He says he's concerned about game-to-game quality, because there will be inconsistency between how one color grading professional chooses to process a scene versus another, but I take issue with that. Even in the standard color range, there is always an art director who decides what looks good and what doesn't.
They are now given another knob, and it’s an adjustment that the industry is still learning how to deal with, but that’s not a downside to HDR.
100% fabricated…. HDR is so hyped and could have been exposed years ago.
First: black is black. Zero black is zero black, SDR or HDR.
Next is white. Most people set the max brightness on SDR to be comfortable to look at with 100% white levels.
So we configure 100% white to be comfortable by reducing the brightness level.
HDR simply allows you to keep your monitor at 100% brightness, but 100% white is now mapped to maybe 60% instead of 100%.
60% white at 100% brightness will be equal to 100% white at 60% brightness.
The difference: in one case you can go past 60%, and you will get very strong bright levels.
Maybe it's not clear… but it's possible to turn any monitor with enough brightness into an HDR monitor. You only need to configure your tone mapping accordingly.
Some very old monitors can actually deliver an amazing HDR experience.
But the best you could get is via a super-bright OLED with 10-bit.
If you have an SDR monitor with 10-bit, it could be turned into a stunning HDR display.
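The brightness arithmetic in this comment can be sketched in a few lines of Python. This is illustrative only: `emitted_nits` is a made-up helper, the numbers are hypothetical, and it ignores gamma/EOTF curves entirely.

```python
# Sketch of the commenter's claim: emitted light is (roughly) the
# signal fraction times the panel's effective peak brightness.
# All names and numbers are illustrative, not from any real spec.

def emitted_nits(signal_fraction: float, peak_nits: float) -> float:
    """Approximate emitted luminance for a linear signal fraction."""
    return signal_fraction * peak_nits

# SDR setup: a hypothetical 1000-nit panel with brightness turned
# down to 60%; "100% white" content uses the full signal range.
sdr_white = emitted_nits(1.0, 0.60 * 1000)   # ~600 nits

# HDR-style setup: panel stays at full 1000 nits, but diffuse white
# is tone-mapped to only 60% of the signal range.
hdr_diffuse_white = emitted_nits(0.60, 1000)  # ~600 nits

assert sdr_white == hdr_diffuse_white  # same paper-white level...

# ...but the HDR setup keeps headroom above diffuse white,
# so highlights can go much brighter:
hdr_highlight = emitted_nits(1.0, 1000)       # 1000 nits
```

That last line is the "you can go past 60%" point: both configurations show identical diffuse white, but only the second one has anything left over for specular highlights.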
I mean… you’re not wrong, but it’s a good thing that we finally have some sort of standard to rally behind (which means native HDR content) and a push for the acceptance of 10-bit panels at the consumer level.
I mean, if you put it in these simple terms, sure. Though I’m not sure how a monitor rated for 350 nits could possibly match one that’s rated for 1000. But besides brighter brights, HDR also means blacker blacks and a (much) wider colour gamut, which is most certainly something modern LCD displays could use. It will take a while for things to standardize and for display technology to fully match optimal HDR specs, sure. And for devs to learn all the tricks of the HDR trade.
As I see it, HDR monitors are worth it for the long-awaited debut of full-array local dimming alone. Since we’re seemingly years away from affordable OLED monitors, FALD is the next best thing.
Long-awaited debut of FALD? That was on TVs a decade ago, along with RGB LED backlights. TVs and monitors have gone to shit since those days, when everything became edge-lit W-LED trash.
HDR does not mean a wider color gamut. Wider-gamut standards like Rec.2020, which is also 10- or 12-bit with provisions for HDR, do.
High dynamic range just means you’ll see more detail in the dark and light areas of a single scene.
I’m well aware of the fact that FALD used to be pretty common in LCD TVs years ago. Not in monitors, which was the point I was making. So, yes, “long-awaited debut” is a perfectly legit way of putting it. I know I’ve been waiting for a good while.
HDR does mean a wider color gamut, for the simple fact that it’s the display tech spearheading the push toward 10-bit+ panels in consumer LCD displays. Of course you can have a 12-bit panel that isn’t “HDR”, but that’s, again, beside the point.
FALD RGB LED monitors do exist currently and have for a while, but they’re all pro monitors like the Dolby PRM-4200 and 4220, the Canon DP-V3010, and I believe the Sony PVM-X300.
Apparently Panasonic has a new light-modulating technology for IPS panels that sits between the backlight and the LCD panel to allow pixel-level local dimming. The Eizo ColorEdge CG3145 is apparently using it, but I’m not sure how long it will be before that trickles down to consumer-level stuff. The Dell UP2718Q also has 384 dimming zones.
Like the guy in the video said, though, there are complications inherent to HDR, especially since there isn’t yet a widely adopted standard for implementing it, and there are several competing HDR-inclusive standards.
It will probably take a while before sRGB goes away, but I agree: the sooner the better! It’s sad that you can get phones like the S6, S7, and S8 with 2560×1440 OLEDs with incredible dynamic range, brightness, and perfect sRGB and Adobe RGB profiles, but you have to spend thousands to get a similarly performing monitor.
HDR is another diversion that allows TV and monitor makers to continue doing what they have been doing since WCG-CCFL and RGB LED backlit TVs and monitors were replaced, almost a decade ago, by W-LED backlit garbage: selling cheap-to-manufacture monitors with terrible color compared to the older, better, more expensive to manufacture tech.
A decade ago you could buy a WCG-CCFL or FALD RGB LED backlit TV or monitor for the same price that a gimmicky edge-lit W-LED TV or monitor sells for today.
Those old TVs and monitors rival the image quality of the best current TVs and monitors with a few exceptions:
quantum dot enhanced LED
All of those are considered “exotic” and expensive tech, and it’s pathetic that my 10-year-old Sony Bravia TV looks better than the majority of TVs on the market today, with the exception of the OLED or QD ones.
We need Rec.2020 OLED, QD or RGB LED TVs and monitors already.