For a mere $300 you can see for yourself the Quantum Dot displays we have been hearing rumours about for a few years now. It remains an IPS panel and offers an impressive 99% Adobe RGB, or more than 100% of the standard sRGB colour gamut, at a price far below many professional-grade monitors. It delivers a brightness of 300 cd/m² and a dynamic contrast ratio of 20,000,000:1. It is a 16:9, 1080p display with a response time of 5 ms; perhaps not as impressive as a variable-refresh-rate or 4K monitor, but if accurate colour reproduction is what you need then this display will certainly be worth considering.
Fremont, California – March 15, 2016 – Today EPI (North America brand license partner for Philips Monitors) announces that the world’s first quantum dot-based monitor (E-line 276E6ADSS) is now available in North America. The new 27-inch monitor delivers 99% Adobe RGB color – 50 percent more color than traditional LED displays – thanks to Color IQ™ technology from QD Vision. The new E6 is ideal for entertainment, gaming, professional photography and design. It combines Color IQ optics with full HD resolution, resulting in a professional-quality display at the price of mainstream monitors. The Philips 27-inch Quantum Dot display is now available on Amazon for $299.
QD Vision's Color IQ™ solution uses an innovative new semiconductor technology called quantum dots to precisely and efficiently convert light, delivering bluer blues, greener greens and redder reds. The result — vibrant, dynamic, “you-gotta-see-it-to-believe-it” color.
Most of today's high-end monitors are capable of displaying only 95% of the Adobe RGB color gamut, while mainstream models are often limited to showing 70% of the Adobe spectrum. Using QD Vision's Color IQ solution, the Philips 27-inch Quantum Dot Display will deliver over 99% of the Adobe RGB spectrum – more than 100% of the standard sRGB color gamut – but at a fraction of the price of commercial displays.
The IPS-ADS display offers 1920 x 1080 resolution, 60Hz refresh rate, 1000:1 static contrast, 5ms response time and 178°/178° viewing angles, making it possible to view the display from almost any angle. Unlike standard TN panels, IPS-ADS displays give you remarkably crisp images with vivid colors, making them ideal not only for photos, movies and web browsing, but also for professional applications that demand color accuracy and consistent brightness at all times. Ports include VGA, DVI-D, HDMI (with MHL) and a 3.5mm audio output jack.
99% Adobe? Isn’t that, like, at least 12 bit? It’s essential for high-quality color and picture work, but for the vast majority of mainstream tasks it might actually hurt more than help. Mainly because any color range above 10 bits introduces color steps that simply are not supported by the vast majority of mainstream software (i.e. games and/or simple video editing programs), which usually results either in completely incorrect visual presentation or in the program outright refusing to work at all. Gamers and mainstream video/image editors really don’t need more than 10 bits, so anything above that is really suitable only for the professional segment of the PC user base (especially 14-bit panels).
Nope, that isn’t necessarily 12 bit. Adobe RGB only specifies the color temperature (white point) of a monitor and the chromaticities of the red, green and blue primaries.
8 bit, 10 bit, and 12 bit denote the number of steps between the minimum and maximum intensity of each primary.
Gamers are currently using 8 bit displays, not 10 bit displays. They are also using sRGB. The display industry is currently shifting toward the BT 2020 color space for the new UHD spec, which includes HDR capabilities. HDR requires more than 8 bits, and frankly should require more than 10 bits, in order to display without banding.
This display has the specs of a top display from 5 years ago. It isn’t what you would want to be gaming on 2 years from now.
Here is an image that shows the benefit of BT 2020 over sRGB and Adobe RGB. Basically, ever wonder why your monitor can’t display cyan? It displays a color that we call cyan, but it can’t display the actual measured color. In the image, the full colored region represents what the human eye can actually see. (The colors are a representation; your monitor cannot display all the actual colors the human eye can see.) The triangles represent the range of colors your monitor can display under each spec compared to what your eyes can see. sRGB covers roughly 35% of the colors your eye can distinguish, Adobe RGB roughly 50%, and BT 2020 roughly 75%. Cinema has used the DCI-P3 color space, which covers roughly 53%, since the start of digital cinema.
The web standard is currently limited to sRGB. (Browsers may choose to use information in videos and images to show different colorspaces, but the colors of fonts and CSS, anything that is controlled by HTML, are all sRGB.)
http://www.tvfreak.cz/ceka-uhd-stejny-osud-jako-hd-ready/5184/img/body-7.3B00.jpg
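If you want to put rough numbers on those triangles yourself, the math is just areas in the chromaticity diagram. Here is a quick Python sketch of my own using the published CIE 1931 xy coordinates of each space’s primaries; note that the exact coverage percentages people quote depend on which diagram (xy or u’v’) they measure in, so treat these ratios as illustrative:

```python
# Compare color-space gamut triangles in the CIE 1931 xy diagram.
# Quoted coverage percentages vary with the diagram used (xy vs. u'v'),
# so these area ratios are only illustrative.

# Published xy chromaticities of the red, green, blue primaries.
PRIMARIES = {
    "sRGB":      [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB": [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "BT.2020":   [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of a triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
base = areas["sRGB"]
for name, a in areas.items():
    print(f"{name:>9}: area {a:.4f} ({a / base:.2f}x sRGB)")
# BT.2020's triangle comes out roughly 1.9x the area of sRGB's.
```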
Banding is an interesting issue. In the examples you provide, the banding is caused by the program not having enough information to use all of the available color choices. We have had a fix for that for ages: in Photoshop and video editing software you just add a bit of random noise to break it up.
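Here’s a minimal numpy sketch of that trick (my own illustration, not from any particular program): quantize a smooth gradient to a few bits and you get bands; add a little random noise first and the bands dissolve into noise the eye averages back into a smooth ramp:

```python
import numpy as np

# A smooth horizontal gradient, values in [0, 1].
gradient = np.linspace(0.0, 1.0, 1024)

def quantize(signal, bits):
    """Round the signal to 2**bits levels (what a low-bit panel does)."""
    levels = 2 ** bits - 1
    return np.round(np.clip(signal, 0.0, 1.0) * levels) / levels

# Straight quantization: long runs of identical values -> visible bands.
banded = quantize(gradient, bits=4)

# Dithering: add noise about one quantization step wide, then quantize.
# The bands dissolve into fine noise instead of hard edges.
noise = (np.random.rand(gradient.size) - 0.5) / (2 ** 4 - 1)
dithered = quantize(gradient + noise, bits=4)

print("distinct values, banded:", len(np.unique(banded)))   # only 16 steps
print("mean abs error, banded: ", np.abs(banded - gradient).mean())
print("mean abs error, dither: ", np.abs(dithered - gradient).mean())
```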
In the case of BT 2020, banding is caused by the larger intermediate steps between any two colors. (I am not going to go into how color values work on displays.) Adobe RGB monitors also have this problem.
In the case of 8bit HDR, the display doesn’t have enough colors to display all of the information available. 10bit can still show some banding. The sweet spot for HDR is 12bit displays.
So what does HDR provide? Your eye can perceive roughly 1,000 separate light levels at any one time. The sRGB standard and the Adobe RGB standard only provide for displays to show about 100 different levels of light at any one time. The new HDR standards provide a minimum of 1,000 different levels of light at a time. We already see display manufacturers showing 2,000 levels or more.
Why is this important? For games, this means we can explore a more realistic scene, as well as having a larger playing field of colors and light. Games are art, and art has always pushed the boundaries. We will see games that look like we are looking through a mirror, and games that use only shades of orange or gray, because you will be able to differentiate between them to a much better extent.
Um…I think you’ve misunderstood my point somewhat. I’ve assumed it’s at least 12 bit, because lower-bit panels (like 6 bit + FRC and/or 8 bit) simply cannot manage that kind of Adobe color space coverage. They physically cannot do it, no matter how hard you try to calibrate them perfectly, and no matter what panel technology you use. 10 bits allows for much greater Adobe color space coverage, and 12 bits is usually where the “99%” Adobe coverage starts: the professional panels. Also, you’ve clearly ignored/missed what I said about incorrect visual representation and the typical consumer software problems that can arise with panels higher than 10 bits. I never said that gamers or typical mid-level video and image editors can’t buy/use such panels; my point was that they typically prefer to stay with 6 bit+FRC/8 bit/10 bit, mainly because those panels are considerably cheaper/more affordable for a wide consumer audience, and also simply because such panels guarantee there will be no inconsistencies in color representation and/or problems with typical consumer software. And because such panels hold the overwhelming share of the worldwide market, 12+ bit panels aren’t just very expensive but also quite rare and uncommon. That’s why an average gamer or average consumer-level video and image editor will usually go with a 6 bit+FRC/8 bit or 10 bit panel, but not with 12 bit or higher. People prefer to avoid the problems that can arise from the panel’s bit depth itself; that’s why they go with lower-bit offerings.
As I said before, color space coverage doesn’t have any direct relationship to bit depth.
The Adobe RGB standard defines the wavelengths of light for each of red, green and blue, and the min/max intensity of that wavelength. This has nothing to do with bit depth. Period.
I agree that consumers have gone with the cheaper offerings, but not that they care about compatibility with the display. Why? Because 10 bit and higher displays can operate in an 8 bit mode, and Windows does this by default unless you install the monitor’s ICC profile. I tend to think the reason is solely that the price difference until now has been 10-100x the price of cheap displays; that has had a larger effect on the purchase decision. Thanks to the price dropping, we will see which of these is true.
10 bit or higher displays have been expensive, but the price is dropping, and will continue to drop as UHD compliant displays are manufactured. UHD requires 10 bit minimum, with optional 12 bit on the high end.
And all of these displays will have an sRGB compatible mode, thanks to the work of the Movie industry.
Let’s explain the whole bit thing.
First, it has nothing to do with the color space. If you can make a 32 bit display, you can choose to adhere to the sRGB color space, the Adobe RGB color space, or even the ACES color space. (Hint: the ACES color space covers the entire human vision range, but it does this by starting with wavelengths that are outside our vision. You may be able to display things humans can’t see, but then, humans can’t see it.)
Instead, start with any of those color spaces. Let’s take red, with blue and green set to zero. For red to display at maximum intensity on an 8 bit display, the value given to the display is 255. On a 10 bit display that value is 1023.
Important!!! Those values correspond to the exact same intensity value. Your max doesn’t change.
For the lowest intensity, that value is 1 for both. (0 means none of that wavelength is displayed.) This value also represents the same intensity.
Here’s where it gets a little more complex. For an 8 bit display, a value of 2 represents 2/255 of the difference from the min intensity to the max intensity.
For the 10 bit display, 2 represents 2/1023 of the difference from the min intensity to the max intensity. That value is lower than the 8 bit display’s 2. All that raising the bit depth of a display does is give you more intermediate steps between the min and max intensity of the wavelengths.
There are problems, of course. We can get close approximations for images that are 8 bit on a 10 bit or higher display, but they are hardly ever the exact same color. This is a property of the math. In reality, this matters less than you would think.
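A tiny Python sketch (my own, just to make the arithmetic concrete) showing both points: code values are fractions of the same min-to-max range, and converting 8 bit codes to 10 bit codes almost never lands exactly on a step:

```python
# Code values mark fractions of the same min-to-max intensity range;
# more bits means finer steps, not a bigger range.
def fraction(code, bits):
    return code / (2 ** bits - 1)

print(fraction(255, 8), fraction(1023, 10))  # both 1.0: same max intensity
print(fraction(2, 8))                        # 2/255  ~= 0.00784
print(fraction(2, 10))                       # 2/1023 ~= 0.00196, a finer step

# Converting an 8-bit code to the nearest 10-bit code is almost never
# exact: 1023/255 isn't an integer, so most values land between steps.
exact = 0
for v8 in range(256):
    v10 = round(v8 * 1023 / 255)
    if v10 * 255 == v8 * 1023:  # true only when the fractions match exactly
        exact += 1
print(f"{exact} of 256 8-bit codes map exactly")  # only 0, 85, 170, 255 do
```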
Then we of course add blue and green into the mix, and they work the same way. This works in a linear fashion. But our eyes don’t really work that way. It sees differences in orders of magnitude more easily than these steps. (This is a vast oversimplification.)
So for sRGB, we add something called a gamma curve. For the new UHD standard and BT 2020, there is a compatibility layer that handles this gamma for you, so all your old stuff still works. Why did they do this? Because they came up with something better, one that takes into account technology changes and advances instead of inheriting from the 1950s NTSC standard. (That’s right folks, your display has been using a standard, sRGB, rooted in the early days of television.)
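For the curious, the sRGB curve itself is public and easy to write down: a short linear segment near black, then a 1/2.4 power curve. A minimal Python version of the encode side:

```python
def srgb_encode(linear):
    """sRGB transfer function: linear light in [0, 1] -> encoded [0, 1].

    The short linear toe near black avoids an infinite slope at zero;
    the rest is a 1/2.4 power curve (roughly "gamma 2.2" overall).
    """
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

# Mid-grey linear light encodes to about 0.74, i.e. code ~188 of 255.
# That's the point of the curve: spend more codes where the eye is
# most sensitive (the darks) instead of spacing them linearly.
print(round(srgb_encode(0.5) * 255))
```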
What I tried to explain last post was the difference between the color space and the bit depth. I hope I have done a better job this time.
Excellent explanation Bill!
Excellent explanation Bill! Thanks for taking the time to educate us!
You could cover Adobe RGB with 4 bits per channel if you wanted, although there would be a lot of banding, as the representable colors would be fairly far apart. The bit depth has nothing to do with the color space.
There is one big difference though. The new 4K Blu-ray spec includes HDR video; this monitor isn’t 4K, but it will be able to take advantage of the potential for increased dynamic range. From what I have read during the last couple of CES shows, the increase in colour gamut with HDR video is more noticeable than the move from 1080p to 4K.
Also, I just remembered that Netflix already offers HDR video streaming for shows like House of Cards, Marco Polo, etc.
Based on the listed static contrast ratio and maximum brightness, this display is unlikely to support HDR content.
This display supports the Adobe RGB color space. HDR content will be authored for the BT 2020 color space, or the sRGB color space. This display won’t receive optimized content.
HDR is different than color space. It refers to the number of brightness levels the display can handle. For the true HDR experience, you want 1000 nits. This display doesn’t support that either.
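To put rough numbers on “brightness levels”: HDR10 video encodes absolute luminance with the SMPTE ST 2084 “PQ” curve. Here is a quick Python sketch of the published formula, using full-range 10 bit codes for simplicity (real video signals use a narrower “legal” code range):

```python
# SMPTE ST 2084 "PQ" curve used by HDR10: absolute luminance in nits
# -> a 10-bit code. Constants are the published values from the spec.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code(nits, bits=10):
    """Encode luminance (0..10000 nits) as a full-range PQ code value."""
    y = (nits / 10000.0) ** M1
    signal = ((C1 + C2 * y) / (1 + C3 * y)) ** M2
    return round(signal * (2 ** bits - 1))

for nits in (0.1, 1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> code {pq_code(nits)}")
# Roughly half the 10-bit codes sit below ~100 nits (SDR territory);
# the rest cover the highlight range SDR displays can't show at all.
```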
This display would be a bad long-term investment. Unless you buy monitors every 2-3 years; then this is OK if you need slightly higher-end specs now. If not, wait a few more months and check out UHD-compatible displays. (There should be some displays that come out that meet all the UHD specs but at lower resolutions, meaning they can’t be certified, but they give you all the advantages without requiring 4 Fury X / Titan X cards.)
27″ and a pathetic 1080p res? OK, maybe if you are legally blind, but no serious pro would use such a low-res screen. My 27″ screens are 1440p and I’ll be looking to 4K or 5K in the next 1-2 years.
Hardware needs to catch up to the new colour standards, especially consumer video cards, and we need to purge the world of crappy 8 bit and sRGB.
For $299? The fact that QLEDs will be able to provide this level of color at that price is what is interesting, as well as what the high-end models will look like.
It’s Quantum Dots, d00d. That is both expensive as hell AND hard to make at extremely high resolutions, so, typically, a 1920×1080 QD panel is priced at least twice as high as its IPS counterpart, because QD is still essentially an experimental technology (just like OLED is) and the yields are very small. It’s hard as F to make a truly good 2560×1440 QD-based panel; that’s why they’ve decided to take small steps, going “1920×1080 @ 27″” first. 2560×1440 and 4K QD panels should come in the next couple of years, but as of right now the technology is still too raw and unpolished for it to be both cheap and mass-produced enough for the industry to go above 1920×1080. It’s pretty much the same situation as with IPS in the mid-2000s. Just try to remember how many years passed before the industry actually started releasing decent 2560×1440 IPS panels in big quantities, not even mentioning actually GREAT panels (not just “decent”) with low response times and so on…you’ll be surprised.
Dat giant playschool-toy-sized plastic bezel, fixed-height stand, 1080p on a 27″. Nope.
Even with this monitor’s shortcomings, which honestly are just whiny rich-kid complaints that don’t really matter at all, this sets an important precedent for improved color reproduction at an affordable price. The average LCD still isn’t anywhere near even the average CRT from back when we were transitioning to LCD and they still made CRTs. This should help actually advance display technology that has been stagnant for like 10 years.
There is misinformation in a number of these comments. This monitor uses quantum dots to expand the color gamut of the display. That’s it. This is just a wide-color-gamut IPS monitor. HDR, good black levels, fast response times, etc. aren’t guaranteed features; they all need to be correctly implemented by the manufacturer on a case-by-case basis.
Quantum dots being released in consumer products is a good thing, as it will push the adoption of BT 2020 and HDR content if quantum dots are able to compete with WLED in consumer monitors. OLED is still the panel type you’ll want in 5-10 years, when the technology has matured.
I’m looking for a monitor that will not trigger migraines in my daughter. I think perhaps it is from the backlighting and possibly local dimming. If I am correct, many monitors (all?) use pulse-width modulation (PWM) to regulate brightness, and I think this may be the issue for her.
Does this qdot monitor use PWM backlighting, or is it lit by the dots’ photoluminescence?
Thanks!