The HDMI Forum has introduced an update to the HDMI specification, bringing the video standard to version 2.1. The updated specification, along with its accompanying new "48G" (48 Gbps) HDMI cable, brings support for higher resolutions, refresh rates, and color spaces along with new features such as dynamic HDR, a variable refresh rate "Game Mode VRR", and eARC for audio device detection and object-based audio (e.g. Dolby Atmos).
Specifically, HDMI 2.1 adds support for 8K resolutions at up to 60 Hz and 4K at up to 120 Hz, along with HDR (high dynamic range). The specification is even a bit forward-looking in that it allegedly supports 10K50/60/100/120 modes! The 8K@60 and 4K@120 (and higher) profiles do require the new 48 Gbps cable, though lower resolutions can still get by with the older High Speed cable. The specification also supports BT.2020 color spaces with 10, 12, and 16 bits per color component, which I expect Ken and Allyn will appreciate.
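As a rough sanity check on those cable requirements, here is a back-of-the-envelope sketch (my own arithmetic, not figures from the specification; a real link also carries blanking intervals and line-coding overhead on top of these raw pixel rates):

```python
def raw_rate_gbps(width, height, refresh_hz, bits_per_component):
    """Uncompressed 4:4:4 video data rate in Gbps (active pixels only,
    ignoring blanking and encoding overhead, which add roughly 20-30%)."""
    return width * height * refresh_hz * bits_per_component * 3 / 1e9

HDMI_2_0_LIMIT = 18.0   # Gbps, the older High Speed-class link
HDMI_2_1_LIMIT = 48.0   # Gbps, the new 48G cable

modes = {
    "4K60 8-bit":   (3840, 2160,  60,  8),   # fits the older cable
    "4K120 10-bit": (3840, 2160, 120, 10),   # needs the 48G cable
    "8K60 8-bit":   (7680, 4320,  60,  8),   # needs the 48G cable
}
for name, (w, h, hz, bpc) in modes.items():
    rate = raw_rate_gbps(w, h, hz, bpc)
    cable = "48G cable" if rate > HDMI_2_0_LIMIT else "High Speed cable"
    print(f"{name}: {rate:.2f} Gbps raw -> {cable}")
```

Even before overhead, 4K120 and 8K60 blow well past the 18 Gbps of HDMI 2.0, which is why the new cable is mandatory for those profiles.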
Perhaps the most interesting new feature, though, is the Game Mode VRR, which appears to be HDMI's take on DisplayPort's Adaptive Sync (which AMD uses for FreeSync). At last year's CES, AMD was showing off FreeSync over HDMI (video) as an extension of the specification. It now appears that HDMI is rolling some manner of that variable refresh technology into the base HDMI 2.1 specification. Variable refresh rate support in HDMI is a good thing, as it means that future game consoles may see their own FreeSync/G-Sync-like variable display output options. I do not see game consoles and living room devices (TVs, receivers, et al.) adopting DisplayPort any time soon, if only because of the huge install base and foothold HDMI has in that market.
Notably, HDMI 2.1 remains backwards compatible with earlier specifications, cables, and devices based on older HDMI standards, including the Ethernet channel and inter-device communication. Existing devices will be able to use HDMI 2.1's 48 Gbps cables but will not gain all of the new features (partial new-feature support might be possible via firmware updates, though that is in no way guaranteed).
The new specification is expected to officially drop in early Q2 2017 at which point it will be available to all HDMI Adopters for testing.
I estimate that, following the compliance testing and device QA, products using the new specification should start shipping as soon as next year (at CES 2018, perhaps!). It is harder to say when graphics cards or game consoles will start supporting the new output, though. I would hope that AMD and NVIDIA would be able to sneak it in before Vega- and Volta-based cards launch, respectively, but the timing may not have lined up like that. On the game console side of things, Microsoft and Sony have already launched their revised consoles this year, save Scorpio, so it might be a while before they sport variable refresh. Perhaps JoshTekk and the crew will have some thoughts on the podcast next week!
What are your thoughts on HDMI 2.1? Will it lay the groundwork for interesting displays and better living room gaming?
PC Perspective's CES 2017 coverage is sponsored by NVIDIA.
Follow all of our coverage of the show at https://pcper.com/ces!
It may force Nvidia to support 'open' sync and open up the market for FreeSync monitors to Nvidia users. I would love to see an open WhateverSync standardized and shared by all by this time next year, so that you don't have to buy a monitor in correlation with your GPU.
Dawid Igras I completely agree with you.
The g-sync / freesync battle needs to end.
Bad for consumers.
From what I’ve read device manufacturers can even software patch HDMI 2.0 devices to support VRR…
HDMI is the worst at naming conventions. Try to search for it and you’ll get a lot of audio components.
Maybe if you try qualifying the HDMI acronym with the word "Standard", as in the phrase "HDMI Standard", your next attempt at Google-Fu may be more successful! Be warned that Google's AI is intentionally engineered to steer you towards merchants rather than any standards organization's materials and white papers. It's an eternal uphill battle fighting Google's search AI, which places ad/merchant search results over any academic or other type of online research!
I would even try Google-Fu-ing "HDMI 2.1 Standard" to better do battle with Google's search AI! It's not really the HDMI standards organization's fault!
P.S. The author of this article did provide a link to the HDMI Forum, but the author should have properly labeled the hot-link "HDMI Forum's website" instead of just underlining some non-descriptive text ("introduced an update" [underlined]) that was part of his article's wording. It's not a very good writing style; it's part of the confusing and nonstandard writing methodology that has come to dominate the online press. The author should be commended for at least underlining any hot-links and NOT merely highlighting them in color, as that can and does have a definite ill effect on those who are color blind. Are you listening, management of NPR's website!
I also placed the link in the source field (lower right corner of the article pane, just above the comments). 🙂 I'll keep that in mind though!
I looked through the HDMI consortium FAQ material and am still unclear on how they are accomplishing the added bandwidth. Are they adding signal lines to the cable and cramming them into unused space on the connector or are they just running the existing signal lines wayyyy faster?
48Gbps. If you read it carefully they state that they support 8K and 10K VIDEO (which already uses chroma compression 4:2:0) up to 120Hz (HFR+HDR). When they say “uncompressed”, they mean that the data transport stream is not compressed when transmitting compressed chroma video. (think about it as the difference between emailing a RAW file vs a JPEG file vs a RAW file in a ZIP file).
What HDMI 2.1 will do is provide uncompressed HDR 4K 120Hz 10-bit using an uncompressed stream. But when you send video (4:2:0), the bit rate is literally halved by the chroma compression, so the stream itself doesn’t need compression.
If you just do the math, you can figure it out as well.
48 Gbps limit:
14.93 Gbps = 4K 8-bit 60Hz 4:4:4
17.92 Gbps = 4K 10-bit 60Hz 4:4:4
29.86 Gbps = 4K 8-bit 120Hz 4:4:4
35.83 Gbps = 4K 10-bit 120Hz 4:4:4
and so on…
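The figures above can be reproduced with a few lines of code. They appear to follow a simple pattern: width × height × refresh × 3 components × (bits per component + 2), i.e. two extra coded bits per component for 8b/10b-style line coding. That reconstruction is my guess at the method behind the numbers, not an official HDMI formula. The same function also illustrates the 4:2:0 halving effect described above:

```python
def link_rate_gbps(width, height, refresh_hz, bpc, chroma_factor=1.0):
    """Reconstruction of the bit rates quoted above (a guess at the
    method, not an official HDMI formula): each color component is
    assumed to carry 2 extra coded bits (8b/10b-style overhead).
    chroma_factor=1.0 means 4:4:4; 0.5 models 4:2:0, which carries
    half the samples of 4:4:4."""
    coded_bits_per_pixel = (bpc + 2) * 3
    return width * height * refresh_hz * coded_bits_per_pixel * chroma_factor / 1e9

# 4:4:4 figures, matching the list above:
print(round(link_rate_gbps(3840, 2160, 60, 8), 2))    # 14.93
print(round(link_rate_gbps(3840, 2160, 60, 10), 2))   # 17.92
print(round(link_rate_gbps(3840, 2160, 120, 8), 2))   # 29.86
print(round(link_rate_gbps(3840, 2160, 120, 10), 2))  # 35.83

# 4:2:0 video halves the rate, which is how high-refresh 8K/10K
# modes can fit under the 48 Gbps ceiling without stream compression:
print(round(link_rate_gbps(3840, 2160, 120, 10, chroma_factor=0.5), 2))  # 17.92
```

All four 4:4:4 numbers land exactly on the quoted figures, which suggests this is how they were derived.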
Interesting information, thanks for sharing!
Yes I know it’s 48Gbps. But how are they doing it? More lanes/conductors on the cable/connector or faster signaling rates? or both?
There doesn’t seem to be much detail available yet, but the connector hasn’t changed so it must be faster signal rates.
Yes, the pin-outs for both the cables and the ports, with some nice graphics and those pins labeled! Man, I wish they would add some file transfer abilities; that's some bandwidth.
Also, there is HDMI Alternate Mode for USB Type-C (the Wikipedia entry has a pin-out graphic for Alt Mode HDMI over Type-C):
“HDMI” [Wikipedia]
https://en.wikipedia.org/wiki/HDMI
Isn’t thunderbolt 3 supposed to be 40 Gb/s?
What does HDMI Alt Mode over Type-C have to do with TB3 over Type-C? USB Type-C is a plug form factor and electrical standard from the USB-IF. TB3 over Type-C is its own use case, and the first product ever to use TB was the Sony Vaio laptop that used a USB-A plug form factor/port receptacle to host the TB pins and connect to an external Sony GPU box.
The first Intel TB testing was through a USB Type-A connector, before Apple went with a Mini-DP type of form factor plug to use with Apple's TB MacBooks. There is even a native DisplayPort Alt Mode over the Type-C plug standard for delivery of DP directly to external monitors that support the Type-C form factor and DP Alt Mode, no TB3 required. Using the USB-IF's Type-C plug standard allows for that power delivery along with whatever Alt Mode DP, HDMI, or other native protocol signaling is supported. TB3 is a tunneling protocol that can alternatively tunnel Ethernet (10 Gb), DisplayPort, HDMI, PCIe, etc. over the TB3 protocol, but some of the other Alt Modes are for running native DisplayPort, HDMI, etc. signals over Type-C plugs/receptacles that have no TB3 feature sets or ability.
What are you rambling about?
“Man I wish they would add some file transfer abilities that’s some bandwidth.”
While 40 Gb/s is slower than 48 Gb/s, it is pretty close.
“Isn’t thunderbolt 3 supposed to be 40 Gb/s?”
It’s 40 Gb/s and HDMI 2.1 is 48 Gb/s, so full HDMI 2.1 is not going to be able to run over TB3! What are you rambling about?
No not close at all!
Man I wish they would add some file transfer abilities that’s some bandwidth!
Good luck transferring files over HDMI!!!!!!!!!!!!
Will Game VRR require anything other than HDMI 2.1 and the new cable to work? If not, adaptive sync monitor producers are going to take a haircut on their margins; the party will be over for sure.
The HDMI 2.1 controllers (chipsets) will be the only required display hardware, much like how the DisplayPort/HDMI controllers are the only hardware required inside current FreeSync monitors. No special G-Sync-like module required, no extra cost markup required. According to the information they provided, using the game mode will bypass all other settings to provide the lowest possible latency.
I’m sure nvidia will find a way to make their iteration a premium feature of some kind, maybe even take the credit and call it GSync2 or something. I’d like to know if FreeSync 2 was built with HDMI 2.1 in mind. After all, AMD first showed FreeSync over HDMI 1.4 years ago.
Nvidia? Support something that goes against GSync and it is free? Yeah, right. We saw it with VESA’s Adaptive Sync.
AFAIK Nvidia supports the 3D TV standard for an extra $50 or so. This time it might be an extra $100, I think.
So dumb that no one is interested in passive 3D gaming.
scRGB wasn’t being used but BT.2020 will be? Rofl