Home Theater Review

What You Need to Know About HDMI 2.0

In the third and final installment of our "State of Ultra HD" overview, we look at HDMI 2.0. We've already discussed the better color of quantum dots and the better contrast of high dynamic range. Now it's time to talk about that little cable that will link your new UHD TV to all your other gear--you know, the one that promises such simplicity in its setup but often creates such confusion in its execution.

The first-generation 4K Ultra HD TVs used HDMI 1.4, which could only pass a 4K signal at up to 30 frames per second and couldn't do 4K 3D. Then HDMI 2.0 arrived, promising compatibility with 4K at 60 fps, 3D support, and lots of other goodies. You'll now see "HDMI 2.0" attached to almost every recent UHD TV and AV receiver/processor, so we're good, right? We have everything we need for future 4K content, right?

Not so fast. The fact is, not every HDMI 2.0-labeled device is fully capable of handling every aspect of Ultra HD. You may have seen questions from commenters on this site asking if a certain UHD device includes HDCP 2.2 or can handle 4K/60 with a 4:4:4 color space. These features are missing from many early HDMI 2.0 devices. Why? Well, let's address each issue separately.

First, the issue of color space. To understand what a 4:4:4 color space is, check out this simple, concise explanation on chroma subsampling. The very short summary is that the human eye isn't as good at seeing color resolution (chroma) as it is at seeing brightness/black-and-white resolution (luma), so it's acceptable to compress the color within a video signal when you need to save bandwidth. "4:4:4" describes the full signal with no color compression, while 4:2:2 and 4:2:0 describe increasing degrees of color compression (with 4:2:0 being the most compressed).
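To make those ratios concrete, here's a small Python sketch (my own illustration, not from the article) that counts the luma and chroma samples carried for a 4x2 block of pixels under each scheme:

```python
# Samples carried per 4x2 block of pixels under each chroma subsampling
# scheme (J:a:b notation). Luma (Y) stays at full resolution; only the
# two chroma planes (Cb and Cr) are subsampled.

def samples_per_4x2_block(scheme):
    """Return (luma, chroma) sample counts for a 4-wide, 2-high pixel block."""
    luma = 4 * 2  # one Y sample per pixel, always
    chroma_per_plane = {
        "4:4:4": 4 * 2,  # chroma at full resolution
        "4:2:2": 2 * 2,  # chroma halved horizontally
        "4:2:0": 2 * 1,  # chroma halved horizontally and vertically
    }[scheme]
    return luma, 2 * chroma_per_plane  # two chroma planes: Cb and Cr

def relative_bandwidth(scheme):
    """Total samples as a fraction of the uncompressed 4:4:4 signal."""
    luma, chroma = samples_per_4x2_block(scheme)
    full = sum(samples_per_4x2_block("4:4:4"))
    return (luma + chroma) / full

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(s, round(relative_bandwidth(s), 3))  # 1.0, 0.667, 0.5
```

In other words, 4:2:2 carries two-thirds of the samples of the full signal, and 4:2:0 carries half--which is exactly why 4:2:0 is so attractive when bandwidth is tight.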

Needless to say, a 4:4:4 signal requires a lot more bandwidth than 4:2:0, which is why our current Blu-ray, DVD, and HDTV broadcast systems all use 4:2:0 color compression, and your source device or TV has to convert it back up to a full 4:4:4/RGB signal. Check out the diagram below, provided by DVDO, which shows the bit rate required for 1080p and 4K signals at various refresh rates and subsampling ratios. Notice the big jump in bit rate between a 3,840 x 2,160 signal at 60 fps and 4:2:0 and a 3,840 x 2,160 signal at 60 fps and 4:4:4--from 8.9 Gbps up to 17.82 Gbps.
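You can reproduce those chart numbers with some back-of-the-envelope math. This Python sketch is my own illustration (not DVDO's methodology); it assumes the standard CTA-861 total timing for 4K/60 of 4,400 x 2,250 clocks per frame (active pixels plus blanking) and HDMI's TMDS encoding, which expands every 8 data bits to 10 bits on each of three channels. HDMI 2.0's 4:2:0 mode packs the half-size signal so the link runs at half the clock rate.

```python
# Back-of-the-envelope HDMI link-rate math for 3,840 x 2,160 at 60 fps.
# Assumptions: CTA-861 total timing of 4400 x 2250 clocks per frame
# (active pixels plus blanking) and TMDS 8b/10b encoding on 3 channels.

TOTAL_H, TOTAL_V, FPS = 4400, 2250, 60
TMDS_CHANNELS, BITS_PER_CHANNEL = 3, 10  # 8 data bits sent as 10 on the wire

pixel_clock_hz = TOTAL_H * TOTAL_V * FPS             # 594 MHz for 4K/60
link_rate_444 = pixel_clock_hz * TMDS_CHANNELS * BITS_PER_CHANNEL
link_rate_420 = link_rate_444 / 2                    # 4:2:0 mode halves the clock

print(f"4K/60 4:4:4: {link_rate_444 / 1e9:.2f} Gbps")  # 17.82 Gbps
print(f"4K/60 4:2:0: {link_rate_420 / 1e9:.2f} Gbps")  # 8.91 Gbps
```

That works out to 17.82 Gbps for 4:4:4 and 8.91 Gbps for 4:2:0 (the chart rounds the latter to 8.9), matching the figures above.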

[DVDO-HDMI-chart.jpg: DVDO's chart of bit rates for 1080p and 4K signals at various refresh rates and subsampling ratios]

The problem is, when the HDMI 2.0 spec was originally released, no chips or hardware actually supported the higher 18-Gbps/600MHz rate of HDMI 2.0. So how is it that products arrived on the market with the HDMI 2.0 designation? Good question. Since almost all the new features in the HDMI 2.0 spec are written as optional, a manufacturer needed only to add a single new feature--like support for 4K/60 at the 4:2:0 color space--and then complete the certification process through a test house like Simplay Labs to earn the HDMI 2.0 label. The beauty of 4K/60 at the 4:2:0 color space was that it could be passed over existing chips designed for the previous HDMI 1.4 spec and its 10.2-Gbps/300MHz rate. Rather than wait for a new HDMI chip to arrive, many manufacturers forged ahead on this path.
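That workaround boils down to a simple ceiling check, sketched below in Python (my own illustration, using the link rates and spec limits quoted above): 4K/60 at 4:2:0 squeezes under HDMI 1.4's 10.2-Gbps budget, while 4K/60 at 4:4:4 needs the full HDMI 2.0 rate.

```python
# Why 4K/60 4:2:0 could ride on HDMI 1.4-era silicon: a ceiling check
# against each spec's maximum TMDS link rate (all figures in Gbps).

HDMI_1_4_MAX = 10.2   # Gbps
HDMI_2_0_MAX = 18.0   # Gbps

# TMDS-encoded link rates for 3,840 x 2,160 at 60 fps
signals = {"4K/60 4:2:0": 8.91, "4K/60 4:4:4": 17.82}

def min_spec_needed(gbps):
    """Return the lowest HDMI spec whose link budget fits the signal."""
    if gbps <= HDMI_1_4_MAX:
        return "HDMI 1.4"
    if gbps <= HDMI_2_0_MAX:
        return "HDMI 2.0 (full 18 Gbps)"
    return "beyond HDMI 2.0"

for name, rate in signals.items():
    print(f"{name}: fits within {min_spec_needed(rate)}")
```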

Now we are seeing the arrival of HDMI chips that support the higher 18-Gbps/600MHz rate. The receiving chips for display devices arrived first, and the transmitting chips for AV processors, switchers, and the like have since followed. So this year's UHD devices are more likely to support 4K/60 at the full 4:4:4 color space. But what if you own a device that doesn't? Some manufacturers are providing upgrade paths for their gear; others are not. Is the inability to pass or receive 4K/60 at 4:4:4 really a huge issue? Not necessarily, at least in the short to mid term. As I said earlier, current Blu-ray and HDTV standards use the 4:2:0 color space, and it looks like the upcoming Ultra HD Blu-ray spec will follow suit (check out this report showing the "leaked" Ultra HD spec). In time, our distribution methods may evolve to 4:2:2 or full 4:4:4, but we're probably years away from that.

The second issue is one of copy protection, and this is where your first- or second-gen UHD gear may hit a roadblock. HDCP is the copy protection method used between HDMI devices for a secure digital transfer. HDCP 2.2 is the latest iteration, which will be implemented on future Ultra HD Blu-ray devices and other 4K-capable set-top boxes. If you buy an Ultra HD source device with HDCP 2.2 in place, then every HDMI device in your chain--be it your AV receiver, video switcher, sound bar, or TV--also needs HDCP 2.2 to establish that all-important handshake for viewing UHD video.

The first-gen Ultra HD TVs did not include HDCP 2.2 because it wasn't available yet, and adding it requires a hardware upgrade, not just a firmware update. HDCP 2.2 was included on many of last year's UHD displays, although not necessarily on every HDMI input. Most of this year's promised offerings from the major TV manufacturers have it, too--but be more careful with the smaller, lesser-known brands.

On the electronics side, Onkyo was one of the first AV receiver manufacturers to add limited support for HDCP 2.2 in 2014, and this year the copy protection method will appear on more devices. Some companies like Denon, Marantz, and NAD have promised an upgrade path to add HDCP 2.2 to some of their most recent products. It's best not to assume anything, though--be sure to look for HDCP 2.2 support if you plan to embrace upcoming 4K sources.

One possible workaround for those who don't own HDCP 2.2 audio components is a 4K player with a dual-output design. We can hope that some early Ultra HD Blu-ray players will offer two HDMI outputs: one with HDCP 2.2 to connect directly to your UHD display and another audio-only output to connect with your "older" audio equipment. (This is how a similar compatibility issue was addressed when 3D arrived.) We'll find out soon enough, as the official Ultra HD Blu-ray spec should be announced by early summer at the latest, and we expect details on the first players to follow shortly thereafter.

It certainly looks like 2015 is the year that Ultra HD will make major headway on both the content and equipment sides of the equation. Hopefully over the past few weeks, we've provided you with some insight to help make a more informed purchasing decision.

Additional Resources
The Good, Better, and Best HDTVs on the Market Today at HomeTheaterReview.com.
Four Reasons Why Ultra HD Is Becoming More Relevant to Consumers at HomeTheaterReview.com.
Is Your Ultra HD TV REALLY an Ultra HD TV? at HomeTheaterReview.com.
