I wrote a post on my personal website taking a look at 4K from a filmmaking/workflow perspective, a post that subsequently sparked some debate amongst my filmmaker friends. While that topic may not wholly align with our enthusiast needs here at Home Theater Review, the concept behind it is one that I feel many of you may find interesting, even a little enlightening. As many of you know, 4K is coming, and soon, and while some of you may be excited at the prospect of more pixels enriching your viewing experience, I ask: why now? You see, going off of what I've seen, read and been told, it appears that our soon-to-be-4K universe is going to look an awful lot like our current HD one. Now, what if I told you that no one at the mass-market consumer level (that's us) has even seen HD? It's true.
If you ask any home theater enthusiast what the best HD source is, he or she is bound to tell you it's a Blu-ray player spinning a Blu-ray disc. However, what if I told you that Blu-ray was but a watered-down version of HD, and that there was still more to the experience yet to be tapped? Our current Blu-ray standard calls for 1920x1080 pixels with 8-bit color and 4:2:0 chroma subsampling in the Rec. 709 color space, with a maximum video bit rate of around 40 Mbits per second, using an MPEG-family compression scheme. That sounds good, until you begin to look at what today's modern HDTV displays are actually capable of displaying. For instance, many (if not most) HDTVs nowadays are 10-bit-capable devices, and professional computer monitors often go further still. We choke the life out of the color, and consequently the image, by feeding them a steady diet of compressed 8-bit color, something that more than likely will not change when we welcome 4K into our homes in 12 to 24 months.

Going further down the rabbit hole, our modern HD displays are capable of displaying the larger DCI color space, yet due largely to broadcast standards and constraints, we only use the Rec. 709 color space. Again, you've paid for performance that's squandered.

And then there is bit rate. Because we have to compress the living daylights out of everything in order to get things to play faster and on more compact formats, we must limit the bit rate (or transfer of data) to go along with such compression schemes. This limit is imposed by us and is not part and parcel of the HD "standard." Truth is, properly captured HD, even compressed, can achieve bit rates in excess of 150 Mbits per second, which is a big deal. More bits per second equals a sharper, less compressed image, or better yet, a proper HD experience.
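To see why the jump from 8-bit to 10-bit color matters so much, here's a quick back-of-the-envelope calculation (my own illustration, not figures from any spec document): two extra bits per channel quadruples the shades per channel, which multiplies out to a 64-fold increase in total colors.

```python
# How many distinct colors does each bit depth allow?
def colors_per_pixel(bits_per_channel: int) -> int:
    """Total RGB colors available at a given bits-per-channel depth."""
    levels = 2 ** bits_per_channel  # shades available per color channel
    return levels ** 3              # three channels: red, green, blue

# 8-bit (what Blu-ray delivers): 256 levels per channel
print(f"8-bit:  {colors_per_pixel(8):,} colors")   # 16,777,216
# 10-bit (what many modern panels can accept): 1,024 levels per channel
print(f"10-bit: {colors_per_pixel(10):,} colors")  # 1,073,741,824
```

Roughly 16.7 million colors versus 1.07 billion. That gap is where banding in smooth gradients (skies, shadows) comes from, and it's the headroom the author argues our displays already have but our sources never use.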
Instead, we're marching forward towards 4K. What for? We haven't even seen HD yet. The "pipe" needed to facilitate proper HD playback, let alone 4K, simply isn't robust enough. So the powers that be are going to fit what they can through the existing pipe and hope no one notices. And how could we? We've never seen the image properly displayed in the first place. So maybe it isn't goodbye HD; after all, you can't miss what you haven't met. But then again, if the only difference between our current HD world and our impending 4K one is resolution, we're not about to meet 4K either. Isn't progress great?
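To put some rough numbers behind the "pipe" argument (again, my own back-of-the-envelope sketch, not figures from the article): uncompressed 1080p already dwarfs what a Blu-ray disc or broadcast channel can carry, and 10-bit 4K multiplies the problem several times over.

```python
def raw_video_mbps(width: int, height: int, fps: float,
                   bits_per_channel: int,
                   samples_per_pixel: float = 1.5) -> float:
    """Uncompressed video bit rate in megabits per second.

    With 4:2:0 chroma subsampling, the color channels are stored at
    quarter resolution, averaging 1.5 samples per pixel overall.
    """
    bits_per_frame = width * height * samples_per_pixel * bits_per_channel
    return bits_per_frame * fps / 1e6

hd  = raw_video_mbps(1920, 1080, 24, 8)    # 8-bit 1080p at film's 24 fps
uhd = raw_video_mbps(3840, 2160, 24, 10)   # 10-bit consumer 4K (UHD)
print(f"1080p/24, 8-bit, raw:  {hd:.0f} Mbps")   # ~597 Mbps
print(f"2160p/24, 10-bit, raw: {uhd:.0f} Mbps")  # ~2,986 Mbps
```

So even plain HD needs roughly a 15:1 squeeze to fit a Blu-ray-class bit rate, and 10-bit 4K is about five times heavier still. If the delivery pipe stays the same size, the compression just gets more brutal, which is exactly the author's point.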