Why You Need To Know About Gamma on Your HDTV

Published On: April 28, 2014
Last Updated on: October 31, 2020
  • Adrienne Maxwell is the former Managing Editor of HomeTheaterReview.com, Home Theater Magazine, and HDTVEtc.com. Adrienne has also written for Wirecutter, Home Entertainment Magazine, AVRev.com, ModernHomeTheater.com, and other top specialty audio/video publications. She is an ISF Level II-certified video calibrator who specializes in reviews of flat-panel HDTVs, front video projectors, video screens, video servers, and video source devices, both disc- and streaming-based.

If you've read a lot of TV reviews (or even just a few, actually), you've likely seen mention of a TV's gamma, a performance characteristic that helps determine the overall accuracy of the grayscale. Unless it's an entry-level model, your TV probably includes multiple gamma options in the Picture setup menu, with numeric choices that generally range from 1.8 to 2.6. What is gamma, what do those numbers mean, and which one is the right choice for your system? We're here to answer those questions for you.

The gamma curve dates back to the days of the CRT TV. If you imagine a graph showing the relation of light output (vertical axis) to input signal level (horizontal axis), the ideal result would be a straight diagonal line at a 45-degree angle extending from zero - i.e., 20 percent brightness at a 20 percent input signal level, 30 percent brightness at a 30 percent input signal level, etc. However, that's not how CRT TVs behaved; instead, they produced a nonlinear curve. According to the Imaging Science Foundation, a 50 percent input signal level produced only about 18 percent light output (which corresponds to a numerical gamma of 2.5). Content creators decided to compensate for this by incorporating the exact opposite curve into the source signal to produce perfectly linear output. That's why you'll often see it referred to as gamma correction.
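To make that relationship concrete, here's a minimal sketch (in Python, with illustrative values only) of the power-law behavior described above: light output is simply the input signal level raised to the power of gamma, so a 50 percent signal at a gamma of 2.5 lands at roughly 18 percent output.

```python
# Power-law gamma: normalized light output = (normalized input signal) ** gamma.
# Values are illustrative; real displays deviate from the ideal curve.

def light_output(signal: float, gamma: float) -> float:
    """Return normalized light output (0-1) for a normalized input signal (0-1)."""
    return signal ** gamma

# The CRT-era example from the article: 50% input at gamma 2.5.
print(f"{light_output(0.50, 2.5):.3f}")  # ~0.177, i.e., about 18% light output

# A perfectly linear display would be gamma 1.0: 50% in, 50% out.
print(f"{light_output(0.50, 1.0):.3f}")  # 0.500
```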

In today's digital world, TVs can offer linear output, but the gamma correction was baked into so much content that - like many early tricks of the trade - the system needed to carry over to the digital realm. Therefore, TV manufacturers had to build the same nonlinear gamma response into digital TVs to make them behave like CRTs. For much of the history of digital displays, 2.2 has been the target gamma for the TV to perfectly offset the content and create a linear output. As always, though, the system has evolved, and 2.2 is no longer considered the optimal gamma setting for every situation. The ISF continues to recommend 2.2 for TV watching in a dim viewing environment, but it recommends 2.4 for a completely dark room and 2.0 for a bright environment. How do these numbers alter what's being shown on the TV? Good question.
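As a rough sketch of how that offsetting works (assuming, for illustration, a simple 2.2 power law on both ends): the source applies a 1/2.2 "correction" curve, the display applies its 2.2 gamma, and the two cancel out to roughly linear light.

```python
# Sketch of gamma correction: the source encodes with the inverse curve (1/gamma),
# the display decodes with its gamma, and the round trip is approximately linear.
GAMMA = 2.2  # common display gamma target

def encode(linear: float, gamma: float = GAMMA) -> float:
    """Source-side gamma correction applied to linear scene light."""
    return linear ** (1.0 / gamma)

def display(signal: float, gamma: float = GAMMA) -> float:
    """Display-side gamma applied to the incoming signal."""
    return signal ** gamma

for scene_light in (0.1, 0.3, 0.5, 0.8):
    shown = display(encode(scene_light))
    print(f"scene {scene_light:.1f} -> shown {shown:.3f}")  # round trip matches the scene value
```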

The best way to describe how gamma affects picture quality is that gamma represents the level of brightness difference between each step in the grayscale, or how "fast" blacks get brighter. The human eye is much more sensitive to changes at the dark end than at the bright end, which is why a correct gamma setting is particularly important for darker film scenes. Picture a grayscale test pattern with reference black and white at each end, which are adjusted by the TV's brightness and contrast controls, respectively. Gamma affects the steps in between. A lower gamma number like 1.8 makes blacks get brighter faster, so the mid-blacks and grays will look lighter. A higher gamma number like 2.4 keeps blacks darker longer, so those same bars will look darker.
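Here's a small illustrative sketch (using hypothetical 10 percent grayscale steps) of how the same input steps land at different output levels depending on the gamma setting; note how much brighter the mid-grays are at 1.8 than at 2.4.

```python
# Compare where each grayscale step lands for different gamma settings.
# Inputs are the 10% steps of a grayscale ramp; outputs are normalized light.
steps = [i / 10 for i in range(11)]  # 0%, 10%, ..., 100%

for gamma in (1.8, 2.2, 2.4):
    outputs = " ".join(f"{s ** gamma:.2f}" for s in steps)
    print(f"gamma {gamma}: {outputs}")

# At 50% input: gamma 1.8 -> ~0.29, gamma 2.2 -> ~0.22, gamma 2.4 -> ~0.19,
# which is why a lower gamma number makes mid-grays look lighter.
```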

In a completely dark room, you want your blacks to look darker, as it helps to improve image contrast - but you don't want them to be so dark that you can't see the finest black details in the picture. Much like the problem of setting the brightness control too low to try to make blacks look darker, a gamma of 2.6 or higher is going to start hiding those fine details in dark scenes. Conversely, a gamma of 1.8 would cause the blacks to look grayish and may even expose too much low-level noise in the darker picture areas.
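To put rough numbers on that trade-off (an illustrative sketch only), here is what the darkest steps of the grayscale do at a few gamma settings; the higher the gamma, the closer the near-black steps sit to black, which is where fine shadow detail can start to disappear.

```python
# Near-black behavior at different gamma settings (illustrative values).
near_black_steps = (0.02, 0.05, 0.10)  # 2%, 5%, 10% input signal levels

for gamma in (1.8, 2.2, 2.6):
    row = ", ".join(f"{step:.0%} in -> {step ** gamma:.4f} out" for step in near_black_steps)
    print(f"gamma {gamma}: {row}")

# At gamma 2.6, a 5% input produces well under 0.1% light output; on many
# displays those steps become indistinguishable from black, crushing shadow detail.
```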

For daytime viewing in a well-lit room, it's less crucial that your blacks stay darker longer, and the TV can benefit from the increased brightness that a lower gamma number can provide in the grays and whites. That's why a lower number of 2.0 is acceptable in these viewing conditions.

Here's something to keep in mind, though. Just because you select your TV's 2.4 gamma option doesn't mean you're necessarily getting a 2.4 gamma. The 2.4 option could be too dark (2.6) or too light (2.2) across the board, or there could be significant peaks and dips along the way (local dimming, when enabled in an LED/LCD, can skew the gamma results). That's one of the things we look at when measuring a TV with a meter. The SpectraCal CalMAN software I use during my testing process allows you to select a target gamma, and the software calculates the grayscale Delta Error based in part on how close the measured gamma is to the target I've selected. Since 2.2 is no longer the de facto choice for all situations, reviewers may set different targets. I, for one, set a target of 2.4 when I review a projector, where the priority is dark-room performance. I use 2.2 for TVs because a dim-to-moderate viewing environment is what the vast majority of people have.
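For the curious, this is roughly how a measured gamma number is derived (a simplified sketch with hypothetical meter readings, not the actual CalMAN math): at each grayscale step, you compare the measured light output, normalized to peak white, against the input level, and solve for the exponent.

```python
import math

# Hypothetical meter readings (cd/m^2) at each grayscale stimulus level.
# Real measurement software also accounts for the display's black level.
measurements = {0.2: 4.2, 0.4: 19.5, 0.6: 49.0, 0.8: 95.0, 1.0: 160.0}
peak_white = measurements[1.0]

for stimulus, luminance in sorted(measurements.items()):
    if stimulus == 1.0:
        continue
    # Solve luminance/peak = stimulus ** gamma for gamma.
    gamma = math.log(luminance / peak_white) / math.log(stimulus)
    print(f"{stimulus:.0%} stimulus: effective gamma {gamma:.2f}")
```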

But wait, there's more to the story. As I said, the system is always evolving, and the International Telecommunication Union (ITU), one of the standards bodies that define the specs for the broadcast industry, adopted a new gamma spec back in 2011 called BT.1886. How is BT.1886 different from the system we've been using (which is based on a simple power function)? The simplest explanation is that, in the power system, gamma is based on an absolutely perfect, zero-luminance black level, which most TVs can't actually achieve. BT.1886 factors in the black level that the TV is actually able to achieve and adjusts gamma within that scope, which results in more noticeable differences at the darker end of the spectrum - i.e., if the TV meets the BT.1886 spec, you can more clearly see the distinct steps between blacks, even if the TV's overall black level is just average. By the way, the ITU now uses the term "Electro-Optical Transfer Function" (or EOTF) instead of the term gamma, but they refer to the same thing.
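For readers who want to see that difference spelled out, here's a minimal sketch of the BT.1886 EOTF formula, using hypothetical peak-white (Lw) and black-level (Lb) values; unlike the pure power function, the display's measured black level is part of the math.

```python
# ITU-R BT.1886 EOTF: L = a * max(V + b, 0) ** 2.4, where a and b are derived
# from the display's measured peak white (Lw) and black level (Lb) in cd/m^2.

def bt1886(V: float, Lw: float = 120.0, Lb: float = 0.05, gamma: float = 2.4) -> float:
    """Return luminance (cd/m^2) for a normalized video signal V in [0, 1]."""
    a = (Lw ** (1 / gamma) - Lb ** (1 / gamma)) ** gamma
    b = Lb ** (1 / gamma) / (Lw ** (1 / gamma) - Lb ** (1 / gamma))
    return a * max(V + b, 0.0) ** gamma

# Compare near-black steps against a pure 2.4 power law on the same display;
# BT.1886 keeps the darkest steps visibly separated instead of crushed to black.
for V in (0.05, 0.10, 0.20):
    power_law = 120.0 * V ** 2.4
    print(f"V={V:.2f}: BT.1886 {bt1886(V):.3f} cd/m^2 vs power law {power_law:.3f} cd/m^2")
```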

Even though the BT.1886 spec has been adopted, the process of implementing it in both studio and consumer displays is just starting to gain some momentum. As an example, the Professional modes on Panasonic's new high-end LED/LCDs reportedly will default to BT.1886, and we expect to see more studio monitors doing the same. The fact that BT.1886 is not yet universal raises the question: which gamma target should calibrators and reviewers use when analyzing a display? SpectraCal has officially embraced BT.1886 as the default gamma target for Rec 709 HD displays in the CalMAN software, but I wouldn't say it has become the standard choice for all reviewers and calibrators to use. Chris Heinonen over at ReferenceHomeTheater.com recently polled some big names in the industry to find out what gamma target they use, and there was certainly no consensus, precisely because gamma can be so room- and TV-dependent. What matters most, I suppose, is that each reviewer defines a methodology and follows it with all review samples. At least for the remainder of this year, I'm sticking with my current methodology (a gamma target of 2.4 for projectors and 2.2 for TVs) and will revisit the topic at the end of 2014.

What does all this mean for you, the end user? We hope this gives you a better understanding of what gamma is and how your TV's gamma control affects picture quality. If you want to make sure your TV's gamma is set to best match your viewing environment, we of course recommend that you have the TV professionally calibrated. If you're not going to do that, however, here's our DIY suggestion. First, turn on your TV in the most common lighting condition in which you watch television and movies (i.e., a dark, dim, or bright room), and correctly set the TV's brightness and contrast using a test disc like Disney's WOW or DVE HD Basics. Then grab a few movies with good black-level demo scenes and experiment with your TV's different gamma settings to find the one that creates the best combination of black level, black detail, and brightness in your environment. If you use different picture modes for bright- and dark-room viewing, then follow these steps for each mode under those lighting conditions. At the end of the day, this simple process is another way to ensure that you're getting the best performance out of your particular display in your particular room.
