Dylan Seeger of Home Theater Review provides answers to common questions about Smart TVs, one of the most important purchases you will make.
On top of mentioning your price range, tell the salesperson how you will use the TV. For example, will you be gaming and watching a ton of sports? Or will you mostly be watching movies and TV shows?
They should also know a bit about your viewing environment. Will the room be filled with ambient light (or sunlight from windows), or do you prefer to watch in the dark whenever possible? All these factors play into finding a TV that’s right for you, as certain display technologies are better suited to handle some of these viewing variables.
These terms denote resolution, which defines how detailed an image can ultimately be. 4K – also known by its more technically correct name, Ultra HD – has a digital resolution of 3,840 pixels wide by 2,160 pixels tall. This means a 4K television has the ability to display an image with more than 8 million unique pixels.
8K doubles the vertical and horizontal resolution, ultimately quadrupling the total number of pixels over 4K. But because 8K is currently at the bleeding edge of consumer display technology, I wouldn’t worry too much about making sure your next television has 8K resolution. It will be quite some time before 8K video content becomes widely available.
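The pixel math is easy to verify. Here's a quick Python sketch using the standard consumer resolution figures:

```python
# Pixel counts for common consumer display resolutions (width x height).
resolutions = {
    "1080p HD": (1920, 1080),
    "4K Ultra HD": (3840, 2160),
    "8K": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 1080p HD: 2,073,600 pixels
# 4K Ultra HD: 8,294,400 pixels
# 8K: 33,177,600 pixels

# 8K doubles both dimensions of 4K, so it has exactly 4x the pixels.
assert 7680 * 4320 == 4 * (3840 * 2160)
```

As the numbers show, 4K crosses the 8-million-pixel mark, and 8K quadruples it.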
For most people, the answer to this question comes down to personal preference, but there is a bit of science involved, too. The general rule of thumb is: the farther your seated position is from the TV, the larger your TV should be.
Human visual acuity comes into play here. Depending on your TV's resolution, organizations that create video standards, such as THX and SMPTE, typically recommend that viewers with 20/20 vision sit roughly 1 to 1.5 screen widths from the TV to gain most of the benefits in image detail that HD and Ultra HD resolutions have to offer. (1.5 screen widths for a 65-inch TV would be roughly seven feet.) This seating distance may be impractical for a lot of viewing environments. But don't fret – these are just general guidelines. Even if you're not quite at the optimal viewing distance, you're still likely to see extra detail in the image as long as you're not unreasonably far away.
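That seven-foot figure follows from simple geometry. Here's a short Python sketch, assuming a standard 16:9 widescreen panel (TV sizes are quoted as the diagonal):

```python
import math

def screen_width_inches(diagonal_inches, aspect=(16, 9)):
    """Width of a screen given its diagonal size and aspect ratio."""
    w, h = aspect
    return diagonal_inches * w / math.hypot(w, h)

diag = 65                            # 65-inch TV
width = screen_width_inches(diag)    # about 56.7 inches wide
seating_ft = 1.5 * width / 12        # 1.5 screen widths, in feet
print(f"Screen width: {width:.1f} in, suggested seat: {seating_ft:.1f} ft")
```

Running this gives a suggested seating distance of about 7.1 feet, in line with the guideline above; swap in your own screen size to get a starting point for your room.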
You can get an equally impressive viewing experience wall-mounted or not. Many people choose to wall-mount their TV for aesthetic or space-saving reasons, not necessarily for the best viewing experience. The important thing when installing your TV is that wherever you decide to place it, it should be in a spot that isn't going to cause eye or neck strain. So, while a TV may look pretty hanging way up high above your fireplace, it might not be the most practical place to put it, as it might cause you discomfort when viewing it for long periods of time. Choose a place not too far away, not too high up, and not in a spot where the sun will be shining directly on it.
There are four consumer HDR formats currently in use: HDR10, HDR10+, Dolby Vision, and HLG. You might be tempted to buy a TV that supports all four, but I wouldn’t go out of your way or spend a ton of extra money on finding a TV that does. The fact remains that the vast majority of HDR content currently available is either in HDR10 or the more advanced HDR format, Dolby Vision. These are the two important HDR formats you’ll want to make sure your TV supports. And luckily most TVs do. HLG support is also fairly common, but this type of HDR is typically only used for live broadcast events, like sports. HDR10+ is not a widely used HDR format, and all indicators point to it remaining this way. So, don't worry too much if the TV you're looking to buy doesn't support this HDR format.
Remember the format war that pitted HD-DVD against Blu-ray back in the late-2000s? Well, something similar is happening now in the 4K era. But the format war has nothing to do with discs – it's about which enhanced HDR format will reign supreme.
HDR10 is the most basic HDR format available. Any display that claims to be HDR-compatible needs to support it. HDR10 was developed in conjunction with Dolby to act as the base HDR video format for Ultra HD Blu-ray. Dolby takes this base video and further enhances the image with its proprietary Dolby Vision HDR format through something called metadata. On Ultra HD Blu-ray, it works by injecting extra bits of predetermined data into the video signal to further enhance the HDR effect. And because Dolby Vision's HDR processing takes place inside the display, it can tailor the image to the real-world performance capabilities of any given display in terms of peak brightness and color, making Dolby Vision the preferable HDR format, if available.
HDR10+ was developed after Dolby Vision and in many ways it attempts to do exactly the same thing Dolby Vision does. But because HDR10+ was late to the game, it's having a much harder time gaining widespread adoption among Hollywood studios and streaming services, despite the HDR format being royalty-free, unlike Dolby Vision. Amazon Prime Video is currently the only major streaming service that supports HDR10+, and if more support for this format doesn't occur, it may end up like HD-DVD and eventually be abandoned.
HLG is a little different and will likely remain an HDR format in use no matter which enhanced format wins. That's because HLG is primarily used for broadcast television for content such as live sporting events and is compatible with both older SDR and newer HDR displays. This makes it an ideal format to broadcast in as it can work with a much wider variety of displays than any of the other HDR formats, thus reducing cost and simplifying the demands for live broadcast events. With that said, it's the least impressive form of HDR currently available to consumers.
HDMI 2.1 is the latest standard for digital video and audio connectivity. This next-generation interconnect standard increases the data throughput rate by more than 2.5 times that of the previous HDMI standard. This means consumers will be able to pass 8K resolution video at up to 60 frames per second and 4K resolution video at up to 120 frames per second over a single HDMI cable.
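That throughput claim checks out against the published maximums – 18 Gbps for HDMI 2.0 versus 48 Gbps for HDMI 2.1. A quick Python sketch:

```python
# Maximum link bandwidth of the two HDMI generations, in Gbps.
HDMI_2_0_GBPS = 18.0   # HDMI 2.0 TMDS maximum
HDMI_2_1_GBPS = 48.0   # HDMI 2.1 FRL maximum

ratio = HDMI_2_1_GBPS / HDMI_2_0_GBPS
print(f"HDMI 2.1 carries {ratio:.2f}x the data of HDMI 2.0")  # ~2.67x
```

That extra headroom is what makes 8K/60 and 4K/120 signals possible over a single cable.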
But it's not just this extra data throughput that's exciting. HDMI 2.1 adds a ton of new features that a lot of consumers will find useful, such as Variable Refresh Rate and Quick Frame Transport functionality that will ensure a smoother video gaming experience with less lag.
Updated eARC functionality now supports sending Dolby Atmos, DTS:X, and other object-based audio formats to compatible AV receivers from the TV itself or devices connected to it via HDMI. And Quick Media Switching functionality will eliminate annoying screen blackouts as your TV switches between content at varying resolutions and frame rates.
All in all, HDMI 2.1 is a big deal. So, if you're shopping for a new TV, it's probably a good idea to make sure whichever one you choose supports it.
If you’re a gamer, choosing the right TV is important. Input lag – or the time it takes for your display to accept, process, and ultimately display the image – varies widely between TV models. Choosing a TV with low input lag (or a TV with a low-lag picture mode, often designated as a Game mode) should be high up on your priority list if you're a gamer, as it will decrease the time it takes for your button presses and moves to be reflected in the on-screen action.
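To put input lag in perspective, here's a small Python sketch that converts lag in milliseconds into frames of delay at 60 fps. The lag figures below are hypothetical, purely for illustration – real values vary by model and picture mode:

```python
FRAME_TIME_60HZ_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

def lag_in_frames(input_lag_ms, frame_time_ms=FRAME_TIME_60HZ_MS):
    """How many 60 fps frames of delay a given input lag represents."""
    return input_lag_ms / frame_time_ms

# Hypothetical figures: a movie-oriented picture mode vs. a Game mode.
for label, lag_ms in [("Standard mode", 100), ("Game mode", 15)]:
    print(f"{label}: {lag_ms} ms = {lag_in_frames(lag_ms):.1f} frames behind")
```

In other words, a sluggish picture mode can leave the on-screen action several frames behind your controller, while a good Game mode keeps it under a single frame.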
Certain display technologies, such as OLED, also offer faster pixel response times than more conventional LCD-based televisions. Because of this, OLED TVs introduce next to no motion blur compared to LCD TVs, potentially giving gamers more visible motion resolution, which can be important for competitive play. So, opting for an OLED television might be a smart choice if you play a lot of games.
Gamers should also look for a TV with HDMI 2.1 inputs, as this next generation interconnect standard has added benefits for gamers, which are outlined above in the answer to another frequently asked question.
It's common for televisions to have at least one fairly accurate picture mode that requires little change from the default settings to get a great-looking image. These modes are often called something like Cinema, Movie, or Reference mode, but a quick Google search of your TV model should point you in the direction of a review that will tell you which mode is best.
As far as more customized calibration goes, you might be tempted to plug in someone else’s settings you’ve found posted online, but more often than not, these settings will do more harm than good, because two TVs sharing the same model number and screen size can still have slight variations in color accuracy and the like. If you’re a perfectionist looking for the most accurate image possible, hire a certified calibrator. But going with one of these fairly accurate out-of-the-box image modes should give you a good start, and you may find that you only need to slightly tweak your main settings (contrast, brightness, etc.) using a benchmark disc or test patterns found on Netflix.
It's also a good idea to make sure that certain image enhancement features are either switched off or turned down low. Many of them, like noise reduction, motion interpolation, image sharpening, and contrast enhancement, are set too high by default. Many of these features should be disabled if you're after an honest and accurate image.