
High Dynamic Range, explained: There’s a reason to finally get a new TV

HDR noticeably improves content that looks washed out or flat on standard screens.

Sam Machkovech
In 1993, TV—and TV writing—were much different entities than what we know today. Credit: Tony Young

Ever since the HDTV standard emerged in the mid-'00s, screen producers have struggled to come up with new standards that feel anywhere near as impressive. That's been a tough sell, as no baseline image standard has yet surpassed the quality jump from CRT sets to clearer panels with 1080p resolution support.

3D content came and went, with its unpopularity owing to a few factors (aversion to glasses, hard-to-find content). The higher-res 4K standard is holding up a little better, but its jump in quality just doesn't move the needle for average viewers—and certainly not those sticking to modestly sized screens.

But there's another standard that you may have heard about—high dynamic range, or HDR. It's a weird one. HDTV, 3D, and 4K have all been easy to quickly and accurately describe for newcomers ("more pixels," "one image per eye," etc.), but HDR's different. Ask an average TV salesperson what HDR is, and you'll usually get a vague response with adjectives like "brighter" and "more colorful." Brighter and more colorful than what, exactly?

Yet HDR may very well be the most impactful addition to modern TV sets since they made the 1080p jump. Images are brighter and more colorful, yes, and in ways that are unmistakable even to the untrained eye. Content and hardware providers all know it, and they've all begun cranking out a wow-worthy HDR ecosystem. The HDR difference is here, and with this much stuff on the market, it's officially affordable (though certainly not bargain-bin priced yet).

If HDR still has you (or your local retailer) stumped, fear not. Today, we're breaking down the basics of high dynamic range screens: what exactly differentiates them, how good they are, and whether now is the time to make the HDR leap. And as a bonus, we'll answer a bunch of questions about various screens and compatible content along the way.

It’s not just sheer brightness

High dynamic range boils down to a few important factors, and they're all intertwined: luminance, color gamut, and color depth.

When it comes to luminance, there's something worth clarifying right away: HDR screens don't necessarily win out just by being really, really bright. What's important is the range of luminance, from the darkest dark to the whitest white on a screen.

Modern LED screens suffer because their pixels are backlit, which means they have struggled to display the kind of deep, dark blacks that would make Nigel Tufnel drool. That's one reason plasma TV set owners have held tightly onto their old sets, especially now that the black-friendly plasma standard isn't being produced by any of the big manufacturers. Where an HDR set helps is by rendering many more steps of luminance between those extremes. That could mean an incredibly bright LCD TV or a not-as-bright OLED TV that displays such deep blacks that its luminance range is still off the charts. (For a very, very deep dive into OLED technology, read my pieces here and here.)

If you transmit video (on disc, game, or streaming service) via the current, industry-wide HDTV standard, you're capped at a luminance maximum of around 100 nits. Your screen may be brighter than that, but this is where the current standard really stinks: the signal sends its luminance information as a percentage, not an absolute luminance value. It's up to your set to translate that percentage, and the results can look, quite frankly, pretty awful. This is how viewers get blown-out colors and other glaring inaccuracies.

New HDR standards not only jack up a pixel's luminance maximum but also encode the value as a specific number rather than a percentage (the encoding math is sketched below). That's the first step to higher color quality on your fancy TV screen. Updating the luminance differential also updates a screen's color gamut. Dolby's engineers explain how:

The problem with restricting maximum brightness to 100 nits (as in TV and Blu-ray) is that the brighter the color, the closer it becomes to white, so bright colors become less saturated. For example, the brightest saturated blue on an ordinary display is a mere 7 nits, so a blue sky will never be as bright and saturated as it should be.
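The encoding behind that "specific number" is the SMPTE ST 2084 "perceptual quantizer" (PQ) curve, which HDR-10 uses to map each stored code value to an absolute brightness in nits. Here's a minimal Python sketch of the decoding math, purely for illustration; a real TV does this in silicon, not in a script:

```python
# Minimal sketch of the SMPTE ST 2084 "PQ" transfer function used by HDR-10.
# Unlike SDR's relative signal, a PQ code value decodes to an absolute
# luminance in nits (candelas per square meter).

def pq_to_nits(signal: float) -> float:
    """Decode a normalized PQ signal (0.0-1.0) to absolute luminance in nits."""
    m1 = 2610 / 16384           # constants defined by SMPTE ST 2084
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10_000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_to_nits(1.0))   # 10000.0 -- the format's theoretical ceiling
print(pq_to_nits(0.5))   # ~92 nits -- roughly where today's SDR content tops out
```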

With more quantifiable steps in luminance comes more wiggle room for displaying ranges of saturated colors. This is actually different from a screen's color depth, which is typically described as a bit count for color reproduction: 8-bit, 10-bit, 24-bit, and so on.

Smaller numbers describe the bits per color channel, and since pixels light up with a combination of red, green, and blue data, the bit count of the three channels combined is usually used to describe overall color quality. HDR jumps up from the HDTV standard, which transmits 8 bits of data per color. In practical terms, that's 8 for red, 8 for green, and 8 for blue, or the shorthand phrase "24-bit color." At this level, each color channel carries a value from 0 to 255; combine the 256 possible shades of each of the three channels and you get a total of about 16.78 million colors.

Mmm, banding. Credit: Aurich Lawson

For decades, screen makers have felt like that was a large enough range, but higher resolutions and less CRT-related blurring have made the biggest drawback of limited color depth quite evident: banding. Look at the image above. You've probably seen stretches of a single color on a screen just like this in a movie or TV show, where the screen isn't receiving enough granular color data to fade a color naturally. 10-bit color gets us to a total color range of 1.07 billion colors (1024 per individual primary color).
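The arithmetic behind those totals is straightforward; a quick sketch shows how fast the count grows with each extra bit per channel (12-bit is included here because it comes up again with Dolby Vision below):

```python
# Shades per primary color, cubed across red, green, and blue.
for bits in (8, 10, 12):
    per_channel = 2 ** bits
    total = per_channel ** 3
    print(f"{bits}-bit: {per_channel:,} steps per primary, {total:,} total colors")

# 8-bit: 256 steps per primary, 16,777,216 total colors
# 10-bit: 1,024 steps per primary, 1,073,741,824 total colors
# 12-bit: 4,096 steps per primary, 68,719,476,736 total colors
```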

There's a difference between these two color properties we're talking about. Higher color depth means less banding. Wider color gamut means more natural color representation. The latter is particularly noticeable when looking at explosive jumps in color, like a shiny red sports car or a burst of orange flame. After all, that jump in luminance doesn't mean much if we're only watching content in grayscale (though pure whites and blacks certainly benefit, as well).

Here comes another format war

With those three properties in mind, we can explore the two major HDR-related standards that have begun making their way into consumer-level electronics: HDR-10 and Dolby Vision.

It is impossible to convey the HDR difference on an SDR screen, because HDR's boosts require compatible panels. This mock-up simulates some of the effect by reducing color gamut on one side, but one of the big differences is that HDR screens don't have to add an unnatural glow around a bright point to make it look "bright." Instead, in HDR, something like the sun here has its brightness contained solely within its radius; the natural brightness of the display, and contrast with other pixels right next to it, creates a natural glow effect.

The standards have a few things in common, including support for 10-bit color depth, a jump to the Rec.2020 color gamut standard, and luminance levels far beyond SDR's roughly 100-nit cap. (Current HDR-capable displays support roughly 65-75 percent of the Rec.2020 spectrum; they're more closely tuned to the DCI-P3 color gamut standard, which is still far wider than the Rec.709 gamut found in standard HDTV content.)
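For a rough sense of how those gamuts compare, you can measure the triangles their red, green, and blue primaries trace on the CIE 1931 chromaticity chart. This sketch uses the published primaries and the shoelace formula for triangle area; the xy chart isn't perceptually uniform, so treat the ratios as ballpark figures only:

```python
# Compare the footprint of each gamut's RGB primaries on the CIE 1931 xy chart.
PRIMARIES = {
    "Rec.709 (HDTV)": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":         [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020 (UHD)": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    """Shoelace formula for the area of the triangle the primaries enclose."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
for name, area in areas.items():
    print(f"{name}: {area / areas['Rec.2020 (UHD)']:.0%} of Rec.2020's footprint")

# Rec.709 covers roughly half of Rec.2020's area on this chart;
# DCI-P3 covers roughly 70 percent of it.
```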

Dolby Vision is technically the more ambitious format because it additionally supports 12-bit color depth and dynamic metadata. The former will, among other things, obliterate any trace of color banding—which you still might notice on images with 10-bit color depth. The latter allows a video source to refresh its baseline color and luminosity levels at any time.

These specific upgrades will pay off on consumer-grade displays to come, but their perceptible bonuses are scant in the current market. As displays creep up into luminance differentials of 2,000 nits and beyond, that dynamic metadata will allow video sources to swap out baseline metadata to better suit a pitch-black look into a starry sky; an outdoor, desert scene; or whatever high-octane sequence comes next. As luminance ranges grow, so will filmmakers' desire to control them more granularly, and Dolby Vision has set itself up for that payoff.
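To make the static-versus-dynamic distinction concrete, here's an illustrative sketch; the field names are simplified stand-ins, not any real decoder API. HDR-10 ships one static description of the whole title (SMPTE ST 2086 mastering-display values plus MaxCLL/MaxFALL light levels), while dynamic metadata can re-describe the picture scene by scene:

```python
from dataclasses import dataclass

@dataclass
class StaticHDRMetadata:
    """One record for the entire title, as in HDR-10."""
    mastering_peak_nits: float   # brightest level the colorist mastered to
    mastering_black_nits: float
    max_cll: float               # brightest single pixel anywhere in the title
    max_fall: float              # brightest frame-average anywhere in the title

@dataclass
class SceneMetadata:
    """A fresh record per scene, in the spirit of dynamic metadata."""
    start_frame: int
    scene_peak_nits: float
    scene_average_nits: float

# Static: the TV tone-maps the whole film against one set of numbers.
hdr10_style = StaticHDRMetadata(4000.0, 0.005, 1600.0, 400.0)

# Dynamic: the starry-sky scene and the desert chase each get their own numbers.
dynamic_style = [
    SceneMetadata(start_frame=0, scene_peak_nits=120.0, scene_average_nits=8.0),
    SceneMetadata(start_frame=5400, scene_peak_nits=3800.0, scene_average_nits=900.0),
]
```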

But current high-end consumer displays aren't there yet in terms of luminance differentials, and that makes the Dolby Vision-specific payoff that much harder to perceive compared to what HDR-10 delivers on current screens. Plus, Dolby's standard requires a certification process and proprietary chips for both screens and media devices, which isn't going to help Dolby win this emerging HDR format war. Right now, some streaming apps, like Vudu and Netflix, support Dolby Vision, but many apps, all high-end game consoles, and most HDR Blu-rays opt for the HDR-10 standard.

For now, just remember: if you buy a set that includes Dolby Vision support, it also supports HDR-10, but not necessarily the other way around.

Annoyingly, you won't find a clearly marked "HDR-10" logo anywhere on modern HDR sets. Instead, different set manufacturers are adopting different logos. The most common one is "Ultra HD Premium," which combines 4K resolution (3840x2160, or four times as many pixels as a 1080p display) with the HDR-10 spec's requirements for luminance range, color gamut, and color depth. These sets have all been "UHD Alliance certified," and some set manufacturers, including Sony, would rather not pay for the certification.
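For reference, the UHD Alliance's published "Ultra HD Premium" bar can be boiled down to a simple check. The figures below are the thresholds the Alliance announced in early 2016; the function itself is just an illustration:

```python
def meets_ultra_hd_premium(width, height, bit_depth, p3_coverage,
                           peak_nits, black_nits):
    """Rough sketch of the UHD Alliance's published certification thresholds."""
    resolution_ok = width >= 3840 and height >= 2160
    color_ok = bit_depth >= 10 and p3_coverage >= 0.90   # >90% of DCI-P3
    # Two dynamic-range tiers: one suits bright LCDs, one suits deep-black OLEDs.
    lcd_tier = peak_nits >= 1000 and black_nits <= 0.05
    oled_tier = peak_nits >= 540 and black_nits <= 0.0005
    return resolution_ok and color_ok and (lcd_tier or oled_tier)

# A hypothetical OLED: modest peak brightness, essentially perfect blacks.
print(meets_ultra_hd_premium(3840, 2160, 10, 0.96, 700, 0.0))    # True
# A hypothetical edge-lit LCD: plenty bright, but its blacks give it away.
print(meets_ultra_hd_premium(3840, 2160, 10, 0.92, 1100, 0.10))  # False
```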

HDR content, and how well a TV set or monitor reads and renders it, is a little harder to appreciate at a fluorescence-soaked big-box retailer. That's why those certifications are important in HDR's early goings.

“I can barely type this without visibly shaking”

I found this out the hard way when testing my first "HDR" TV set: the Le-Eco Super4 X55.

Le-Eco was kind enough to send a loaner set for my initial HDR testing, and technically, the set recognizes and loads HDR content whether fed from its internal Google TV-powered apps or from HDMI 2.0a sources. (HDR-10 requires the HDMI 2.0a standard to send and receive the extra luminance and color metadata.) When compared directly to an "SDR" (standard dynamic range) set, the difference in color and brightness range is perceptible.

No HDR-related stickers appear on the Super4 X55's box, and once I finally had an Ultra HD Premium-certified TV in my testing office, I saw the difference. I didn't slack off with my choice of a certified set, by the way. I opted for the 55" LG B6 OLED TV, which has been lauded by the screen-review wizards at RTings as the finest affordable HDR-10 set on the market right now. (Yeah, go ahead and throw scare quotes on "affordable" there. Even on sale, I still had to shell out $1,799 for this beast.)

Again, we'll eventually cover the OLED-specific stuff in this set, but for now I can say that HDR effects absolutely pop on it. The luminance differential from minimum to maximum is crucial for HDR performance, and the reason becomes apparent when you watch HDR-10 content that emphasizes it.

My favorite example from my testing period was Mad Max: Fury Road, which I purchased as a UHD Blu-ray. You need a compatible Blu-ray player (I used the relatively new Xbox One S console) to enjoy the format's 4K resolution and HDR-10 metadata on a compatible screen. Fury Road isn't exactly a 4K showcase film, thanks in part to its final master only being rendered at "2K" resolution (2048x1080), but we're looking for HDR content, anyway.

In some ways, the HDR impact became noticeable only after comparing playback on various screens: the LG B6, the Le-Eco Super4 X55, and a 2014 Vizio E-series LCD set (that one's SDR). Color differentials in bright blue skies and orange-white sands held their integrity far better on the certified set, but other friends who were watching were more willing to take or leave the image boosts seen in average scenes.

On the other hand, we didn't need a comparison to start drooling at scenes with more dynamic and explosive bursts of light, color, and darkness. Tom Hardy's early dash through a watery, underground lair, set off with dramatic sunlight shafts, immediately looked more thrilling. And—gosh, I can barely type this without visibly shaking—the film's first high-speed truck drive through a sandstorm, complete with purple-black skies, whipping clouds of sand, and explosive lightning shafts, looked better than I remembered from the theater version. I saw Fury Road at Seattle's famed Cinerama theater, which had just been renovated with the latest in digital projection technology, and yet I felt like this version looked more exciting and dramatic.

Perhaps it's because I was sitting comfortably in my own home, or some other placebo effect, but my point stands: when your screen and media source are shaking HDR-10 hands, the highest-quality moments will blow every viewer away. Even in average scenes, the increase in luminance and color data still easily stands out.

An ecosystem of devices in a standard’s early days

It's still HDR's early days, so that handshake can be anything from a nuisance to a full-on pain in the butt.

Amazon Video currently streams a lot of HDR-10 content, including The Grand Tour, which really shines in HDR. After all, that show's all about sexy cars driving in exotic locales, and seeing a mid-day sun shining directly on a freshly waxed Porsche is the kind of thing HDR was made for. (As other car-obsessed HDR fans will tell you, some car colors can't be replicated on SDR screens.) However, Amazon Video cannot currently be loaded on any Google product, including pre-installed Google TV apps and the HDR-compatible Chromecast Ultra. Weirdly, Amazon has yet to release a version of its Fire TV box with HDR support, which means users have to go outside Amazon's hardware ecosystem to access some of its flashiest content.

As of press time, the best device I've found for Amazon Video HDR-10 content is the Roku Premiere, which is a lot smarter about waiting for a full 4K signal to buffer before starting playback. This $99 box also does a remarkable job playing Netflix's HDR-10 content, although Netflix apps tend to run equally well across the device and smart TV spectrum. (I'm not sure that you can buy an HDR-equipped TV without Netflix installed, to be honest.)

Meanwhile, if you want to watch YouTube's HDR-10 content—which debuted last month and is currently limited to a whopping four compatible videos—your options are limited to select UHD Blu-ray players and the $70 Chromecast Ultra. Google TV versions of the app don't support this yet, and while the Xbox One S's YouTube app is advertised as compatible, its HDR-10 playback wasn't working as of press time. (The Xbox One S also doesn't yet play back Amazon's HDR content.) Once more HDR videos appear on YouTube, we hope HDR compatibility proliferates to devices such as Roku, WebOS TVs, and Google's other devices.

Video games have also begun dipping their toes into the HDR realm, and you may already own an HDR-10 compatible device. Sony shipped the PlayStation 4 with an HDR-10 compatible HDMI controller all the way back in 2013, before the required HDMI 2.0a spec had been finalized. If you connect that console to an HDR-10 screen, you're good to go, at least with compatible titles. While they'll only render in 1080p resolution, as mentioned above, the HDR effect is sometimes the more noticeable boost, anyway.

The new PlayStation 4 Pro console also includes HDR-10 support, along with support for 4K resolutions in compatible games and apps. The Xbox line didn't get the same automatic upgrade for older consoles. You'll have to purchase an Xbox One S to enjoy that system's HDR-10 boosts in compatible games and apps. (FYI, it also lacks 4K support in games; only apps get that boost.) However, the Xbox One S has a huge HDR-10 benefit in the form of UHD Blu-ray support, which no PlayStation system supports. At $300 (or less, depending on discounts and pack-in game offers), it's pretty much the best HDR-10 disc-playing option in town.

How console games implement HDR-10 effects really varies on a case-by-case basis at this point. Some games' contrast and color effects are underwhelming, and a few drop the ball. In particular, new PlayStation game The Last Guardian has very visible color banding when running in HDR-10 mode. Xbox exclusive Gears of War 4, on the other hand, pops anew in HDR-10 mode. Should you want to impress a friend, load one of that game's nighttime sequences, full of moonlight bouncing off of puddles and character details exploding in dark passages.

The biggest issue for HDR gaming at this point is input lag, or how long it takes for button presses to register on a screen. Depending on your set, HDR mode can introduce an extra 10-20 milliseconds of input lag, even with "game mode" enabled.
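Framed in gameplay terms, that penalty is roughly an extra frame of delay at 60 frames per second. (The 25-millisecond baseline below is a hypothetical game-mode figure, not a measurement of any particular set.)

```python
# Convert milliseconds of lag into frames at 60 fps.
frame_ms = 1000 / 60                       # about 16.7 ms per frame
baseline_ms = 25                           # hypothetical game-mode lag
for hdr_penalty_ms in (10, 20):
    total = baseline_ms + hdr_penalty_ms
    print(f"{total} ms of lag = about {total / frame_ms:.1f} frames")

# 35 ms of lag = about 2.1 frames
# 45 ms of lag = about 2.7 frames
```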

Some of Gears of War 4's set pieces look incredible in real time.

Is it HDR time yet?

That alone may be reason enough to wait for your own HDR upgrade. Some existing HDR-10 sets, including Samsung's line, keep the lag down to roughly 22 milliseconds at their most optimized HDR gaming modes; others need HDR turned off to keep input lag to a minimum. We'll be watching to see if the next wave of HDR sets can drop their brighter, more colorful modes' lag to twitch-shooter levels, or whether LG's promises of firmware updates for this year's line come to pass. (LG has updated firmware for some of its 2016 sets already, but not all of them.)

The bigger reason to wait may be current sets' luminosity differential maxing out just shy of the 1,000-nit range. That is certainly a noticeable and beautiful difference compared to the current SDR limit of roughly 100 nits, but HDR-10 content tends to be mastered in the 1,200-nit range—and can be mastered to take advantage of 4,000-nit luminosity.
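That gap is why tone mapping matters: a set that tops out around 700-1,000 nits has to squeeze highlights mastered toward 4,000 nits into whatever headroom it actually has. The sketch below is a generic soft roll-off for illustration only, not the algorithm any particular manufacturer uses; the display peak and knee point are made-up parameters.

```python
def tone_map(source_nits, display_peak=700.0, knee=0.75):
    """Track the source faithfully up to a knee, then roll off toward the panel's peak."""
    knee_nits = display_peak * knee
    if source_nits <= knee_nits:
        return source_nits                 # faithful reproduction below the knee
    # Compress everything above the knee into the remaining headroom
    # instead of clipping it to the panel's maximum.
    headroom = display_peak - knee_nits
    excess = source_nits - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

for nits in (100, 500, 1000, 4000):
    print(f"{nits:>5} nits mastered -> {tone_map(nits):.0f} nits displayed")

#   100 nits mastered -> 100 nits displayed
#   500 nits mastered -> 500 nits displayed
#  1000 nits mastered -> 653 nits displayed
#  4000 nits mastered -> 692 nits displayed
```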

We're not so crazy as to assume that a 4,000-nit screen will arrive at an affordable price point in the next 2-3 years. But a differential boost in the 300-500 nit range seems likely to hit more affordable screens in that span of time, and it'll be appreciable, especially on screens that don't enjoy the pure-black benefits of OLED.

Holding out for higher-rated HDR displays, the kind equipped to enjoy the boosts afforded by the Dolby Vision spec, is a fool's errand at this point as well. That spec may win the format war, but HDR-10 is winning the current battle, and today's HDR content is being created, mastered, and distributed mostly with HDR-10 in mind.

And just to be clear, we wouldn't blame you for pulling the trigger right now. Certified Ultra HD Premium sets are on sale and up to visual snuff, with entry-level 50" models in the $900 range as of press time. The next wave of improved sets may not trickle out of next month's Consumer Electronics Show and into shops for quite some time. And while there's less compatible content than even the 4K standard offers, the numbers are ramping up, thanks particularly to game systems. You will definitely, absolutely want to do your homework when picking an HDR set, though this year's lineup of up-to-snuff displays comes courtesy of only a few manufacturers: Samsung, LG, Vizio, and Sony. (Sony has opted to skip Ultra HD Premium certification, but independent tests have confirmed that its "HDR" displays do play nicely with the HDR-10 standard.)

HDR immediately and noticeably improves the kind of content that looks washed out or flat on SDR screens. Dark caves, sun-soaked beaches, and high-contrast scenarios look amazing when powered by HDR-10 metadata and translated properly on a compatible screen, even without an SDR screen nearby to compare to. Quality HDR-10 content currently exists, so HDR-10 set owners have a decent amount to watch right now. Because modern 4K/8K cameras can easily capture the dynamic range involved, and the required metadata barely affects the total bandwidth hit for sources like streaming-media apps, we expect compatible content to explode in availability this coming year, too. Buy now or buy soon, but either way, it's HDR time.
