High-dynamic-range (HDR) video is taking off in a big way. Some of your favorite movies are already available with enhanced color and brightness, and look even better than they did in their original theatrical releases.

But some remasters have caused critics to cry foul, igniting a debate around technical capability and artistic intent.

What Are the Benefits of HDR?

Before we consider whether the term "fake HDR" is even warranted, it's important to understand what HDR video is. As its name implies, high-dynamic-range video has an increased dynamic range compared to standard-dynamic-range (SDR) content.

Dynamic range is the amount of information visible in an image or video between the brightest highlights and the deepest shadows. Alongside that expanded luminance range, HDR video uses the wider Rec. 2020 color space, which covers around 75% of the visible color spectrum. That's a big step up from the Rec. 709 standard used in SDR content, which covers only around 36%.

This means more color information is visible on-screen, which is closer to what we would see in real life. Having more shades of a particular color also makes unsightly "banding" in gradients less prominent. The difference is most visible in fine details, like clouds or other areas with subtle color variations.
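If you're curious where those coverage numbers come from, they're based on how much of the CIE chromaticity diagram each format's red, green, and blue primaries enclose. Here's a rough Python sketch using the published primary coordinates; it compares the raw triangle areas only, so the ratio it prints won't exactly match the visible-spectrum percentages quoted above:

```python
# Rough comparison of Rec. 709 vs. Rec. 2020 gamut size on the CIE 1931 xy diagram.
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B primaries (x, y)
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B primaries (x, y)

def triangle_area(points):
    """Shoelace formula for the area of the triangle formed by three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(REC_2020) / triangle_area(REC_709)
print(f"Rec. 2020 spans roughly {ratio:.1f}x the chromaticity area of Rec. 709")
```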

HDR also increases luminance, or peak brightness. The vast majority of HDR-capable TVs come with the basic HDR10 standard built in. HDR10 content is typically mastered at 1,000 nits, as opposed to the traditional 100 nits (recently revised to around 200) for standard-dynamic-range content.

This means bright objects, like the sun, a flashlight, or gunfire, can really pop when viewed on an HDR-capable display. The additional brightness makes elements like these look much closer to how they would in real life, creating a more immersive viewing experience.
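Those nit values aren't arbitrary: HDR10 encodes brightness with the SMPTE ST 2084 "PQ" transfer function, which maps a 10-bit code value to an absolute luminance. Here's a minimal Python sketch of that formula (it uses the published PQ constants, but treats code values as full range for simplicity, ignoring the limited-range levels real video signals use):

```python
# SMPTE ST 2084 (PQ) EOTF: converts a normalized signal value to absolute luminance.
M1 = 2610 / 16384        # PQ constants as published in ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """Map a normalized PQ signal (0.0-1.0) to display luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Treating 10-bit code values as full range (0-1023) for simplicity:
# ~520 lands near SDR's 100-nit reference, ~769 near the 1,000-nit HDR10 target.
for code in (520, 769, 1023):
    print(f"10-bit code {code}: ~{pq_to_nits(code / 1023):,.0f} nits")
```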

HDR video is something you have to see to truly appreciate, but its improvement over SDR can be vast.

Related: HDR Format Wars: What's the Difference Between HDR10 and Dolby Vision?

What Is "Fake HDR"?

The term "Fake HDR" has been thrown around YouTube, Reddit, and other platforms in the wake of a few high-profile Blu-ray releases. It refers to the reluctance of studios to grade their HDR productions to sufficient peak brightness and make the images pop.

According to Vincent Teoh, a professional display calibrator and reviewer, the 4K Blu-ray of Star Wars: The Last Jedi hits a maximum peak brightness of 250 nits, with the sun being graded at only 200.

Teoh also found that the Blade Runner 2049 4K Blu-ray barely rises above 200 nits, making it "an SDR movie in an HDR container."

These HDR releases still use a 10-bit (or, in some cases, 12-bit) color depth, so they deliver a better-quality image than SDR. However, because they lack the flashes of peak brightness seen in many other productions, some perceive them as "fake HDR."

As another reference, a super-bright LCD, like the Vizio P-Series Quantum X, can hit a peak brightness of well over 2,000 nits. Even LG's relatively "dim" OLED panels manage around 700 nits. Some reviewers and Blu-ray collectors feel these "fake HDR" releases have been hamstrung by underwhelming peak brightness.

This doesn't mean a film looks bad; the image just doesn't "leap off" the screen the way it does in other releases. As these are major releases from some of Hollywood's biggest studios, it's clear the colorists and directors know exactly what they're doing. The restrained use of HDR's brightness is a deliberate choice.

Whether this validates the term "fake HDR" remains a matter of opinion, though. Blu-ray packaging doesn't include any info about peak luminance, and most buyers wouldn't understand the terminology anyway.

So, movie fans have to rely on reviewers like Teoh, who have access to HDR mastering tools, to get the whole story.
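Broadly speaking, those measurements involve decoding the disc's video and checking how bright its pixels actually get. The sketch below is only illustrative; it assumes you already have frames converted to linear-light nit values, whereas real analysis tools like the ones Teoh uses work on the PQ-encoded stream and its metadata:

```python
import numpy as np

def measure_brightness(frames):
    """Compute MaxCLL/MaxFALL-style stats from frames of per-pixel luminance in nits.

    MaxCLL is the brightest single pixel anywhere in the content; MaxFALL is the
    highest frame-average light level across the whole film.
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Illustrative only: a 4K frame that's mostly 50 nits, with a highlight capped at 200.
frame = np.full((2160, 3840), 50.0)
frame[:200, :200] = 200.0
max_cll, max_fall = measure_brightness([frame])
print(f"MaxCLL: {max_cll:.0f} nits, MaxFALL: {max_fall:.0f} nits")
```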

HDR Standards and Creative Intent

Two factors have contributed to the situation we covered above: the technical limitations of modern displays, and creative intent.

HDR video hasn't yet been unified under a single standard. The closest thing to a baseline is HDR10, which now enjoys good support from both TV manufacturers and movie studios. But while HDR10 content is generally intended to be mastered at 1,000 nits of peak brightness, not every TV can achieve those levels.

A display that can't hit those lofty targets will tone map an image that exceeds its capabilities. Bright elements will still be impactful, though, thanks to the contrast between the highlights and shadows. However, directors also rely on a display's ability to tone map correctly, which adds an element of risk. Will every display get it right?
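Tone mapping implementations vary wildly between TVs, but the basic idea is a "knee": brightness below a threshold passes through untouched, while everything above it gets compressed into whatever headroom the panel has left. Here's a deliberately simplified Python sketch, assuming a hypothetical 600-nit display and a 1,000-nit mastering target; no real TV uses exactly this curve:

```python
def tone_map(nits, display_peak=600.0, mastering_peak=1000.0, knee=0.75):
    """Toy highlight roll-off: pass through below the knee, compress above it.

    Not the curve any real TV uses -- just an illustration of the general idea.
    """
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits  # shadows and midtones are left alone
    # Squeeze everything between the knee and the mastering peak into the headroom left.
    excess = (nits - knee_nits) / (mastering_peak - knee_nits)
    return knee_nits + (display_peak - knee_nits) * min(excess, 1.0)

for mastered in (100, 450, 800, 1000):
    print(f"{mastered:4d} nits mastered -> {tone_map(mastered):.0f} nits on a 600-nit TV")
```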

The alternative is to grade your movie so it doesn't exceed the capabilities of most displays. An image that's graded more conservatively, with bright elements capped at 200 or 300 nits, will appear less punchy and vibrant. The upshot is you'll get a fairly consistent image across a huge range of displays.

The Wild West of HDR standards has also created a format war between competing technologies, like Dolby Vision and HDR10+. These newer formats use dynamic metadata to help TVs adjust on a per-scene or even frame-by-frame basis. Basic HDR10 carries no dynamic metadata, though, so your TV just has to decide for itself how to handle each scene.
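To make the difference concrete, here's an illustrative Python sketch of the two approaches. The field names are made up for clarity and don't reflect the actual HDR10+ or Dolby Vision bitstream formats; static metadata describes the whole movie once, while dynamic metadata describes each scene:

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    """One description for the entire movie (how basic HDR10 works)."""
    max_cll: int                  # brightest pixel in the whole film, in nits
    max_fall: int                 # highest frame-average light level, in nits
    mastering_display_peak: int   # peak of the grading monitor, in nits

@dataclass
class SceneMetadata:
    """A per-scene description, in the spirit of HDR10+ and Dolby Vision."""
    start_frame: int
    end_frame: int
    scene_peak_nits: int
    scene_average_nits: int

# With only static metadata, a dim dialogue scene and a blinding desert chase get the
# same single description, so the TV tone maps them the same way. Per-scene metadata
# lets it ease off for the dark scene and compress only the genuinely bright one.
movie_static = StaticMetadata(max_cll=4000, max_fall=400, mastering_display_peak=4000)
movie_dynamic = [
    SceneMetadata(start_frame=0, end_frame=1200, scene_peak_nits=120, scene_average_nits=20),
    SceneMetadata(start_frame=1201, end_frame=2400, scene_peak_nits=4000, scene_average_nits=600),
]
print(movie_static, movie_dynamic[0], sep="\n")
```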

Then, there's the issue of creative intent. Some directors might decide they don't like HDR, or rather, don't like using HDR to dazzle viewers with bright highlights. For these professionals, the benefit of HDR lies in color volume and accuracy, not the added luminance afforded by the latest TVs. It's worth noting, however, that plenty of directors do use HDR and its peak brightness to the fullest extent.

However, it's difficult to argue against someone's creative vision. Black-and-white films were still produced long after color became the standard. Some directors still shoot on 35mm film or in a 4:3 aspect ratio.

Are these decisions wrong? Are viewers wrong for wondering what a movie would look like if it had been shot with all the technical bells and whistles available at the time it was made?

Food for thought, indeed!

Related: HDR Formats Compared: HDR10, Dolby Vision, HLG, and Technicolor

Movies That Definitely Are HDR

If a movie is released on Blu-ray in HDR10, Dolby Vision, or a competing format, that's about as good as you can get until the studio decides it's time for a remaster. If you're upgrading from DVDs or regular Blu-rays, the jump to 4K and a wider color gamut is still a good incentive.

Picking your favorite films based on their technical specifications is like picking your favorite books based on the typeface. It can certainly impact the overall presentation, but the underlying story, dialogue, and other elements remain the same and are just as enjoyable.

If you buy Blu-rays for their HDR capabilities, you might want to save your money and simply avoid those that fall short of your expectations. Unfortunately, there aren't many people out there with access to the professional tools Teoh uses, so this kind of information only trickles out at this point.

For now, you'll just have to stick with watching the "good" HDR productions, like Mad Max: Fury Road (almost 10,000 nits), The Greatest Showman (1,500+ nits), and Mulan on Disney Plus (900+ nits).

Shopping for a new TV to watch your HDR movies on? Watch out for these six common mistakes.

Related: 6 Mistakes People Make When Buying a TV