With high dynamic range (HDR) now coming as standard on most new TVs, you might have heard the term "peak brightness" used to describe display performance or image quality. So what is peak brightness, how is it measured, and what does it tell you?

Measuring a Display's Peak Brightness

Peak brightness refers to a display's maximum rated brightness. Because some displays limit full-field brightness, there are a few ways of interpreting this value. Peak brightness is a measurement of luminance, the intensity of light a display emits per unit area, so it's expressed in nits, where one nit equals one candela per square meter (cd/m²).

Peak brightness is typically reported in two ways: "real scene" and "window" values. The real scene value is the maximum brightness a display reaches while playing actual video content. Reviewers usually run the same reference footage from one display to the next, providing a real-world comparison of overall display brightness.

Then there's peak brightness on a window, which covers only a percentage of the screen. For example, a 2% peak brightness window measures the maximum brightness a display can achieve over a short period on 2% of the screen's total surface. This is usually measured by displaying a white box on the screen.
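To put those percentages in perspective, the physical area of a test window is simple geometry. Here's a minimal sketch; the 55-inch diagonal and 16:9 aspect ratio are just example inputs, not values from any particular test:

```python
import math

def window_area_sq_in(diagonal_in: float, window_pct: float,
                      aspect=(16, 9)) -> float:
    """Physical area, in square inches, covered by a test window of the
    given percentage on a screen with the given diagonal and aspect ratio."""
    w_ratio, h_ratio = aspect
    # Scale the aspect ratio so its diagonal matches the screen's (Pythagoras).
    unit = diagonal_in / math.hypot(w_ratio, h_ratio)
    return (w_ratio * unit) * (h_ratio * unit) * (window_pct / 100)

# A 2% window on a 55-inch 16:9 TV covers only about 26 square inches.
print(round(window_area_sq_in(55, 2), 1))  # 25.9
```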

Window tests are especially useful for examining how well a display handles small, bright HDR highlights, such as a flashlight beam in a dark scene. You might also see "sustained window" tests, which run for a longer (sustained) duration. These are useful because many displays continue to dim the longer a bright highlight is held on the screen.
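The numbers below are purely hypothetical, and the exponential decay is an assumed shape rather than a measured response from any real display; the sketch only illustrates why a sustained reading can land well below the initial peak:

```python
import math

def dimmed_luminance(initial_nits: float, sustained_nits: float,
                     seconds: float, time_constant: float = 10.0) -> float:
    """Toy model of brightness limiting: luminance decays exponentially
    from the initial peak toward a lower sustained level."""
    decay = math.exp(-seconds / time_constant)
    return sustained_nits + (initial_nits - sustained_nits) * decay

# Hypothetical display: an 800-nit initial peak dimming toward 600 nits.
for t in (0, 5, 30):
    print(f"{t:>2}s: {dimmed_luminance(800, 600, t):.0f} nits")
```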

Peak brightness applies to both HDR and SDR content but is most useful when comparing the much brighter highlights often seen in HDR content. TV review website RTINGS is an excellent source of display information, with comprehensive peak brightness measurements for every display it has tested.

Display Technology Makes a Big Difference

Some displays can get much brighter than others due to the underlying technology, but this doesn't necessarily result in a higher-quality image. For example, LED-lit LCDs get a lot brighter than their OLED counterparts. This makes them especially suited to brightly lit environments like sunny living rooms.

Because OLED pixels emit their own light and the organic materials are sensitive to heat, manufacturers use an aggressive automatic brightness limiter (ABL) to prevent damage to the screen from heat build-up. This is most noticeable in bright, full-field scenes like a solid white background. On an OLED, smaller areas of bright highlights can still hit the levels required for an impressive HDR presentation.

While your viewing environment should factor into your TV buying decision, try not to put too much stock in peak brightness alone. Many bright LCD models suffer from a poor contrast ratio, disappointing black levels, and blooming around bright objects caused by local dimming algorithms.

OLED models can't get anywhere near as bright, making them less suitable for brightly lit environments, but they deliver much better black levels and an "infinite" contrast ratio since individual pixels can be switched off completely.

Make sure you've done your research before buying a brand new TV.

Related: How to Buy a TV: What You Need to Know

Directors Decide How Bright Their Movies Get

Lastly, don't forget about the director's intent. Many directors are wary of overusing HDR and often grade their movies with relatively few eye-catching highlights.

To put it another way: A movie that is graded to only hit 300 nits won't surpass that value even when viewed on a production-quality reference monitor that's capable of over 1,000 nits.
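For the curious, this is because HDR10 encodes absolute luminance: the SMPTE ST 2084 perceptual quantizer (PQ) transfer function maps each signal value to a fixed nit level, so content graded to 300 nits simply never uses the upper part of the signal range. A minimal sketch of that transfer function:

```python
def pq_eotf_nits(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: map a normalized HDR10 signal value in
    [0, 1] to absolute luminance in nits (cd/m²)."""
    m1 = 2610 / 16384        # ≈ 0.1593
    m2 = 2523 / 4096 * 128   # ≈ 78.8438
    c1 = 3424 / 4096         # = 0.8359375
    c2 = 2413 / 4096 * 32    # ≈ 18.8516
    c3 = 2392 / 4096 * 32    # = 18.6875

    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# The top of the signal range maps to 10,000 nits; a 300-nit grade only
# ever uses signal values up to roughly 0.62.
print(round(pq_eotf_nits(1.0)))   # 10000
print(round(pq_eotf_nits(0.62)))  # ≈ 295
```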

Though HDR has been embraced by many studios, so-called "fake HDR" releases exist.

Related: What Is 'Fake HDR,' and Should You Buy HDR Blu-rays?