If you’re buying a TV or upgrading to a next-generation console, you’ve probably seen terms like 4K and Ultra HD thrown around. Let’s cut through the jargon and get down to what these terms mean, and whether they’re even interchangeable.
It’s All About Resolution
Commonly, 4K and UHD refer to a resolution that’s a step up from 1080p (or “full HD”). A 4K UHD display has roughly four times the pixels of the previous generation, which creates a cleaner, more detailed image.
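To see where that "four times" figure comes from, the arithmetic is simple (illustrative only):

```python
# Pixel counts of Full HD vs. 4K UHD (illustrative arithmetic).
full_hd = 1920 * 1080   # 1080p "Full HD"
uhd_4k = 3840 * 2160    # 4K UHD

print(full_hd)           # 2073600 pixels
print(uhd_4k)            # 8294400 pixels
print(uhd_4k / full_hd)  # 4.0 -- exactly four times the pixels
```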
A 1080p high-definition TV isn’t able to take full advantage of a 4K UHD image. To see the benefits, you’ll need to make sure the media you’re consuming is available in 4K UHD.
Fortunately, 4K UHD is everywhere, from movies and TV shows, to the latest video games. You can also buy a UHD 4K monitor for your computer for lots of screen real estate and excellent image quality. Your smartphone probably shoots in 4K, even if the massive video files aren’t worth it on a smaller display.
4K and UHD Are Different
Despite being used interchangeably by manufacturers, retailers, and consumers alike, 4K and Ultra HD (UHD) aren’t the same. While 4K is a production standard defined by the Digital Cinema Initiatives (DCI), UHD is just a display resolution. Films are produced in DCI 4K, while most TVs have a resolution that matches UHD.
The 4K production standard specifies a resolution of 4096 x 2160 pixels, twice the width and height of the previous standard of 2048 x 1080, or 2K. As part of this production standard, 4K also specifies the type of compression that should be used (JPEG 2000), the maximum bitrate (250 Mbits per second), and the color depth (12-bit, 4:4:4).
Ultra HD has a display resolution of 3840 x 2160 pixels, and it’s used in the vast majority of modern TVs—even those advertised as being 4K-capable. Besides the number of on-screen pixels, there aren’t any additional specifications. The real differences between the two formats are the width of the images and the aspect ratios.
A movie produced in 4K can use an aspect ratio of up to 1.9:1, although most filmmakers prefer 1.85:1 or 2.39:1. Video games rendered for consumer-level displays use the UHD aspect ratio of 1.78:1 (16:9) to fill the screen.
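The aspect-ratio gap between the two standards can be checked with a few lines of arithmetic (a sketch; 2.39:1 is just one common cinematic ratio):

```python
# Aspect ratios of the two standards (illustrative arithmetic).
dci_4k_ratio = 4096 / 2160   # DCI 4K production standard
uhd_ratio = 3840 / 2160      # consumer UHD displays

print(round(dci_4k_ratio, 2))  # 1.9
print(round(uhd_ratio, 2))     # 1.78 (i.e., 16:9)

# Rows a 2.39:1 film actually fills on a 3840 x 2160 UHD panel:
film_rows = round(3840 / 2.39)
print(film_rows)                # 1607 of the 2160 rows
print((2160 - film_rows) // 2)  # ~276 rows of black bars top and bottom
```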
This is why you’ll continue to see the letterbox format (black bars at the top and bottom of the screen) when you watch movies on your brand-new UHD television. Because UHD doesn’t specify any additional standards, older televisions with 8-bit panels are advertised as UHD sets alongside new, 10-bit (and the future 12-bit) UHD displays.
To make matters worse, Ultra HD is also used for so-called 8K content. Labeled as “8K UHD” (as opposed to 4K UHD), this refers to content with a resolution of 7680 x 4320 pixels. This leap in quality is enormous in terms of overall pixel count. However, it will be a while before we see widespread content produced for this format.
Put simply, many manufacturers use the term “2160p” to describe regular UHD content, even though the label isn’t strictly accurate under the production standards.
Things to Consider When Upgrading to 4K
It’s a great time to upgrade to a UHD TV capable of 4K playback, as technology has matured considerably over the last five years. Not only are UHD displays now much cheaper, but they also come with more features. There are 10-bit panels capable of displaying high-dynamic-range content that also have powerful onboard image processors.
For the leap to be worth it, you’ll need to consider how large you want your display to be and how far away you sit from it. According to RTINGS, the upgrade isn’t worth it if you sit farther than six feet away from a 50-inch screen. You can’t see the pixels from that distance, anyway, so you won’t benefit from the increased resolution.
Another thing worth considering is whether you even watch enough 4K content to justify the upgrade. Ultra HD Blu-rays provide the best at-home viewing experience, and there’s a sizeable catalog of them that’s growing all the time. If you don’t often buy expensive discs, though, you might be stuck streaming content instead.
This is where the speed of your internet connection can make or break your investment in a shiny new TV. Netflix claims its customers need an internet speed of 25 Mbits per second or better to stream Ultra HD.
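To put that 25 Mbit/s figure in perspective, here’s a rough estimate of the data an hour of UHD streaming consumes (this assumes a constant bitrate, which real streams don’t maintain):

```python
# Approximate data usage for UHD streaming at 25 Mbit/s.
mbits_per_second = 25
seconds_per_hour = 3600

megabytes_per_hour = mbits_per_second * seconds_per_hour / 8  # 8 bits per byte
print(megabytes_per_hour / 1000)  # 11.25 GB per hour of viewing
```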
You can test your internet speed to find out how your display will fare. Remember, though, these speeds can dip considerably during busy periods (like when everyone’s streaming Netflix simultaneously).
You’ll also have to pay for a premium-level streaming subscription to access the highest quality content. Netflix gates its UHD content behind a $15.99 monthly package. This might be worth it if you’re a fan of Netflix Originals, most of which stream in UHD resolution.
Unfortunately, a lot of movies that have UHD releases are still presented in HD on Netflix.
Do you have existing HD streaming devices, like a Roku or Apple TV? These can pose an issue, as they’re only capable of delivering a 1080p image. You’ll need a Chromecast Ultra or Apple TV 4K if you want to take advantage of higher-resolution and HDR playback. This is less of an issue for your TV itself, as long as its built-in OS is stable and responsive, which many are.
Remember that 4K shines on larger displays. Unfortunately, when you upgrade to a larger native UHD TV, any 1080p content will look worse. This will be less of a problem in the future, though, and there are some solutions.
Upscaling to Ultra HD
Current TVs place a heavy emphasis on upscaling, which takes lower-resolution content and scales it to fit a much larger display. Remember, there are four times as many pixels on an Ultra HD display as there are on a regular Full HD television.
Upscaling means more than simply stretching an image, fortunately. Modern TVs and playback devices process the image and attempt to reconstruct it to look its best at a higher resolution. This is done via a process known as interpolation, during which missing pixels are generated on the fly. The intent is to produce a smooth transition between contrasting areas of the image.
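As a toy illustration of interpolation, here’s a one-dimensional sketch that doubles a row of pixel values by blending each pair of neighbors. Real upscalers work in two dimensions with far more sophisticated filters; the function name and sample values here are invented for illustration.

```python
def upscale_row_linear(row, factor=2):
    """Linearly interpolate a row of pixel values to `factor` times its length."""
    out = []
    for i in range(len(row) * factor):
        pos = i / factor                     # position in the original row
        left = int(pos)                      # nearest original pixel to the left
        right = min(left + 1, len(row) - 1)  # nearest original pixel to the right
        frac = pos - left
        # Blend the two nearest original pixels for a smooth transition.
        out.append(row[left] * (1 - frac) + row[right] * frac)
    return out

print(upscale_row_linear([0, 100, 200]))
# [0.0, 50.0, 100.0, 150.0, 200.0, 200.0] -- the 50 and 150 are "missing"
# pixels generated on the fly between the original values.
```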
As TVs become more powerful, better interpolation and upscaling techniques will be used. Currently, the NVIDIA Shield has some of the best upscaling on the market. It utilizes AI and machine learning to improve different parts of the image using different techniques.
If you upgraded to an Ultra HD TV and have noticed subpar performance with lower-resolution content, a Shield might be just what you need.
The PlayStation 4 Pro uses innovative upscaling to render images at a lower resolution (like 1440p), which are then upscaled to 4K via a technique called checkerboarding.
NVIDIA has developed Deep Learning Super Sampling (DLSS) to do a similar thing for PC games. Frames are rendered at a lower resolution and then upscaled in real time, which offers better performance than rendering the scene at its native resolution.
What About HDR?
High dynamic range (HDR) is also often advertised on movies and TVs, and it’s an entirely different technology. While 4K is a production standard and UHD is a resolution, HDR is a loosely defined term that refers to a wider color gamut and higher peak brightness.
While 1080p HDR can exist, HDR content wasn’t widely produced during the “Full HD” era, so you won’t find televisions on the market that offer HDR at 1080p. The vast majority of 4K sets on the market do support HDR in some form, however.
Don’t Worry About the Terminology
Whether it’s called 4K or UHD doesn’t matter. Your UHD TV is 4K-capable. The world has just adjusted to the nebulous terms thrown around by manufacturers and marketers.
Netflix might advertise a movie in Ultra HD, while iTunes labels the same movie 4K. Your TV doesn’t care and will play both just fine.
Before you head out to buy that new set, though, be sure to check out these common mistakes people make when shopping for a TV.