Thinking of buying an 8K TV? There's very little native content available, so upscaling is likely to be your best friend for several years to come. Is upscaling good enough, or should you wait? Let's take a look.

Most Content on an 8K TV Is Upscaled

Upscaling is the process of taking lower-resolution content and optimizing it for a display with higher pixel density. Old upscaling techniques used rudimentary pixel doubling to simply "blow up" the image. Since this content was never designed for larger, more pixel-dense displays in the first place, results were usually disappointing.
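To see why pixel doubling disappoints, here is a minimal sketch of the idea in Python (the function name and the toy 2x2 "image" are illustrative, not from any real TV firmware): each source pixel just becomes a 2x2 block, so the picture gets bigger without gaining any detail.

```python
# Minimal sketch of "pixel doubling" (nearest-neighbor) upscaling.
# Each pixel in the source simply becomes a 2x2 block in the output;
# no new detail is created, which is why results looked blocky.

def pixel_double(image):
    """Upscale a 2D grid of pixel values by a factor of 2."""
    upscaled = []
    for row in image:
        doubled_row = [px for px in row for _ in range(2)]  # duplicate horizontally
        upscaled.append(doubled_row)
        upscaled.append(list(doubled_row))  # repeat the row vertically
    return upscaled

# A 2x2 "image" becomes 4x4: every value is a copy, none are new.
source = [[10, 20],
          [30, 40]]
print(pixel_double(source))
# -> [[10, 10, 20, 20], [10, 10, 20, 20], [30, 30, 40, 40], [30, 30, 40, 40]]
```

Modern AI-driven upscalers, by contrast, try to infer plausible detail rather than just copying pixels, which is what the rest of this section is about.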

Related: What Is "Upscaling" on a TV, and How Does It Work?

Over the years, upscaling techniques have improved significantly, with the latest technology leaning heavily on machine learning. This technology uses artificial intelligence (AI) to identify objects and apply context-specific enhancements to the image.

This has been achieved using powerful system-on-a-chip (SoC) hardware that is now standard on most high-end (particularly 8K) displays. The higher the resolution of the content you feed your TV, the better the results; 8K TV owners should aim for 4K as a baseline.

This technology not only improves edge-sharpness and clarity in a way that pixel doubling does not, but it also improves lighting and texture reproduction. Different optimizations can be applied to complex textures like grass and skin in a way that older techniques simply cannot match.

Related: What Is a System on a Chip (SoC)?

This means that early 8K TVs are considerably better at upscaling than early 4K TVs were. It's hard to quantify the difference, but if you're interested in an 8K TV then it certainly can't hurt to head to a showroom and ask for a demonstration.

How Much Does Native 8K Content Matter?

How much does higher-resolution content matter? Smartphones are used more than any other device for watching YouTube videos, which says a lot about what we value most when it comes to modern video content. Often, convenience of access seems to take priority over pure image quality since we've hit a point where quality has passed the "good enough" mark for most tastes.

If you're old enough to remember the days of long-play VHS recordings and analog TV broadcasts, you've seen a change in video quality far greater than the move from 4K to 8K. Many argue that the arrival of HDR video brought a greater improvement in image quality than the leap from HD to 4K did.

It's going to be a long time before we see widespread adoption of native 8K content, so upscaling will be the dominant way content is consumed on these displays for years to come. No current-generation game console can yet output in 8K, and even PC gamers are better off settling for lower resolutions and higher frame rates, given the incremental difference in quality between 4K and 8K on most displays.

Upscaling techniques will improve as AI-driven system-on-a-chip hardware becomes more powerful, and as the companies who manufacture them better train and refine the algorithms that drive them. That may be worth waiting for if you're not itching to get your hands on an 8K set right now.

Future Display Technologies Are More Exciting Than 8K

8K is neat, but the underlying display technologies that will drive future displays are much more exciting. Current 8K TVs use LED-LCD and OLED panels, which each have their benefits and drawbacks.

LCD panels rely on local dimming to improve lackluster black reproduction, but can hit impressive levels of peak brightness. OLED is a self-emissive technology, which means "perfect" blacks, but at the cost of the dazzlingly bright highlights that LCD can achieve.

These technologies will likely be replaced by MicroLED, which is still in its infancy but promises to solve many of the issues with current sets: it is self-emissive like OLED, yet much less prone to burn-in.