If you’ve played games on a 4K TV and thought, “this doesn’t look that much better,” you’re not alone. On paper, 4K offers a massive leap in detail compared to 1080p, but does that paper specification translate to real-world differences?
The Performance Gap Between 1080p and 4K Is Massive
To put this all in context, it’s important to understand just what it takes to render a game at 4K. Many people may not realize it, but a 4K UHD screen has four times the pixels of a 1080p display: 3840×2160 is roughly 8.3 million pixels, versus about 2.1 million at 1920×1080. If you’re rendering graphics in real time, that’s a massive increase in the workload your GPU has to complete for each frame.
Assuming a linear increase in the computational power needed, each frame should take four times as long to render. In practice, the render time is a bit better than that, thanks to various tricks that help some aspects of graphics rendering scale more efficiently. But no matter which way you slice it, 4K exerts a massive toll on frame rate.
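The arithmetic above is easy to sketch in a few lines of Python. The 8 ms frame time below is an arbitrary illustrative figure, not a benchmark, and the linear scaling is the naive worst case the paragraph describes:

```python
# Pixel counts for the two common 16:9 resolutions.
px_1080p = 1920 * 1080  # 2,073,600 pixels
px_4k = 3840 * 2160     # 8,294,400 pixels

print(px_4k / px_1080p)  # 4.0 -- exactly four times the pixels per frame

# Naive linear scaling: if a hypothetical GPU takes 8 ms per 1080p frame
# (125 fps), the same workload at 4K would take ~32 ms (~31 fps).
frame_time_1080p_ms = 8.0
frame_time_4k_ms = frame_time_1080p_ms * (px_4k / px_1080p)
print(frame_time_4k_ms)         # 32.0
print(1000 / frame_time_4k_ms)  # 31.25
```

In practice the hit is somewhat smaller than 4×, since per-frame costs like geometry processing don't scale with pixel count, but the fill-rate-bound portion of the frame really does quadruple.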
If you want to keep that native 4K resolution but increase your frame rate, your only option is to dial down the other aspects of the image that influence render time. That means lowering things like geometric detail, draw distance, lighting quality, and the other eye candy that makes games nice to look at. So you’re getting a wonderfully crisp 4K render of a blander, more basic image.
Viewing Distance Affects Your Experience
Considering that 4K gaming comes with compromises in frame rate and visual settings, it would be a shame if you couldn’t appreciate the extra crispness you traded those features for. Unfortunately, your viewing distance from the display may prevent exactly that!
The relationship between resolution and viewing distance is not a fixed one and can depend on a variety of factors, such as the size of the display and the individual’s visual acuity. In general, however, as the resolution of a display increases, the optimal viewing distance for that display decreases.
This means that when comparing a 1080p and 4K television, the 4K TV will typically look clearer and more detailed when viewed from a closer distance. The exact distance at which the benefits of the higher resolution become noticeable will vary depending on the specific details of your setup.
This is where the concept of a “Retina” display comes into the picture. Apple uses the term “Retina” for displays whose pixel density is high enough that individual pixels can’t be resolved by the human eye at a normal viewing distance, so the image appears seamless and continuous.
This is one of the reasons there have been so few smartphones with 4K displays. At normal viewing distances and typical phone screen sizes, the human eye simply can’t discern any extra detail between a 1080p and a 4K panel. So why waste money and battery power on pixels that bring nothing of value to users?
Scaling this up to large-format displays such as televisions and even computer monitors, you may be surprised by how quickly your eyes lose the ability to see the extra detail, especially if you don’t have 20/20 vision.
Even a 30-inch 4K desktop monitor does not offer the full benefit of its resolution if you sit more than two feet away.
If we look at the highly popular 55-inch TV size, 4K starts to look similar to lower resolutions if you’re farther than five feet away.
Now, many people claim they can totally see the difference between 4K and 1080p at longer distances, and there are some caveats here. These figures are based on viewing distances that take into account how much of your visual field is filled by the screen and how many pixels per degree of vision someone with 20/20 eyesight can see. Most importantly, this is all based on watching content like movies and TV shows, not video games.
We’ll get to how that makes a difference next, but it’s indisputable that at a certain viewing distance, 4K and 1080p look the same, and you’re probably overestimating how far that distance is.
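Those distance figures can be sanity-checked with a little trigonometry. Assuming 20/20 vision resolves roughly 60 pixels per degree (a common rule of thumb, not a hard physiological limit), a sketch like this estimates the distance beyond which individual pixels blend together; the function and its flat-panel, square-pixel assumptions are ours, not taken from any display standard:

```python
import math

def retina_distance_inches(diagonal_in, horiz_px, vert_px, ppd=60):
    """Estimate the distance at which one pixel subtends 1/ppd degrees.

    Beyond this distance, an eye resolving `ppd` pixels per degree can
    no longer pick out individual pixels. Assumes a flat panel with
    square pixels; real perception is messier than this rule of thumb.
    """
    aspect = horiz_px / vert_px
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    pixel_in = height_in / vert_px
    return pixel_in / math.tan(math.radians(1 / ppd))

# A 30-inch 4K monitor: roughly 23 inches, i.e. about two feet.
print(retina_distance_inches(30, 3840, 2160))
# A 55-inch 4K TV: roughly 43 inches, around three and a half feet.
print(retina_distance_inches(55, 3840, 2160))
```

The two-foot figure for a 30-inch monitor matches this estimate; the five-foot figure for a 55-inch TV reflects a looser threshold, since "4K looks similar to lower resolutions" kicks in before every last pixel disappears.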
4K Has Some Benefits at Any Distance, Though
Now, a game rendering at 4K does offer benefits over lower resolutions, such as 1080p and 1440p, that are visible at any distance, even if you can’t technically see extra pixel detail anymore. A 4K image will appear more stable with less “shimmering” as you move the camera around in-game.
Many rendering effects are computed at a fixed percentage of the output resolution, so a high-res target such as 4K guarantees higher-quality buffers for those effects too. Some types of content, such as grass or foliage, also benefit from having more “sample points” for fine detail.
We know this benefit is visible even beyond “retina” viewing distances because you can use a feature such as NVIDIA Dynamic Super Resolution to render the game at 4K and then “downsample” it to a 1080p screen. The downsampled image looks better despite having the same final pixel count as a native 1080p image.
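Downsampling of this kind is essentially supersampling: every screen pixel becomes the average of several rendered samples, which smooths out shimmer and aliasing. Here is a toy sketch of the idea as a simple box filter over 2×2 blocks (the exact 4K-to-1080p ratio); real drivers use smarter filters, such as DSR's adjustable Gaussian:

```python
def downsample_2x(image):
    """Average each 2x2 block of samples into one output pixel.

    `image` is a 2-D list of brightness values rendered at twice the
    target resolution in each axis (e.g. 4K samples for a 1080p
    output). The principle: more samples per output pixel means less
    aliasing, even though the final pixel count is unchanged.
    """
    h, w = len(image), len(image[0])
    return [
        [(image[y][x] + image[y][x + 1]
          + image[y + 1][x] + image[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A harsh black/white 2x2 edge averages into one softened pixel:
print(downsample_2x([[0, 255],
                     [0, 255]]))  # [[127.5]]
```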
This sounds like a strong case for 4K, but many other techniques, such as modern temporal anti-aliasing and various types of AI upscaling, offer similar image-quality benefits without the cost of rendering at 4K.
Resolution, Detail Settings, and Frame Rate: Which Is The Biggest Upgrade?
Why are we targeting 4K in the first place? Largely because that’s the native resolution of modern displays. TV manufacturers aren’t in sync with the people who make GPUs, so it’s not their problem if screen resolution outstrips the pace of GPU development.
Unlike PC monitors, TVs offer no intermediate resolutions between 1080p and 4K. It’s no accident that so many PC gamers choose 1440p monitors for computers with GPUs far more powerful than those in current-generation consoles. 4K monitors are generally paired with top-end cards such as the RTX 3090 Ti and RTX 4090, since that’s where players with deep pockets can turn up the eye candy, get playable frame rates, and enjoy 4K resolution on a monitor less than two feet from their eyeballs.
Let’s assume you’re sitting at a normal viewing distance. While a game is in motion (i.e., not in a static screenshot), you’ll likely find that higher detail settings and smoother motion have a proportionally larger effect on your perception of image quality than simply increasing the raw pixel count.
In the console gaming world, developers have already moved away from trying to render their games at native 4K, instead using various upscaling methods to take a 1080p or 1440p image and make it look good, or even “4K-like,” on modern TVs. The result doesn’t actually have as much detail as a native 4K image, but almost no one can tell anyway.