If you tend to focus on aspect ratios like 16:9 and 4:3 when thinking about screen resolutions, then you may find yourself wondering what is going on with the popular 1366x768 laptop screen resolution. Today's SuperUser Q&A post helps clear things up for a confused reader.

Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

Photo courtesy of Cheon Fong Liew (Flickr).

The Question

SuperUser reader meed96 wants to know why the 1366x768 screen resolution exists:

I know that there is a previous question about this, but it does not have any real answers despite having been viewed 12,400 times (in addition to the fact that it has been closed). With that in mind:

Why in the world is the screen resolution 1366x768 a real thing? It has an aspect ratio of 683:384, which is the weirdest thing I have ever heard of while living in a 16:9 world.

All the screens and resolutions I am familiar with have the 16:9 aspect ratio. My screen, 1920x1080, is 16:9. The 720p size, 1280x720, is also 16:9. The 4K size, 3840x2160, is also 16:9. Yet, 1366x768 is 683:384, a seemingly wild break from the standard.

I know there are plenty of other resolutions all over the place, but 1366x768 seems to dominate most of the mid-priced laptop world and also seems unique to the laptop world. Why not use 1280x720 or something else as a standard for laptops?

Why does the 1366x768 screen resolution exist?
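
As a quick sanity check on the ratios meed96 lists, here is a short Python sketch (not from the original thread) that reduces each resolution to its simplest ratio; it confirms that 1366x768 is the only one that does not come out to exactly 16:9:

```python
from math import gcd

# Reduce each resolution to its simplest aspect ratio.
resolutions = [(1920, 1080), (1280, 720), (3840, 2160), (1366, 768)]

for width, height in resolutions:
    divisor = gcd(width, height)
    print(f"{width}x{height} -> {width // divisor}:{height // divisor}")

# Prints 16:9 for the first three, but 683:384 for 1366x768.
```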

The Answer

SuperUser contributors mtone and piernov have the answer for us. First up, mtone:

According to Wikipedia (emphasis mine):

  • The basis for this otherwise odd seeming resolution is similar to that of other "wide" standards - the line scan (refresh) rate of the well-established "XGA" standard (1024x768 pixels, 4:3 aspect) was extended to give square pixels on the increasingly popular 16:9 widescreen display ratio without having to effect major signalling changes other than a faster pixel clock, or manufacturing changes other than extending panel width by one third. As 768 does not divide exactly into the "9" size, the aspect ratio is not quite 16:9 – this would require a horizontal width of 1365.33 pixels. However, at only 0.05%, the resulting error is insignificant.

Citations are not provided, but it is a reasonable explanation. It is the closest to 16:9 that they could get by keeping the 768 vertical resolution from 1024x768, which had been widely used for the manufacturing of early 4:3 LCD displays. Maybe that helped reduce costs.
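
The arithmetic behind that quote is easy to check. Here is a minimal Python sketch (mine, not part of the answer) that computes the ideal 16:9 width for 768 rows and the rounding error introduced by using 1366 instead:

```python
# Ideal width for a 16:9 panel with 768 rows, and the error from using 1366.
ideal_width = 768 * 16 / 9          # 1365.33... pixels
actual_width = 1366
error = (actual_width - ideal_width) / ideal_width

print(f"Ideal width:  {ideal_width:.2f}")
print(f"Actual width: {actual_width}")
print(f"Error:        {error:.4%}")  # roughly 0.05%, as the quote says
```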

Followed by the answer from piernov:

At the time the first computer wide-screens became popular, the usual resolution on 4:3 panels was 1024x768 (the XGA display standard). For simplicity and backward compatibility, the XGA resolution was kept as a basis when making the WXGA resolution (so that XGA graphics could be easily displayed on WXGA screens).

Just extending the width and keeping the same height was also simpler technically, because you would only have to tweak the horizontal refresh rate timing to achieve it. However, the standard aspect ratio for wide displays was 16:9, which cannot be achieved exactly with a height of 768 pixels, so the nearest value, 1366x768, was chosen.

WXGA can also refer to a 1360x768 resolution (along with some other, less common variants), which was created to reduce costs in integrated circuits. At 8 bits per pixel, a 1366x768 frame takes just over 1 MiB to store (1024.5 KiB), so it would not fit into an 8-Mbit memory chip; you would need a 16-Mbit chip just to hold a few extra pixels. That is why something slightly lower than 1366 was chosen. Why 1360? Because it is divisible by 8 (and even 16), which is far simpler to handle when processing graphics and can lead to optimized algorithms.
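
To illustrate the memory argument, here is a small Python sketch (my own, assuming the 8 bits per pixel that piernov uses) comparing both frame buffer sizes against the 1 MiB capacity of an 8-Mbit chip:

```python
# Frame buffer size at 8 bits (1 byte) per pixel, versus an 8-Mbit (1 MiB) chip.
CHIP_BYTES = 1024 * 1024  # 8 Mbit expressed in bytes

for width, height in [(1366, 768), (1360, 768)]:
    size_bytes = width * height  # 1 byte per pixel
    fits = "fits" if size_bytes <= CHIP_BYTES else "does not fit"
    print(f"{width}x{height}: {size_bytes / 1024:.1f} KiB -> {fits} in 8 Mbit")

# 1366x768 needs 1024.5 KiB (too big), while 1360x768 needs 1020.0 KiB (fits).
```

With the width trimmed to 1360, the frame fits with room to spare, and the width also divides evenly into 8- and 16-pixel blocks.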

Make sure to read through the rest of the interesting discussion via the thread link below!


Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.