

If 4k is 4k because the horizontal resolution is around 4000, you’d think 1080p, with its 1920-pixel-long lines, would be 2k. It’s fucked that it isn’t.


Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish.
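A quick back-of-the-envelope check of that claim, using the common rule of thumb that 20/20 vision resolves roughly one arcminute per pixel (~60 pixels per degree). A minimal sketch, assuming a flat 16:9 panel viewed head-on; the function and constant names are mine:

```python
import math

ACUITY_PPD = 60  # ~20/20 vision: about one pixel per arcminute (rule of thumb)

def pixels_per_degree(diag_in, h_pixels, distance_m, aspect=(16, 9)):
    """Angular pixel density of a flat screen viewed head-on."""
    w, h = aspect
    width_m = diag_in * 0.0254 * w / math.hypot(w, h)  # diagonal -> width
    angle_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_pixels / angle_deg

ppd = pixels_per_degree(44, 2560, 2.5)  # 44" QHD panel at 2.5 m
print(f"{ppd:.0f} ppd vs ~{ACUITY_PPD} ppd acuity limit")
```

That works out to roughly 116 ppd, about double the acuity limit, which is consistent with the claim.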
That seems in line with common knowledge? Say you want to keep your viewing angle at ~40° for a home cinema: at 2.5m of distance, that means your TV needs to have a horizontal width of ~180cm, which corresponds to a ~75" diagonal, give or take a few inches depending on the aspect ratio.
For a more conservative 30° viewing angle, at the same distance, you’d need a 55" TV. So, 4K is perceivable at that distance regardless, and 8K is a waste of everyone’s time and money.
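For reference, the trigonometry behind those figures, as a minimal sketch (flat screen, head-on viewing; the function name is mine):

```python
import math

def size_for_viewing_angle(distance_m, angle_deg, aspect=(16, 9)):
    """Width (m) and diagonal (inches) of a flat screen that fills a
    given horizontal viewing angle at a given distance."""
    width = 2 * distance_m * math.tan(math.radians(angle_deg) / 2)
    w, h = aspect
    diagonal_in = width * math.hypot(w, h) / w / 0.0254
    return width, diagonal_in

for angle in (40, 30):
    width, diag = size_for_viewing_angle(2.5, angle)
    print(f'{angle}° at 2.5 m: width {width * 100:.0f} cm, diagonal {diag:.0f}"')
```

For a strict 16:9 panel this lands a few inches above the quoted figures (~82" and ~61"); wider aspect ratios pull the diagonal back down toward them.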
They shouldn’t use numbers at all tbh. QQVGA, QVGA, VGA, q(uarter)HD, HD, Full HD, QHD, UHD and so on work for all aspect ratios, and you can get more specific by adding prefixes: FW (full wide) VGA, for example, would be 480p at 16:9. It gets a little confusing cause sometimes the acronyms are inconsistent (and PAL throws a wrench into everything), but the system works (sketched below).
PS: I also don’t like that 540p is called qHD, cause it’s a quarter of Full HD, not of HD.
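In case it helps, here’s that ladder as an illustrative sketch, with the quarter/quadruple relationships (including the qHD gripe from the PS) spelled out as asserts; the resolutions are the common ones:

```python
# The naming ladder described above, as (width, height) pairs.
LADDER = {
    "QQVGA":   (160, 120),    # quarter of QVGA
    "QVGA":    (320, 240),    # quarter of VGA
    "VGA":     (640, 480),    # 4:3
    "FWVGA":   (854, 480),    # "full wide" VGA: 480p at ~16:9
    "qHD":     (960, 540),    # lowercase q: quarter of Full HD
    "HD":      (1280, 720),
    "Full HD": (1920, 1080),
    "QHD":     (2560, 1440),  # uppercase Q: quadruple HD
    "UHD":     (3840, 2160),  # quadruple Full HD, marketed as "4K"
}

def pixels(name):
    w, h = LADDER[name]
    return w * h

assert 4 * pixels("qHD") == pixels("Full HD")  # a quarter of *Full* HD, not HD
assert pixels("QHD") == 4 * pixels("HD")
assert pixels("UHD") == 4 * pixels("Full HD")
```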