A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)
Personal anecdote: moving from 1080p to 2K on my computer monitor is very noticeable for games.
ITT: people defending their 4K/8K display purchases as if this study was a personal attack on their financial decision making.
Right? “Yeah, there is a scientific study about it, but what if I didn’t read it and go by feelings? Then I will be right and don’t have to reexamine shit about my life, isn’t that convenient”
My 50" 4K TV was $250. That TV is now $200; nobody is flexing the resolution of their 4K TV, that's just a regular cheap-ass TV now. When I got home and started using my new TV right next to my old 1080p TV just to compare, the difference in resolution was instantly apparent. It's not people trying to defend their purchase, it's people questioning the methodology of the study, because the difference between 1080p and 4K is stark unless your TV is small or you're far away from it. If you play video games, it's especially obvious.
It does make a difference for reading text like subtitles or navigating game menus.
If my quick calculations are correct, the 70-inch screen at 1080p has a pixel size of about 0.7 mm give or take, where 4K would be about 0.35 (double the resolution per axis, so half the pitch).
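For anyone who wants to redo the arithmetic, here's a minimal Python sketch (assuming a flat 16:9 panel; note 4K doubles the resolution per axis, so the pitch halves rather than quarters):

```python
import math

def pixel_pitch_mm(diagonal_in, horizontal_res, aspect=(16, 9)):
    # Panel width from the diagonal: w = d * 16 / sqrt(16^2 + 9^2) for 16:9
    aw, ah = aspect
    width_mm = diagonal_in * 25.4 * aw / math.hypot(aw, ah)
    return width_mm / horizontal_res

print(round(pixel_pitch_mm(70, 1920), 2))  # 1080p on a 70": ~0.81 mm
print(round(pixel_pitch_mm(70, 3840), 2))  # 4K on a 70":    ~0.40 mm
```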
0.1 mm is about the smallest thing a human could potentially see under very strict conditions. A pixel smaller than a millimeter will be invisible from a meter away. I really, really doubt it's humanly possible to see the difference from the distances a person would be watching TV. The thing is, the newer 4K TVs are just built better: nicer colour contrast, more uniform lighting, clearer glass, and that might be the effect you're seeing.
Uh… Hol up. So if we can maybe see down to 0.2 mm and the 1080p screen has 0.7 mm pixels… That’s pretty much what I’m saying. 1080p is noticeably grainy.
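For reference, the textbook figure for 20/20 resolution acuity is about one arcminute; a quick sketch of what that resolves to at common viewing distances (my ballpark numbers, not the study's):

```python
import math

def resolvable_mm(distance_m, acuity_arcmin=1.0):
    # Smallest feature a 20/20 eye can resolve: s = d * tan(1 arcminute)
    return distance_m * 1000 * math.tan(math.radians(acuity_arcmin / 60))

for d in (1.0, 2.5, 4.0):
    print(f"{d:.1f} m -> {resolvable_mm(d):.2f} mm")
# 1.0 m -> 0.29 mm, 2.5 m -> 0.73 mm, 4.0 m -> 1.16 mm
```

So 0.7-0.8 mm pixels sit right at the edge of resolvable from a typical couch distance, which is why both of you can be sort of right.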
The text in 4K looks crisper. I concur I can't count individual pixels, but reading game menus in 1080p feels rougher and makes me squint. Reading in 4K feels more like reading on print paper or a good e-reader.
This and yes, the build quality of newer screens also contributes.
This is literally the only truly important part after a certain threshold. I have a 34", 1440p monitor and the text is noticeably better than on any 1080p screen. It's entirely legible, and 4K would not provide any new benefit except maybe a lighter wallet. It's also 100 Hz, which is again beyond the important threshold.
The only time I can see 4K being essentially necessary is for projectors, because those screens end up being massive. My friend has a huge 7-foot-something screen in the basement, so we noticed a difference, but that's such an outlier it should really be a footnote, not a reason to choose 4K for anything under 5 feet (arbitrary-ish number).
I have friends and family with good eyesight and they can tell a difference. Sadly, even with recent prescription lenses I still can't see it. Eh, at least I can save on TVs, since 1080p is cheaper.
I’ve been saying this for years.
Bullshit, actual factual 8K and 4K look miles better than 1080p. It's the screen size that makes the difference. On a 15-inch screen you might not see much difference, but on a 75-inch screen the difference between 1080p and 4K is immediately noticeable. A much larger screen would show the same thing with 8K.
I like how you’re calling bullshit on a study because you feel like you know better.
Read the report, and go check the study. They note that the biggest gains in perceived display quality come from contrast (the largest factor), brightness, and color accuracy, all of which have drastically improved over the last 15 years. Look at a really good high-end 1080p monitor next to a low-end 4K monitor and you will actively choose the 1080p monitor. It's more pleasing to the eye, and you don't notice the difference in pixel size at that scale.
Sure, distance plays a role, but they noted that too, by performing the test at the same distance with the same screen size. They're controlling for a variable you aren't even controlling for in your own comment.
This has been my experience going from 1080 to 4K. It’s not the resolution, it’s the brighter colors that make the most difference.
It’s the screen size that makes a difference
Not by itself; the distance is extremely relevant. And at the distance a normal person sits from a large screen, the screen needs to be very large for 4K to matter, let alone 8K.
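To put rough numbers on the size/distance trade-off, a pixels-per-degree sketch (the ~60 ppd threshold for 20/20 vision is the usual rule of thumb, and the 3 m couch distance is my assumption):

```python
import math

def pixels_per_degree(diagonal_in, horizontal_res, distance_m, aspect=(16, 9)):
    # Horizontal field of view the screen subtends, then pixels per degree of it
    aw, ah = aspect
    width_m = diagonal_in * 0.0254 * aw / math.hypot(aw, ah)
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return horizontal_res / fov_deg

# 75" TV viewed from 3 m; past ~60 ppd a 20/20 eye stops resolving pixels
print(round(pixels_per_degree(75, 1920, 3.0)))  # 1080p: ~62 ppd, near the limit already
print(round(pixels_per_degree(75, 3840, 3.0)))  # 4K:   ~124 ppd, mostly beyond it
```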
You should publish a study
And publish it in Nature, a leading scientific journal, with bold claims.
With a 44-inch screen at 2.5 m.
Sounds like a waste of time to do a study on something already well known.
This article is literally about the study. Your “well-known” fact doesn't hold up to scrutiny.
So I have a pet theory on studies like this. There are many things out there that many of us take for granted as givens in our daily lives. But there are likely just as many people to whom this knowledge is either unknown or not actually apparent. The reasons can be any of a myriad of things: a lack of experience in the given area, skepticism that their anecdotal evidence is truly correct despite appearances, and so on.
What these “obvious thing is obvious” studies accomplish is setting a factual precedent for the people in the back. The people who are uninformed, not experienced enough, skeptical, contrarian, etc.
The studies seem wasteful up front, but sometimes a thing needs to be said aloud to put factual evidence behind the overwhelming anecdotal kind.
For a 75 inch screen I’d have to watch it from my front yard through a window.
Have a 75" display; the size is nice, but it's still a ways from a theater experience. Would really need 95"-plus.
Depends how far away you are. Human eyes have limited resolution.
8K, no. But 4K with a 4K Blu-ray player on actual non-upscaled 4K movies is fucking amazing.
I think you're right, but how many movies are available in UHD? Not too many, I'd think. On my thrifting runs I've picked up 200 Blu-rays vs 3 UHDs. If we can map that ratio to the retail market, that's ~1.5% UHD content.
I don't know if this will age like my previous belief that the PS1 had photo-realistic graphics, but I feel like 4K is the peak for TVs. I recently bought a 65" 4K TV and not only is it the clearest image I've ever seen, but it takes up a good chunk of my living room. Any larger would just look ridiculous.
Unless the average person starts using abandoned cathedrals as their living rooms, I don't see how larger TVs with even higher definition would even be practical. Especially if you consider we already have 8K for those who do use cathedral entertainment systems.
(Most) TVs still have a long way to go with color space and brightness. AKA HDR. Not to speak of more sane color/calibration standards to make the picture more consistent, and higher ‘standard’ framerates than 24FPS.
But yeah, 8K… I dunno about that. Seems like a massive waste. And I am a pixel peeper.
For media, I mostly agree: 8K doesn't seem to add much. For computer screens I can see the purpose, though, as it adds more screen real estate, which is hard to get enough of for some of us. I'd love to have multiple 8K screens so I can organize and spread out my work.
Are you sure about that? You likely use DPI scaling at 4K, and you’re likely limited by physical screen size unless you already use a 50” TV (which is equivalent to 4x standard 25” 1080p monitors).
8K would only help at like 65”+, which is kinda crazy for a monitor on a desk… Awesome if you can swing it, but most can’t.
I tangentially agree though. PCs can use “extra” resolution for various things like upscaling, better text rendering and such rather easily.
The frame rate really doesn't need to be higher. I fully understand filmmakers who balk at the idea of 48 or 60 fps movies. It really does change the feel of them, and imo not necessarily in a positive way.
I respectfully disagree. Folks' eyes are 'used' to 24P, but native 48 or 60 looks infinitely better, especially when stuff is filmed/produced with that in mind.
But at a bare minimum, baseline TVs should at least eliminate judder with 24P content by default, and offer better motion clarity by moving on from LCDs, using black frame insertion or whatever.
Life changing. I love watching movies, and the experience you get from a 4K disc is insane.
4k is definitely a big improvement over 1080p. The average person probably doesn’t have good eyesight, but that doesn’t mean that it’s a waste for everyone else.
4k with shit streaming bitrate is barely better than high bitrate 1080p
But full bitrate 4k from a Blu-ray IS better.
Full Blu-Ray quality 1080p sources will look significantly better than Netflix 4K.
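The bitrate arithmetic backs that up. A rough bits-per-pixel comparison (the bitrates below are common ballpark figures I'm assuming, not measurements, and codec efficiency differs between AVC and HEVC):

```python
# Ballpark bitrates; actual values vary by title, codec, and service
sources = {
    "Netflix 4K    (~15 Mbps)": (15e6, 3840 * 2160),
    "Blu-ray 1080p (~30 Mbps)": (30e6, 1920 * 1080),
    "UHD Blu-ray   (~80 Mbps)": (80e6, 3840 * 2160),
}
FPS = 24
for name, (bitrate, pixels) in sources.items():
    print(f"{name}: {bitrate / (pixels * FPS):.2f} bits per pixel per frame")
# Blu-ray 1080p ends up with several times the bits per pixel of streamed 4K
```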
Which is why “4K” doesn't actually matter unless your panel is gigantic or you're sitting very close to it. Resolution is a very small part of perceived quality.
The question for me isn’t whether or not there’s a difference that I might be able to see if I were paying attention to the picture quality, it’s whether the video quality is sufficiently bad to distract me from the content. And only hypercompressed macroblocked-to-hell-and-back ancient MPEG1 files or multiply-recopied VHS tapes from the Dark Ages are ever that bad for me. In general, I’m perfectly happy with 480p. Of course, I might just have a higher-than-average immunity to bad video. (Similarly, I can spot tearing if I’m looking for it, but I do have to be looking for it.)
It’s all about the baseline.
Cinematic, Blu-ray-bitrate 1080p vs 4K is not too dramatic.
Compressed streams though? Or worse production quality? 4K raises the baseline dramatically. It’s much harder to stream bad-looking 4K than it is 1080p, especially since ‘4K’ usually implies certain codecs/standards.
Heh, I’m getting back to physical media, and this big 4K TV is literally the first time ever where I’ve actually constantly noticed that DVDs might get a bit pixely.
(And even so, I usually blame not so great digitisation. Some transfers of old obscure titles were really sloppy, you really didn’t need a great TV to see the problems. Original was a black and white movie, the DVD was a bunch of grey mush.)
You know what would sell like hot cakes? A dumb TV with Dolby Vision support. I went down the rabbit hole of finding a large HDR monitor and adapters to trick end devices into outputting player-led Dolby Vision to an HDR monitor, because I don't need my TV to have a complete OS with streaming services and adverts integrated.
In the end I couldn’t find anything that didn’t have drawbacks. It’s something that could easily exist but there are no manufacturers bold enough to implement it.
Streaming tech moves so fast, I want to add it to my TV through hardware like a fire stick, not to become dependent on the TV manufacturer putting out updates until it’s ‘Out-of-support’.
I went with a TV and disabled as much of the junk as I could with a service remote and just never connected it to the internet, but jumping through these hoops seems so silly.
So don't give your TV internet access and plug in a PC.
All tvs are dumb tvs if you don’t connect them to the internet.
I get where you're coming from, and I have done that, but I get constant popup reminders that “You are not connected to the internet, would you like to set this up now to access exciting apps and features?” And on top of that, I've had to switch to a universal remote, because the one that comes with the TV activates an on-screen cursor when it senses movement, which is a “feature” you can't switch off. I just want a remote with on/off, input, and volume control 😭
What brand of TV is that? It sounds horrible lol. I haven't touched my TV remote in years, since mine is controlled over CEC with my Nvidia Shield.
LG ☹️
I think high end smasnugs do that. But yeah same, I have a Chromecast and I just sling stuff to the TV. Its tuner hasn’t been used in a decade.
Speaking of decades… I should probably upgrade that thing. But it's big enough, dark enough (LCD, but at least not a crappy one), and high enough resolution for its size and distance from my couch (1080p, 50 inches, 10 or 15 feet) that I just can't justify replacing it. I wish it would die already lol.
Maybe if I move and get a big enough bedroom, I’ll put it in there, and upgrade to something with HDR. I really wanna get in on some good HDR. Seems like it’s getting really good and really affordable if you buy the right thing.
I have an LG 42" 4K OLED, but I miss my 42" 720p Panasonic plasma for its simplicity. If it hadn't decided to die, I'd have kept it. The colours on OLED are impressive with Dolby Vision content from my Google Streamer, I'll admit, but yeah, I'm basically using my TV to display content from external hardware; the tuner, WiFi, Bluetooth etc. will all be dormant its entire life.
I watch 576i DVDs on a 24" 1366x768 TV and I don’t mind because I sit reasonably far.
Simply incorrect. In some circumstances, sure, 1080p is sufficient, but if the TV is big, close, or both, then 4K is a definite and noticeable improvement.
4k looks sharper as long as the actual content is real 4k, even from afar.
So, completely correct, since the point you're trying to make is exactly what the study focuses on (resolution per degree of viewing angle).
Well yes, a microscopic 4K display is no different from a 1080p one to our eyes.
But they're claiming it doesn't matter on TVs in the usual setting, which is just untrue.
Ok, but then 2k would usually do.
Yeah, tell that to my sister, who wants 4K for her laptop simply because she's heard 4K is better (four times 1080p!), and she's buying a 13-inch.
Small numbers are just not sufficient for some people. I know if I send this article to her, I'll be questioned: “why do you not want me to be happy?” So instead I just watch my nephews' college fund contribution shrink.
Sorry, this became a family-tech-guy rant.
Eh, let her have it for a few weeks, but show her how to switch the resolution.