Disclaimer - before I continue down this rabbit hole, I want to emphasize that it is largely academic. If you buy a new TV these days, whether for the more clearly valid reasons of getting a larger screen, getting the benefits of HDR and wide color gamuts, or getting a model with better black levels or brightness than your current one, it is going to be 4K unless it is a relatively small screen. Of the 36 TVs Best Buy currently has for sale in the 50 inch size class, only 1 is not 4K.
On the other hand, I would not buy a new TV specifically to get 4K, nor would I spend extra money on 4K content unless the TV is very big or the seating is very close.
Sure. And to do so, let’s all forget Jurassic Park. That was a quick anecdote, but the fact is it was a visual spectacle even at DVD resolution, and while it did look better at 1080, which helped my enjoyment to a degree, the returns diminish as resolution increases further.
If we’re going to be scientific about TV purchasing decisions, it is probably worth having recourse to scientific sources rather than marketing sources. The marketers who plan Best Buy’s and Costco’s displays are very happy to have people inspect 6 foot wide TVs from 3 feet away. You’re not supposed to think about the fact that your couch is 3+ times that far away, and the pixels are 3 times harder to see. The physiologists, on the other hand, have published their own work putting the resolution of the human eye at around 60 cycles per degree, and when you do the geometry on that, you get the charts in the RTings link I posted above.
https://www.cis.rit.edu/people/faculty/montag/vandplite/pages/chap_9/ch9p1.html
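If you want to run that geometry yourself, here is a rough sketch in Python. The helper name, the 16:9 aspect ratio, and the acuity limit are my own assumptions, not anything taken from the RTings or RIT pages: 60 pixels per degree corresponds to 1 arcminute per pixel (roughly 20/20 acuity), while the ~60 cycles per degree physiological ceiling would be about double that, since a cycle needs two pixels.

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_px, ppd_limit=60, aspect=(16, 9)):
    """Distance beyond which a single pixel subtends less than the acuity limit.

    ppd_limit=60 is 1 arcminute per pixel (about 20/20 vision); the ~60
    cycles/degree physiological ceiling would be roughly 120 pixels/degree.
    """
    w, h = aspect
    pixel_in = (diagonal_in * w / math.hypot(w, h)) / horizontal_px
    distance_in = pixel_in / math.tan(math.radians(1.0 / ppd_limit))
    return distance_in / 12.0

print(max_useful_distance_ft(75, 1920))  # ~9.8 ft: past this, 1080 on a 75 in screen already out-resolves 20/20 vision
print(max_useful_distance_ft(75, 3840))  # ~4.9 ft: inside this, 4K pixels are still individually resolvable
```

Those numbers line up reasonably well with the chart: on a 75 inch screen, 4K only starts to matter once you are sitting well inside 10 feet.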
For a more personal scientific test, which I have also done, take high quality still images, resize them, and view them from distances proportionate to screen size and resolution. If you correct for all of those factors, you can try this out on a regular computer monitor.
For example, I know that if I view a 1080 image on my 24 inch, 1080 monitor from about 3 feet, it is roughly comparable to viewing it on a 75 inch TV at 10 feet. It is very easy to compare DVD (480) and HD (1080) content this way by resizing a 1080 image down to 480 and comparing it to the original. Yes, you can see an easily discernible difference in this ideal circumstance, much more easily than with moving video, and you can be certain the source content is the same. Depending on your image editor (I use GIMP), you can also do what real TVs do and re-upscale the 480 image back to 1080, reducing the perception of pixelation. You can even sharpen it, improving the edge acuity. Neither upscaling nor sharpening recovers the missing detail that the original 1080 image has, but they can reduce the perceived visual deficit.
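For what it’s worth, the same comparison can be scripted instead of done by hand in GIMP. This is just a sketch using the Pillow library rather than GIMP; the file name, the 480p target size, and the unsharp mask settings are placeholders to tune yourself.

```python
# Scripted version of the downscale / re-upscale / sharpen comparison.
# Requires Pillow (pip install Pillow). "master_1080.png" stands in for any
# high quality 1920x1080 still you want to test with.
from PIL import Image, ImageFilter

master = Image.open("master_1080.png")                # pristine 1080 source

dvd = master.resize((854, 480), Image.LANCZOS)        # throw away detail, roughly DVD resolution
upscaled = dvd.resize(master.size, Image.BICUBIC)     # what a TV does: scale back up to panel resolution
sharpened = upscaled.filter(                          # fake some edge acuity back in
    ImageFilter.UnsharpMask(radius=2, percent=120))

master.save("a_original_1080.png")
upscaled.save("b_480_upscaled.png")
sharpened.save("c_480_upscaled_sharpened.png")
# Flip between the three files at the viewing distance that matches your TV:
# upscaling and sharpening hide pixelation, but the detail never comes back.
```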
You can even simulate, if not the field of view, at least the pixel angle of a 4K screen by doubling the distance and again using appropriately scaled copies of the same image (e.g., 1080 and 540 images at 6 feet on my 24 inch monitor have equivalent resolution to 4K and 1080 images on a 75 inch TV at 10 feet). Of course, you can do this on a TV, too. It’s just not as easy to control the viewing conditions and switch back and forth.
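Here is a quick way to sanity-check those equivalences, again a sketch with my own assumed 16:9 geometry: compute how many pixels fall into one degree of your field of view for each screen and distance.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Angular pixel density for a flat screen viewed on-axis."""
    w, h = aspect
    screen_width_in = diagonal_in * w / math.hypot(w, h)
    pixel_in = screen_width_in / horizontal_px
    deg_per_pixel = math.degrees(math.atan(pixel_in / distance_in))
    return 1.0 / deg_per_pixel

print(pixels_per_degree(24, 1920, 36))   # 1080 image on a 24 in monitor at 3 ft  -> ~58 ppd
print(pixels_per_degree(24, 960, 72))    # 540 image on that monitor at 6 ft      -> ~58 ppd
print(pixels_per_degree(75, 1920, 120))  # 1080 on a 75 in TV at 10 ft            -> ~62 ppd
print(pixels_per_degree(24, 1920, 72))   # 1080 image on the monitor at 6 ft      -> ~115 ppd
print(pixels_per_degree(75, 3840, 120))  # 4K on a 75 in TV at 10 ft              -> ~123 ppd
```

The monitor setups land within about 10% of the corresponding 75 inch TV-at-10-feet numbers, which is close enough for this kind of eyeball test.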
To get to the point, what this test shows to my 20/20 vision is that the RTings chart is effectively correct. I will clarify, however, that the perception of edge acuity can slightly exceed resolution. Comparing a 4K TV to a 1080 TV sized and positioned for a common 30 degree field of view won’t reveal more detail in the content, although there still might be a slight perception, under ideal conditions, of better sharpness in certain features, especially text.
Enlarge the screen or move closer to replicate a 40 degree viewing angle per the THX recommendation, on the other hand, and the difference between 1080 and 4K content becomes more readily discernible.
But how many people actually have their living rooms set up like a THX theater? That would mean that if your couch is 10 feet from the wall, you have a 100 inch screen.
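That 100 inch figure isn’t pulled out of thin air, by the way. Here’s the quick check, assuming a 16:9 screen and measuring the 40 degree angle across its width:

```python
import math

# Width of a screen that fills a 40 degree horizontal viewing angle from 10 feet,
# and the 16:9 diagonal that width implies.
distance_in = 10 * 12
width_in = 2 * distance_in * math.tan(math.radians(40 / 2))  # ~87 inches wide
diagonal_in = width_in * math.hypot(16, 9) / 16              # ~100 inch diagonal
print(round(width_in), round(diagonal_in))                   # 87 100
```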