Unless you’ve been living under a technology rock (which I guess would just be a normal rock), you know that the Consumer Electronics Show is happening this week. And every time CES happens, we’re inundated with announcements of new TVs with fancy cutting-edge features: 3D, 4K, OLED, LMNOP, QWERTY, and even lasers.
This year, the buzz seems to be around 4K TVs. Also called Ultra HD, these are TVs capable of 3840×2160 resolution – four times as many pixels as 1080p. Let’s ignore for a second (or a year or two) that it’s almost impossible to actually get native 4K content – more pixels are always better.
Well of course, as with any technology, that depends on a wide range of variables. And of course, as with everything, the Internets have chosen to ignore these real-world variables and instead argue about one tiny detail. The argument centers around whether or not we as human beings can see a difference between 1080p and 4K.
You’re asking the wrong question.
How Well You See
The standard for visual acuity has, for many years, been that the smallest observable detail for someone with 20/20 vision is one arcminute – that equates to sixty pixels per degree. That 20/20 E on the eye chart? Each stroke is one arcminute wide, as are the gaps.
A study by the BBC re-affirmed this in 2004. Using this standard, the following chart shows the combinations of viewing distance and screen size where different resolutions are noticeable.
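That 60-ppd rule makes for an easy back-of-envelope check. The sketch below is my own illustration, not from either study: it computes how many pixels land in each degree of your visual field for a given screen size, resolution, and viewing distance, assuming a 16:9 panel and viewing from straight on.

```python
import math

def pixels_per_degree(diag_in, horiz_px, distance_in, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of visual angle
    for a flat screen viewed head-on."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # screen width from diagonal
    # Total horizontal angle the screen subtends, in degrees
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horiz_px / fov_deg

# A 50-inch 1080p set viewed from 9 feet (108 inches):
ppd = pixels_per_degree(50, 1920, 108)
print(round(ppd))  # comfortably above the 60-ppd threshold for 20/20 vision
```

At that size and distance, 1080p already packs more pixels per degree than a 20/20 eye can resolve – which is exactly why the distance/size chart matters more than the resolution number on the box.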
However, videophiles love citing a study from Japanese broadcaster NHK that supposedly proves our visual acuity can be much better than 60ppd – up to 200, in fact. Here’s a 200ppd chart from AVS Forum.
The problem: they’re both right.
Apples and Oranges
Let’s look a little closer at the NHK study that everyone’s yelling about. From the introductory paragraph:
Various aspects should be taken into account when determining UHDTV specifications. Of these, the authors believe that human factors, such as how we feel when viewing a video, are some of the most important aspects to be considered so that the system achieves the intended psychological effects. Several research projects have been carried out at NHK’s (Japan Broadcasting Corp.) laboratory, in accordance with this idea. They include dependence of sensation of presence on visual angle, both subjective and objective; required angular resolution based on resolution discrimination threshold and sense of realness; negative effects of widescreen video (e.g., motion sickness); and dynamic visual acuity when viewing a wide-angle video.
Now let’s look at where the previously established 60 ppd came from – the Snellen chart for measuring visual acuity:
Visual acuity (VA) is acuteness or clearness of vision, which is dependent on the sharpness of the retinal focus within the eye and the sensitivity of the interpretative faculty of the brain.
Notice anything? Perhaps that they’re measuring completely different things?
One is looking at sharpness of detail. One is looking at feelings of realness and presence. Not at all the same thing.
The NHK study has only a small section on differentiating image resolutions. In this section, the average visual acuity of the subjects was 20/10 – this should equate to 120ppd, or 60 cycles per degree (cpd). Here are the results of this section of the study:
…the limit values […] were 60 to 70 cpd. This corresponds closely to the average visual acuity (minimum separable acuity) of the participants.
Rather than disproving the previously established threshold for image resolution differentiation, NHK’s results are almost exactly in line with what we’d expect. So where’s the argument?
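The arithmetic behind that expectation is simple enough to write down. This is my own illustrative sketch, assuming one pixel per minimum angle of resolution and two pixels per cycle: a Snellen ratio converts directly to pixels per degree, and 20/10 vision lands right on NHK’s measured limit.

```python
def snellen_to_ppd(numerator, denominator):
    """Convert a Snellen ratio (e.g. 20/10) to pixels per degree,
    assuming one pixel per minimum angle of resolution (MAR)."""
    mar_arcmin = denominator / numerator  # 20/20 -> 1 arcmin, 20/10 -> 0.5
    return 60 / mar_arcmin                # 60 arcminutes in a degree

ppd_normal = snellen_to_ppd(20, 20)  # 60 ppd for 20/20 vision
ppd_sharp = snellen_to_ppd(20, 10)   # 120 ppd for the study's 20/10 subjects
cpd_sharp = ppd_sharp / 2            # 60 cycles per degree (one cycle = 2 pixels)
```

Sixty cycles per degree is precisely the lower end of the 60–70 cpd limit NHK reported – the numbers agree, they were just being argued past each other.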
Down the rabbit hole we go…
Hyperacuity and Our Crazy Brains
The NHK study was primarily about perception – a very different concept than vision, though clearly related. There comes a point where the physiology of our senses ends and psychology takes over. In vision, this is called hyperacuity.
The sharpness of our senses is defined by the finest detail we can discriminate. Visual acuity is measured by the smallest letters that can be distinguished on a chart and is governed by the anatomical spacing of the mosaic of sensory elements on the retina. Yet spatial distinctions can be made on a finer scale still: misalignment of borders can be detected with a precision up to 10 times better than visual acuity. This hyperacuity, transcending by far the size limits set by the retinal ‘pixels’, depends on sophisticated information processing in the brain.
So even though there is a limit to what we can see, there’s no real limit to how we see it. That’s the beauty of the brain. And that’s what NHK’s study was actually driving at: that we’re at a point in display technology where accuracy is almost moot, and we can now focus on perception. The bulk of the study is devoted to measuring feelings of presence, equilibrium reactions, and motion sickness: i.e., how to make it feel real.
The resolution is just a means to an end.
The Actual Results
NHK’s goal was to come up with standards for a new UHD format. And in August 2012, the ITU posted Rec. 2020 – a specification defining ultra high definition television. They’ve recommended a frame rate of 120fps – though a wide range is supported – and a color depth of 10 or 12 bits (most content today is 8-bit). Additionally, NHK found that people prefer to view 4K at a distance of 1.5x the screen height. In a standard American living room with a 9ft viewing distance, that means a 147-inch TV.
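That 147-inch figure falls straight out of the geometry. A quick sketch of my own, assuming a 16:9 screen: take the viewing distance, divide by 1.5 to get the preferred screen height, then convert height to diagonal.

```python
import math

def ideal_diagonal_in(viewing_distance_in, height_ratio=1.5, aspect=(16, 9)):
    """Screen diagonal (inches) whose height equals
    viewing_distance / height_ratio, per NHK's preferred-distance finding."""
    w, h = aspect
    screen_height = viewing_distance_in / height_ratio
    return screen_height * math.hypot(w, h) / h  # height -> diagonal for 16:9

print(round(ideal_diagonal_in(9 * 12)))  # 9-foot couch -> 147-inch screen
```

For comparison, a 60-inch set at the same 9-foot distance sits at roughly 3.7x its screen height – far outside the immersive range NHK was aiming for.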
The endgame of all this is to create an immersive, present, and realistic viewing experience. Fill the viewers’ field of view with a screen, fill that screen with images indistinguishable from reality, and you’ve got something that more closely resembles a window than an “idiot box.”
Back to Reality
But what does all this mean for the average Walmart shopper?
At the end of the day, there’s really no argument here. 4K will become ubiquitous, just like 3D before it, and it won’t even be a factor in the television purchasing decision. The age-old advice remains: buy the biggest and best TV you can comfortably afford.
It is, however, an interesting step forward in display technology. For the first time, images can strive not just for accuracy, but for reality. The question shouldn’t be whether or not we can see a difference.
Can we feel it?