Apple Retina Display

By now it seems that most people on the planet have heard of Apple’s latest iPhone, the iPhone 4, which was released today.  One of the many compelling features of the new phone is the Retina Display.  When Steve Jobs first invoked this term at WWDC, my eyebrows were raised.  Being a retinal scientist, I was immediately skeptical of just what he meant by “Retina Display”.  My mind raced and I wondered if it might involve some of the interesting technology I got to see on my last visit to one of Apple’s technology development labs.  I will not say anything about that visit, but this Retina Display, a super high resolution display, was new technology that I had not seen before.  Essentially it is an LED-backlit LCD with a *326* pixel per inch, 960×640 display (John Gruber of Daring Fireball called this resolution back in March) where each pixel measures a scant 78μm.  Though as you can see from these images of the displays I captured under a microscope, these pixels are not square.  Rather, they are rectangular: while the short axis is 78μm, the long axis on the iPhone 4 pixel is somewhere in the neighborhood of 102μm.

Update 07/23/10:  After discussion with some folks, including an LCD engineer, it has been pointed out that pixels are measured from center to center rather than edge to edge, so I have changed the scale bars to reflect new measurements with a micrometer.  Additionally, others have emailed me noting that if the black space surrounding the pixels is taken into account, the pixels are in fact square.  So, the 78μm measurement for the iPhone 4 is in fact 78μm from the center of one pixel to the center of the next.  Also, Ron Uebershaer sent in screenshots, included at the bottom of this post, that he made in MATLAB and which conceptually demonstrate that the pixels are in fact square.

I am including images below of the iPhone 1G, the iPhone 3G, the iPhone 4G and the iPad to give some perspective on pixel sizes.  The scale bar and my measurements are approximate, as I was having a tough time in the lab tonight finding an appropriate calibration.  Nevertheless, this should serve as a useful metric for examining the relative pixel sizes and for settling the question of whether Apple’s Retina Display is marketing speak and hyperbole or whether, in fact, Apple’s claims have merit.

 

As you can see from this image, the iPhone 1G pixels (each composed of a red, green and blue sub-pixel) measure approximately 150μm x 500μm.  Also note the blurriness of the image.  This was optimally focused, but the LCD panel itself sits behind a non-bonded pane of glass with a touch sensor on it, leading to some image degradation.

 

As with the 1G iPhone, the iPhone 3G pixels are essentially the same size, though with a different contact location.  Again, these pixels measure approximately 150μm x 150μm, and this LCD display has the same blurring issues present in the iPhone 1G.

 

This image of the iPhone 4G LCD is made at the same magnification as the 1G and 3G iPhones, illustrating the substantially smaller pixel size in the iPhone 4G.  These pixels are remarkably small and, if you look carefully, appear to be composites themselves, with each sub-pixel composed of its own sub-elements.  I am not sure about this, however, and it may simply be an artifact of the construction.  Also note that there is very little distortion in the pixel images, as the iPhone 4G has a bonded glass cover, eliminating the space between the LCD panel and the touch-sensitive glass surface.

iPhone 1G: ~150μm x 150μm

iPhone 3G: ~150μm x 150μm

iPhone 4G: ~78μm x 78μm
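
The advertised density and the measured pitch are consistent; as a quick sanity check (my own sketch, not part of the measurements above), the 326 pixels-per-inch figure implies the ~78μm center-to-center pitch directly:

```python
# Convert Apple's advertised pixel density into a center-to-center pixel pitch.
MM_PER_INCH = 25.4
UM_PER_MM = 1000.0

ppi = 326  # advertised iPhone 4 density, pixels per inch
pitch_um = MM_PER_INCH / ppi * UM_PER_MM

print(f"pixel pitch: {pitch_um:.1f} um")  # pixel pitch: 77.9 um
```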

So… the claim from Steve was that this display has pixels that match the resolving power of the human retina.  Now, fan of Apple that I am, this struck me as perhaps a bit hyperbolic, so I figured I’d do some quick calculations to see where this claim fell.  Apparently I am not the first Ph.D. to wonder, as another came out calling Mr. Jobs’s bluff.  Here is the deal though… While Dr. Soneira was partially correct with respect to the retina, Apple’s Retina Display adequately represents the resolution at which images fall upon our retina.

Essentially, this is a claim of visual acuity, which is the ability of the visual system to resolve fine detail.  There are an awful lot of considerations to take into account when making such a claim: contrast, viewing distance, the resolution of the display, and some metric of pixel size that gives you an estimate of visual resolution on the retina.  Claims of contrast ratios are notoriously flexible across displays and are influenced by a number of optical factors as well as the content being viewed, the black and color levels of the pixels, and overall luminance.  Apple claims an 800:1 contrast ratio; I’ll take them at their word on that and focus on the claims of resolution here.

A “normal” human eye is considered to have standard visual acuity or 20/20 vision.  This means that a 20/20 eye can discriminate two lines or two pixels separated by 1 arcminute (1/60 degree).

The ability of an optical system to resolve fine detail requires finely spaced optical detectors.  In the retina, these detectors are the photoreceptors.  Objects we look at are projected through the cornea and lens and imaged on the back of the eye, on a plane that ideally lines up with the retinal photoreceptors.

Theoretically, the limit of retinal resolution (say, the ability to distinguish patterns of alternating black and white lines) is approximately 120 pixels/degree in an optimal, healthy eye with no optical abnormalities.  Again, this corresponds to one minute of arc per cycle, or 0.000291 radians (π/(60*180)).  If one assumes that the nominal focal length of the eye is approximately 16mm, an optimal distance for viewing detail might be around 12 inches from the eye, which is a reasonable viewing distance for someone examining detail on their iPhone.
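
The unit conversions in this paragraph can be checked in a couple of lines (a sketch; 120 pixels/degree is the figure from the text, and two pixels, one light and one dark, make up each cycle):

```python
import math

# One arcminute in radians: 1/60 of a degree, i.e. pi / (60 * 180).
arcmin_rad = math.radians(1 / 60)

# 120 alternating pixels per degree is 60 light/dark cycles per degree...
pixels_per_degree = 120
cycles_per_degree = pixels_per_degree / 2

# ...which is exactly one cycle per arcminute (60 arcminutes per degree).
arcmin_per_cycle = 60 / cycles_per_degree

print(f"{arcmin_rad:.6f} rad, {arcmin_per_cycle} arcmin/cycle")  # 0.000291 rad, 1.0 arcmin/cycle
```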

Dr. Soneira’s claims are based upon a retinal calculation of 0.5 arcminutes, which to my reading of the literature is too low.  According to a relatively recent but authoritative study of photoreceptor density in the human retina (Curcio, C.A., K.R. Sloan, R.E. Kalina and A.E. Hendrickson 1990 Human photoreceptor topography. J. Comp. Neurol. 292:497-523.), peak cone density in the human retina averages 199,000 cones/mm² with a range of 100,000 to 324,000.  Dr. Curcio et al. calculated 77 cycles/degree, or 0.78 arcminutes/cycle, of *retinal* resolution.  However, this does not take into account the optics of the system, which degrade image quality somewhat, giving a commonly accepted resolution of 1 arcminute/cycle.  So, if a normal human eye can discriminate two points separated by 1 arcminute/cycle at a distance of a foot, we should be able to discriminate two points 89 micrometers apart, which works out to about 287 pixels per inch.  Since the iPhone 4G display is comfortably higher than that measure at 326 pixels per inch, I find that Apple’s claims stand up to what the human eye can perceive.
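
These numbers follow from small-angle geometry; here is a minimal sketch of the calculation, assuming the 1 arcminute/cycle resolution and 12-inch viewing distance discussed above:

```python
import math

MM_PER_INCH = 25.4
viewing_distance_mm = 12 * MM_PER_INCH  # 12 inches, in millimeters
arcmin_rad = math.radians(1 / 60)       # 1 arcminute, in radians

# Small-angle approximation: separation = distance * angle.
min_sep_mm = viewing_distance_mm * arcmin_rad
min_sep_um = min_sep_mm * 1000          # smallest resolvable separation, ~89 um

# Pixel density at which adjacent pixels sit right at that limit.
ppi_limit = MM_PER_INCH / min_sep_mm    # ~286-287 pixels per inch, depending on rounding

print(f"{min_sep_um:.0f} um -> {ppi_limit:.0f} ppi")
```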

 

For reference, I am also including an image of the iPad LCD taken at the same magnification as the iPhone images above.  As you can see, the pixel size is actually much larger, and the sub-pixels are herringbone-shaped, which is not uncommon in high-quality desktop displays like, say, the Apple Cinema Display line.

 

 

Update 03/02/11:  Carles Mitjá has an entry with a proceedings citation highlighting image quality expectancy here.  He has three beautiful images of a MacBook Pro 15″ display, an iPhone 4 display and a very interesting 24″ Apple Cinema Display.

Update 08/24/12:  Looks like this article has resulted in my being quoted in the NYTimes for an article on choosing computer displays.

Update 12/15/13: Linked from an NBC News article on 4k televisions, Enough pixels already! TVs, tablets, phones surpass limits of human vision, experts say.

 

164 Replies to “Apple Retina Display”

  1. Absolutely fascinating – Thanks for taking the time to both share your thoughts and putting them into a perspective sub-scientists can grasp.

    I’d like to see you give similar treatment to high-end sensors sometime – I bet it’d be equally intriguing and educational too.

  2. Nice! Thanks for the awesome insight into the retinal scientist’s perspective on this. I was curious what you’d think about the screen on the iPhone 4 when it was released, and I knew you’d have one. ;)

  3. When I read the previous article by Dr. Soneira I was curious as to your thoughts. Thanks for this detailed, expert insight! I would expect nothing less from your analytical mind.

  4. Scientists can’t help but argue over claims… :)

    With best intentions: “So, if a normal human eye can discriminate two points separated by 1 arcminute/cycle at a distance of a foot” is a schoolbook approximation, intended for students to remember the order of magnitude of this property without struggle. Obvious hint: if it were true, it would just happen that at one unit of distance measure (which we invented) a resolution (evolved by Nature) is defined by another unit of angle measurement (which we also independently invented)… Not likely.

    To the substance: “However, this does not take into account the optics of the system which degrade image quality” – Nature is not wasteful during the evolutionary process. Our optics (in a healthy person) are better than our sensor array. Though the two are very close, “better” means that evolution would degrade our sensors (just as flightless birds lose the ability to fly over generations when it is not needed) if they were collecting information the optics couldn’t resolve (and vice versa). Nature is an incredible engineer.

    So, while 0.5 arcminute/cycle at one foot is still an approximation, it is much closer to the real number and ability, which makes for at least a factor of two between Apple’s claim and the properties of the human eye.

  5. hi, I would like to add (and maybe you can shed insight) that it really isn’t the pixel dimension that is important for deciding this topic, but the distance between the pixels. That distance is much more important (only in the sense of someone trying to determine if they can see a pixel with their super-human eye)… not in how great the display is, but strictly for this topic.

    two pixels side by side would not show up in your vision if the distance between them is so small that you cannot resolve it… and that distance between pixels is even smaller than the measurements being used.

    in other words, if a person put on glasses that gave them 20/10 vision, then according to your calcs they should see the pixels… i’m assuming you had a better microscope than this :0)

    and I’m contending that a person with glasses that gave them 20/10 vision, at 12″ away, would still have a very, very tough time seeing a jaggy edge, or the jaggies in a photo.

    that seems like a simple test, if you can get a person calibrated correctly.

  6. Doesn’t this completely ignore the ability of the brain to perceive higher detail by coalescing data as the eye shifts slightly on a continual basis?

    While the retinal calculations above would be true if our eye took photographs, this is only a part of the system as other researchers have shown. Actual acuity of vision is substantially higher in some cases.

    Also notable is that these fixed RGB pixel layouts are not optimal for conveying image data to the eye because of difference in sensitivity to the component colours, but that isn’t a resolution issue per se.

  7. I am very curious about this but no one has ever been able to tell me the answer. But I bet that you can. In the iPad photo above, the light from blue pixels bleeds wider than the object in a kind of halo effect. Why does that happen and why doesn’t the red or green light do that?

  8. A very interesting article.
    A minor point, though: I am a bit confused by your statement of the pixels in the iPhone 4 not being square: The area covered by the red, green and blue diodes (horizontal dimension of any diode and distance from top edge of green to bottom edge of blue diode) are not square. However, isn’t the pixel comprised of the three diodes and the empty space between them (the rectangle spanning from the left/upper edge of a diode set to the next)?

  9. Am I confused, or are you trying to define the resolution in terms of the emissive area of a pixel rather than the distance between the discrete elements? The pixels on all these devices are square – it’s just that not all the pixel area emits light.

    I’m also a bit confused by your scale – by my measure, using the 180 micron line, the pixels are about 120 microns square. That would make the long edge of an iPhone 4 screen 4.5″, which it isn’t (it’s about 2.9″). The top left of a green subpixel to the top left of the next should be about 77 microns (both horizontally and vertically) in order to fit a 960×640 screen into the iPhone’s form factor. Are you sure of your measurements? If so, Apple have lied about the resolution, and I’d have thought someone would have noticed by now.

    There’s a lot of maths about the extinction of resolution. I claim I can see the pixels on my old 310ppi phone, I can see the pixels on a 360dpi inkjet print-out, and I therefore expect to see the pixels on a Retina Display.

    Sorry if this sounds aggressive – I just don’t see how the numbers add up.

  10. How do you square this with claims that humans can easily distinguish 1200dpi from 2400dpi images [http://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=00002G&topic_id=1&topic=Ask+E.T.]?

  11. Thanks for the comparisons and insight. I’m especially intrigued by the iPad’s herringbone pattern, and wondering why the green pixels have a uniform shape/clarity, while the red seems a bit fuzzier, and the blue is blurry. Plus blue and red have the extra notch at the top.

    Is the blurriness just a fluke of the imaging equipment, or is it intended by the manufacturer to soften the blues?

  12. Empirically, it’s easy enough to display a pattern of alternating pixels, and see whether your retina can pick them out. Here’s one such pattern:
    http://quezi.com/12935

    I don’t have an iPhone, but the pixels are just at the limit of my vision when viewed on my 267 pixels per inch Nokia N900, so I’m fairly sure I wouldn’t see the pixels on the iPhone 4 display.

  13. ————————
    So, while 0.5 arcminute/cycle at one foot is still approximation, it is much closer to the real number and ability, which makes at least factor of two in Apple claim vs. properties of human eye
    —————————-

    uhh, someone obviously hasn’t seen an iPhone 4… if this were true you would be able to see the pixels from 12″ (assuming you have very good vision)

    why is it that you can’t if the visual arc was .5? a simple test would have told you this was an incorrect number in real life.

    people have taken close up photos of the actual screen in action with micro lenses from 3″ away… with glasses that correct my vision to much better than 20/20, I could barely discern the pixels, but i could.

    this number of .5 arcminutes is bogus in real-world applications, where the LCD screen is using anti-aliasing to vary the contrast between pixels, and where the eye’s photoreceptors sit in a jumbled pattern that does not line up with the gridded spaces between pixels.

    it is just a plain bogus number, .5, and a simple test with the actual equipment (the actual screen) shows this.

  14. “…peak cone density in the human averages 199,000 cones/mm2”

    This is incorrect. If you are going to nitpick you might as well be correct. This number refers only to cone outer segments. The Curcio et al. paper from 1990 makes the same mistake.

  15. Surely if the limit of ocular resolution is 1 _cycle_ per arc-minute, then you need 2 _pixels_ per arc-minute to reach this limit, since there must be an on and an off pixel in each cycle. This suggests that Dr. Soneira’s claims are right.

  16. ———-
    by my measure, using the 180 micron line, the pixels are about 120 microns square
    ————-

    to answer your question: by your measure, you are wrong (obviously, since it doesn’t add up)

    so from that point on it is hard to believe any of the rest, including the ability to see a 310 ppi display, when you can’t tell the difference between the size of the pixel shown and the 180μm line, which is right on top of it.

  17. Today I learned that the pixels in the iPad aren’t rectangular.

    What did you use to take these shots, Bryan?

    -jcr

  18. If you look at text on an iPhone 4, it looks just like text in a magazine. It is amazing.

    People are going to be clamouring for this on… slates and medium to high-end laptops.

  19. It seems to me that if you defined a pixel as a set of three glowing color elements _and_ the black areas between them, the pixels would be very close to square, with about 200 such squares per inch.

    I thought the 1G iPhone display had amazing detail, so twice as good as that….

  20. Great job breaking this down! The high quality closeups are amazing! (Reading this on an iPhone 4, btw, and it looks very crisp!)

  21. It’s also of note that there’s a point where you go cross eyed. :)

    Even if my eyes could see the pixels under 6-7 inches from my eye, it’s impossible for me to focus.

  22. A small technical note… the latest iPhone is called the iPhone 4, not the iPhone 4G. It is not a 4G phone. Also the original iPhone was simply called the iPhone, not the iPhone 1G. It was, in fact, a 2.5G phone. 1G phones were analog.

  23. Excellent science, thank you!

    One note: the name of Apple’s product is the iPhone 4, not the iPhone 4G. It is often misspelled as the latter, leading many to believe it can move data at 4G network speeds. It is not advertised as being able to do so.

  24. Thanks for the very sharp images and thoughtful analysis!

    I assume it’s correct that the pixels, whatever inter-spacing you may find, are laid out exactly on a square grid. (Otherwise, programmers could have fits in trying to draw correctly-proportioned images, as happened in the bad old days of Win 3.)

    As to the iPad-v-iPhone comparison, all the Apple ads show the iPad used on a *laptop*. (Slight irony there.) For me, that’s maybe 20-24 inches away, so at the same magnification pixels would look half as big. Could we see how resolved pixels would look at equivalent resolution by the eye?

    I also wonder whether your talents might be well applied in contrasting the fine detail of other smartphones. Several blogs have commented on the “fuzzy text” of the hi-rez Nexus, apparently from its reduced sub-pixel count; Dr. Soneira also opined that software drivers fuzzed the images to reduce moiré. I would think that iPhone4 vs DroidX is a more likely comparison than vs iPhone1, in terms of consumers’ concerns.

  25. Fascinating indeed. Thank you for taking the time to do this, Very nice pictures, I have wondered how the subpixels were arranged. Also the blurriness of earlier models really showed.

    But being able to look at these pictures makes me wonder exactly how you come up with your numbers. Looking at the Retina pictures, they look square enough to me.

    I tried the best I could by measuring and doing the math, and found they (the iPhone 4 pixels) are on average 59.6×57.3 (WxH) pixels of your image. Your 180 um scale is 87 pixels wide. So the Retina pixels actually are 123×119 um. That should be the size (give or take) of a complete Retina pixel, including of course the black interpixel spacing.

    You could ignore the spacing when talking about the pixel construction (which is interesting enough), but not when comparing to the resolution of the eye.

    Also I wonder how you came up with the 500 um pixel height for the older displays. I get 237 um. I’m not saying you’re wrong here, I just wonder why my results differ from yours.

    Nevertheless, I still think it’s a very interesting article. Thanks again!

  26. honkj:

    I wrote:
    ———-
    by my measure, using the 180 micron line, the pixels are about 120 microns square
    ————-

    You wrote:
    ————-
    to answer your question, by your measure you are wrong, (obviously since it doesn’t add up)
    ————-

    Well, *something* is wrong. Looking at the bottom iPhone 4G image, the “180um” line is immediately above about 1.5 green pixels, measured from the bottom right corner of one to the middle of the bottom of the next. (I measured more accurately by scaling everything in an image editor, but it’s pretty close to 1.5.) That’s 120 microns along a pixel edge (some of which is not illuminated). 960 pixels is therefore 115.2mm, or 4.5in. The iPhone’s screen has a 3.5in diagonal, and is therefore not 4.5in along an edge. Either the image isn’t from an iPhone or the “180um” line is incorrect. I assume it’s an honest mistake, and it’s the latter – the 78x102um measurements seem to tally with this alleged 180um line, assuming the black space between pixels is (for reasons I consider incorrect) excluded.

    Put another way, let’s go with the claimed 102um pixel size, as the longer dimension. A 3.5in iPhone screen is 2.91in by 1.94in, or 74.0mm x 49.3mm. 102um x 960 pixels = 97mm, 102um x 640 pixels = 65mm, both of which are bigger than the possible screen.

    The quoted figures aren’t used when comparing the 326ppi official iPhone figure with visual acuity, so that conclusion may still be correct. However, the numbers, and the labelling on the image, cannot be. Sorry, just trying to help.

    ———-
    so from that point on it is hard to believe any of the rest, including the ability to see a 310 ppi display, when you can’t tell the difference between the size of the pixel shown and the 180 sized line, which is right on top of it.
    ———-

    It’s possible that my comfort zone for viewing a 3in WVGA screen is less than 12in – in other words, I would look closely enough at an iPhone screen to be able to see the pixels, even if you can’t see them from 12in away – I just do hold a small screen that close. However, in the absence of a good test case, I can definitely see 310ppi lines at a viewing distance of at least 12in, even if it’s not comfortable. I’m not making assertions based on alleged measurements or making claims for or against the Retina display – just reporting what I can see. It’s been a long time since I looked at ink-jet printout, so I can’t vouch for how far away I had to hold it.

    My vision is pretty poor (which is why I hold a phone close), but many people with poor distance vision have much better vision at short range, so the 20/20 get-out clause is a bit simplistic.

    Honestly, I’m looking forward to the Retina Display appearing in an Android phone (eventually). I like the display, I’m a great fan of resolution, I happen not to be a fan of the iPhone from a developer’s perspective. Apple (and LG) are to be commended for producing it, but making preposterous claims about it seemed unnecessary. It’s a big jump over an original iPhone display, but it’s only an incremental improvement over the WVGA 3-4in Windows Mobile and Android devices that have been out for years – it’s a nice feature, but it’s not magic.

  27. Note that if you take the three sub-pixels together with the black “dead” row next to them, they do roughly make up a square pixel. This is why computers are referred to as having square pixels (whereas, I believe, in video, pixels can be based on rectangles).

  28. I apologize if you’ve answered this above, but does it make a difference that we’re talking about emitted light rather than reflected light? It seems like the ability to discriminate individual pixels would also be largely affected by the brightness of the individual pixel: the more brightness, the more likely it would be to mix with the light from the pixels around it?

  29. Great article! I’m not a Retina Neuroscientist, but I am a Cataract and Refractive Surgeon—and I love the new iPhone 4. The Retina Display is gorgeous, and the only negative is that now I’m not liking my iPad and MacBook Pro displays as much!

    However, if I look closely, I am able to see the pixelation in the “Retina Display.” I think this is explained by the fact that many healthy eyes are actually able to see 20/15 and even 20/10; i.e. up to twice the resolution that you used in your calculation. Therefore, to exceed the resolution of the human eye (cornea, lens, retina, and all!), a screen would need 287 x 2 = 574 pixels per inch.

  30. It’s the iPhone 4 … not 4G. But don’t get me wrong, I blame Apple for not having a consistent naming scheme for things. I have so many friends who call the “Touch” the “iTouch” … In my opinion, it’s a failure in Apple’s marketing. If you go to google and type “iphone” followed by a space, you’ll see “iphone 4g” as the first suggestion, so it’s apparently an epidemic.

    Anyway, I loved the technical information on the screen. I’m still waiting for my iPhone 4 to be shipped. For some reason I wasn’t able to order it on the 15th of June … :)
