
iPhone 6 Pixels


The first time you switch on an iPhone 6, you will be amazed at how clear the display is.  It looks even higher definition than the iPhone 5s, which is a pretty nice display itself.  So, given that the screen of the iPhone 6 looks so much better than the iPhone 5s, I wondered what was different and ran into the lab for a quick capture of the iPhone 6 screen to see whether any of the pixels had changed in size over the last little while.

With each iteration of the iPhone, Apple improves the screens to make them look substantially better, but what surprised me was that the resolution of the Retina Display, at least for the iPhone 4s, 5s, 5c, and 6, is identical.  I had previously measured and calculated the pixel size of the Retina Display, and for the standard iPhone 6, the pixel density is ~326 PPI. The same exact size…
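As a sanity check on that number, the pixel pitch follows directly from the pixel density. This little Python sketch is my own arithmetic, not a measurement from the article:

```python
# Pixel pitch from pixel density: 25.4 mm per inch divided by pixels per inch.
MM_PER_INCH = 25.4

def pixel_pitch_um(ppi: float) -> float:
    """Center-to-center pixel spacing in microns for a given pixel density."""
    return MM_PER_INCH / ppi * 1000

# A 326 PPI Retina display works out to a pitch of roughly 78 microns,
# identical across every handset listed above.
print(round(pixel_pitch_um(326), 1))
```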

So, why does this display look so good?  It turns out that what is different, as with the iPhone 5 vs. the iPhone 4, is the proximity of the pixels to the glass.  With the iPhone 6, the pixels appear to be almost one with the glass.  When the iPhone 5 came out, Apple bonded the display to the glass in an effort to get the pixels closer to the surface, and Apple appears to have made the pixels in the 6 closer still.  Some of what we are seeing with the iPhone 6 may be a polarizing filter underneath the glass, but even so, the glass appears thinner and required less focus adjustment on another microscope to get from the surface of the glass to the pixels.  I don’t know the precise distance in microns between the surface of the glass and the pixels, but it was shorter, as judged by rotation of the focus knob, on the iPhone 6 than on the iPhone 5.  What this accomplishes is making the display appear to be higher resolution.  The blacks are blacker, contrast is higher, and colors are more vibrant, even with the same OS.

Each display was imaged with the same settings on a stereomicroscope using a Canon 1D Mk III camera.  Magnification was held constant across all captures.  I used a Zeiss microscope with a Zeiss 25x Plan APO lens to qualitatively gauge, by focus knob rotation, the distance from the surface of the glass to the pixels.




I did find it interesting that Apple went with a different geometry for the subpixels in the iPhone 6.  This is a geometry that has been used in Apple Cinema Displays going back to at least 2007, as well as on the iPad a couple of years ago.  What this does in terms of image appearance, I am not sure.  It could simply be a different manufacturing approach, or there may be a psychophysical difference between subpixel geometries… I don’t know, but I would love to hear from an LCD engineer on their thoughts here.


iPhone 5c

Interestingly, the iPhone 5 series of displays appears to have more texture in the subpixels than the iPhone 4 Retina Displays.





The image at the top was made with the following camera/settings:

Camera: Canon Powershot S100
Exposure: 1/8
Aperture: f/6
Focal Length: 8.31 mm
ISO: 1,600

Categories: Gear.


29 Responses

  1. Hi Bryan. When you take a microscope picture of the screens like this, what’s displayed on the screen? Is it all white or do you use a darker shade?

    • Hey Ruben,

      I try to get a brighter area for the images, pulling up, say, the Mail app to give a nice, consistent white background.

      • Another question: On the photo of the 5c, the left part looks a lot more fuzzy and a bit darker than the rest of the image. Is that just something on top of the glass or is that something in the pixels?

        • The phone was probably not quite completely perpendicular to the imaging plane. Nothing wrong with the pixels… Just operator error putting the phone on the microscope stage.

  2. fascinating. you keep coming up with random, cool stuff to analyze. thanks for this.

  3. A couple of my microscopes have marks on the fine focus so that you can measure depth. One is marked in tenth millimeters, and the other in hundredths. One has a vernier to allow an additional ten times accuracy.

    Did you check for that? Stereo scopes don’t always have them.

    • Yeah, the stereo microscope I have does not have these hash marks on the focus knob. I could mark them, and then do a complete calibration curve, but time is kinda tight right now.

  4. The chevron subpixel geometry is the “dual domain pixels” that Schiller talked about.

    Brandon Chester, November 24, 2014 @ 11:24 am
    • Interesting… From an academic perspective, this is the more relevant link:

      • The “multiple domain” principle’s more general and quite a bit older than that paper, however. Multidomain vertical alignment (MVA) and patterned vertical alignment (PVA) displays are based on the same principle of having many (usually two) viewing cones pointing out of the screen, and they’re quite old. (I bought a not-exactly-top-of-the-line TV with a super-PVA panel in 2008. To my considerable frustration it broke in the winter of 2009, right after I’d settled in for a Christmas break spent playing New Super Mario Bros for the Wii. But I digress)

        In that case it was a way of improving the limited viewing angles of those technologies. Taking that principle and applying it to a screen technology that already has excellent viewing angles is a little mind-boggling.

  5. What other science and photography apps do you use on your iPhone?

    Next you can do the iMac Retina display screen.
    It is even more spectacular.
    The holy grail of screens will be when we can get rid of TFTs and just go for quantum dots controlled by MEMS, and get rid of the polarizer. Then again, current mobile screens are only 6 bits of color.

  6. The chevron shaped subpixels increase the viewing cone because the chevron halves have different optimal viewing angles that get averaged.

    But unrelated to that, you seem to imply that just a few microns less between the pixels and the glass surface cause a noticeable increase in contrast and black levels. How does that make sense? A few microns of glass would have a negligible effect on light transmission.

    Wouldn’t it make more sense to attribute the increase in contrast to other aspects? For example, a smaller black surface area between pixels allows a higher percentage of the display surface to illuminate, etc.

  7. “When the iPhone 5 came out, Apple bonded the display to the glass in an effort to get the pixels closer to the surface”

    It was the iPhone 4 that first had the LCD laminated to the front glass. Here’s a video of Big Bob Mansfield talking about it in the iPhone 4 introduction video:

  8. The iPhone 6 display looks better when wearing polarized sunglasses than the iPhone 5. I suspect the chevron pixels have something to do with it.

    • Hans: It’s a change they made to the polariser in the display. (LCD screens depend on polarised light to operate.) Apple patents attempting to solve the problem started appearing about four years ago; it looks like the handset winds up emitting circularly-polarised rather than plane-polarised light, so you don’t get the usual issues with the screen brightness varying with the phone angle.

      (You won’t find much about it in the tech press, because of a widely spread misreading of the web site that assumed the change was related to daylight readability. There are several amusing photographs attempting to demonstrate a change that doesn’t actually exist!)
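The sunglasses behavior this reply describes follows from Malus’s law. Here is a small illustrative Python sketch of my own (an idealized polariser, no screen-specific numbers from the thread):

```python
# Plane-polarised light through a linear polariser (polarised sunglasses)
# transmits I = I0 * cos^2(theta), so brightness swings with phone angle.
# Ideal circularly polarised light transmits a constant I0 / 2 at any angle.
import math

def plane_polarized_transmission(theta_deg: float) -> float:
    """Fraction of plane-polarised light passing a linear polariser at angle theta."""
    return math.cos(math.radians(theta_deg)) ** 2

def circular_polarized_transmission(theta_deg: float) -> float:
    """Circularly polarised light: half passes, regardless of polariser angle."""
    return 0.5

for theta in (0, 45, 90):
    print(theta, round(plane_polarized_transmission(theta), 2),
          circular_polarized_transmission(theta))
```

At 90 degrees the plane-polarised case goes fully dark (the familiar blacked-out screen when a phone is rotated behind polarised lenses), while the circular case stays at half brightness throughout.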

  9. Excellent analysis. I upgraded from an iPhone 4 to an iPhone 6 and was really surprised at how much better the display looked, knowing that the PPI wasn’t actually any higher.

    I use a photo of the Earth in space as my lock screen – with the much deeper blacks and the reduced gap between the pixels and the glass, it looks like the Earth is just sitting there on the surface of the display. It’s really an incredible effect, and a significant improvement over previous iPhone models.

  10. We have a measuring microscope at work that also includes thickness measurement. After work today, I put the two iDevices I had available (iPhone 6 and iPod touch 4th Gen) to the test and gathered the following measurements (focused first on the pixels, then on the top of the cover glass):

    1.3 mm iPod touch 4th Gen.
    0.8 mm iPhone 6

    Note that these are not absolute distances, but optical path lengths. As I know neither the refractive index of the cover glass nor its thickness, there’s no way of knowing what the actual distance is.
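For readers wanting to turn those focus-travel numbers into a rough estimate: treating the measured value as apparent depth, the real thickness is apparent depth times the refractive index. The sketch below is my own illustration with an assumed n ≈ 1.5 for generic cover glass; as the commenter rightly notes, the true index and thickness are unknown, and the stack also includes air or adhesive layers with different indices:

```python
# When refocusing a microscope from an object under glass to the glass surface,
# the focus travel measures the apparent depth. For viewing near-normal to the
# surface, real depth ≈ apparent depth * n.

def real_thickness_mm(apparent_depth_mm: float, n: float = 1.5) -> float:
    """Estimate true depth from focus-travel (apparent) depth, assuming index n."""
    return apparent_depth_mm * n

# Commenter's measurements, converted under the assumed n = 1.5:
print(real_thickness_mm(0.8))   # iPhone 6: ~1.2 mm pixel-to-surface
print(real_thickness_mm(1.3))   # iPod touch 4th Gen: ~1.95 mm
```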

  11. While I was at it, I also saved some pictures of both pixel structures:

    There seems to be some substructure to the chevron pixels in the iPhone 6.


Continuing the Discussion

  1. […] improve. Nothing done in software, but built directly into the screen. Retinal neuroscientist Bryan Jones put the iPhone 6 under the microscope and discovered which three tricks Apple uses to produce such a beautiful image. […]

  2. […] Bryan Jones on the iPhone 6 (via John Gruber): […]

  3. […] The first time you see an iPhone 6 or 6+ display, it looks considerably better than the previous generation iPhone (5S) despite having the same resolution. Bryan Jones, a photographer, explains it like this: […]

  4. […] iMore and Bryan Jones; editing, translation, and additions: […]

  5. […] the Apple Watch pixels look very different from the iPhone pixels.  This may be because the Apple Watch display is an AMOLED screen, I don’t know.  But what […]