❝With our new technology, a resolution density of 1,500 pixels per inch could be achieved on the same sized screen. This is especially attractive for virtual reality (VR) headsets or augmented reality technology, which must achieve high resolution in a small screen to look sharp when placed close to our eyes.❞
That quote has been making the rounds in VR communities everywhere, but what everybody fails to take into account is that turning the usual RGB subpixels into discrete pixels that can each display only one color at a time results in what is effectively temporal chromatic aberration (color breakup) during movement.
Ever looked through a Gear VR before? Chromatic aberration at the edge of the lenses is awful and distracting. Now imagine that covering the whole screen every time you move your head, in much the same way you get black smear during certain dark scenes. NOT GOOD.
If you've ever waved your hand in front of a single-chip DLP display that needs a color wheel to display the full range of colors, you know exactly what I'm talking about. This is bad for VR. It also means that the response-time advantage of those blue-phase LCD pixels is effectively cut to a third of what it could be, because every pixel has to cycle through red, green, and blue in sequence.
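To put rough numbers on that color breakup (these are my own illustrative assumptions, not figures from the article): say a 90 Hz panel that cycles red, green, and blue within each frame, a fairly ordinary 100°/s head turn, and roughly 15 pixels per degree of optics. A quick back-of-the-envelope sketch:

```python
# Rough estimate of field-sequential color breakup during a head turn.
# All numbers here are illustrative assumptions, not specs from the article.

def color_breakup_fringe(refresh_hz=90.0,        # assumed full-color frame rate
                         fields_per_frame=3,     # R, G, B shown one after another
                         head_speed_deg_s=100.0, # a brisk but ordinary head turn
                         pixels_per_degree=15.0): # roughly current-headset optics
    """Estimate how far apart the R/G/B images of an edge land
    when the eye tracks the scene during a head turn."""
    field_time_s = 1.0 / (refresh_hz * fields_per_frame)
    fringe_deg = head_speed_deg_s * field_time_s   # angular gap between color fields
    fringe_px = fringe_deg * pixels_per_degree     # same gap in screen pixels
    return field_time_s, fringe_deg, fringe_px

if __name__ == "__main__":
    t, deg, px = color_breakup_fringe()
    print(f"Color field time: {t * 1000:.2f} ms")
    print(f"Fringe separation: {deg:.2f} deg ≈ {px:.1f} px per field")
```

With those assumed numbers it comes out to several pixels of separation between the red, green, and blue copies of every edge, and that is across the whole field of view, not just at the lens edges.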
Now, the good news is that if they're serious about the response-time improvement, simply keeping the dedicated RGB subpixel layout may be enough to make LCD viable for VR from a low-persistence standpoint, and it would make for much-improved gaming monitors as well.
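For anyone wondering why low persistence matters so much here, a rough illustration (my own assumed numbers, nothing official): the smear you perceive while your eyes track the scene scales with how long each frame stays lit.

```python
# Quick persistence math with made-up but plausible numbers (not from the article).
# Perceived smear in pixels ≈ how long each frame stays lit × head speed × pixel density.

def motion_blur_px(persistence_ms, head_speed_deg_s=100.0, pixels_per_degree=15.0):
    """Rough smear width while the eye tracks a moving scene."""
    return (persistence_ms / 1000.0) * head_speed_deg_s * pixels_per_degree

# Full-persistence 90 Hz (~11.1 ms lit) vs. a ~2 ms low-persistence strobe.
for p in (11.1, 2.0):
    print(f"{p:>5.1f} ms persistence -> ~{motion_blur_px(p):.1f} px of smear")
```

So faster pixel transitions only pay off for VR if the image can be strobed briefly; hold each frame for the full refresh period and it smears regardless of how quick the transitions are.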
@NamelessPFG, I wonder if one could run adjacent pixels 1/3 of a cycle out of phase with their neighbors to minimize that. I.e., when pixel 0,0 is showing its red component, 1,0 is showing its green component, 2,0 is showing its blue component, and so on. The two problems I can see with that are: 1) thin objects of one color (and single-pixel points in a field of another color) will still visibly show the aberration, and 2) the added circuitry to allow this may make panels less feasible (and more prone to partially or fully dead pixels), at least with the initial panel circuit design that came to mind (maybe another switching scheme that hasn't occurred to me would work better; just ad hoc pondering as I write with a little wine in me at the moment 😛 ). 😕
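Just to sanity-check that first problem with a toy model (the phase pattern and channel order below are my own invention, not anything from the article or a real panel design): even with neighbouring columns staggered a third of a cycle apart, a one-pixel-wide line of a single color can only light up on the field where its own column happens to be showing that color, so it still flickers and smears at a 1/3 duty cycle.

```python
# Toy simulation of the staggered-phase idea; entirely hypothetical.

FIELDS = ("R", "G", "B")

def channel_shown(x, t):
    """Column x shows channel (t + x) mod 3, so neighbours are 1/3 cycle apart."""
    return FIELDS[(t + x) % 3]

def line_visibility(line_x, line_color="R", n_fields=9):
    """A 1-px-wide line of a single color only lights up when its column's
    current channel matches that color: 1 field in 3, whatever the phase offset."""
    return [channel_shown(line_x, t) == line_color for t in range(n_fields)]

if __name__ == "__main__":
    for t in range(3):
        print(f"t={t}:", " ".join(channel_shown(x, t) for x in range(6)))
    print("1-px red line at x=2 visible per field:", line_visibility(2))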