Forum Discussion
davidwyandgg
13 years ago · Explorer
Correcting for Chromatic Aberration
Greetings Rift Peeps!
As you move your eye beyond the center of the Rift's lens you can see the telltale fringe of chromatic aberration. It actually starts to appear a lot sooner than I would have thought or like.
Is there anything that can be done to correct for the distortion? Some sort of colour channel scaling performed before, during, or after the barrel distortion? Or is this just something we'll have to live with?
Thanks!
- Dave
17 Replies
- cybereality (Grand Champion): Yeah, this is totally something that can be compensated for, and it's an area of research for us at Oculus. I can't really say if we'll have a solution to this, so feel free to investigate your own solutions if need be.
- palmertech (Explorer): Like cybereality says, it is something we are researching and would like to integrate into the SDK. Valve has experimental chromatic aberration correction in the Team Fortress 2 VR mode.
- KBK (Protege): It may also shift according to the rate of motion, i.e. whether a given pixel should be R-, G-, or B-dominant, and what level it should take on, for the time it is part of a 'frame'.
- edzieba (Honored Guest): The pixels and lenses themselves are stationary with respect to each other, so the aberration should not vary.
- IGameArt (Protege): Hmm, it seems to me like you should be able to shift the colors in the periphery in the opposing direction, so that when the aberration occurs it shifts the colors back into the right positions. This could be what we've been seeing in the TF2 screenshots.
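The opposite-direction shift described above can be sketched as a per-channel radial scale of the sample coordinates about the lens centre. The scale factors below are made-up illustrative values, not measured Rift numbers; real factors would have to come from the lens's actual dispersion:

```python
import numpy as np

def prescale_channels(uv, scale_r=0.996, scale_g=1.0, scale_b=1.004):
    """Radially scale each colour channel's sample coordinate about the
    lens centre, so the lens's dispersion pulls them back into alignment.
    uv: (N, 2) array of view coordinates centred on the lens axis.
    Returns per-channel sample coordinates (r_uv, g_uv, b_uv)."""
    return uv * scale_r, uv * scale_g, uv * scale_b

# A point toward the edge of the view: red is sampled slightly inward
# and blue slightly outward, countering the lens spreading blue more
# than red at the periphery.
uv = np.array([[0.5, 0.5]])
r_uv, g_uv, b_uv = prescale_channels(uv)
```

A single global scale per channel is the crudest possible model; it only approximates the true radially varying aberration near the centre of the lens.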
- davidwyandgg (Explorer): That is the first approach that I will try when I have some time. I was considering making use of the distortion coefficients/equation to help determine the amount of colour shift.
  - Dave
- Robert (Honored Guest): To understand this problem better, it would be great if Oculus published the schematics of the Rift's optical path as well as the properties of the glass used for the lenses. Does anyone know if the Rift uses an achromat for the main lens? Or would that be too expensive for a wide field of view?
- KBK (Protege):
"edzieba" wrote:
The pixels and lenses themselves are stationary w.r.t. each other, so the aberration should not vary.
I was speaking with respect to persistence of vision vs pixel draw/hold rise and fall time.
The situation contains multiple factors: rate of turn, frame rate, pixel rise/fall time, the algorithms and numerical transforms in use and their order of stacking, and then the overall effectiveness of that stacked set in situ, etc.
The complete list of individual parameters that play into the perception and resolution of this situation probably number in the high teens, as an experienced guess.
Of course, one or perhaps as many as three components will carry the bulk of the blame and then be the bulk of the 'fix', but the others still play a part and should never be dismissed (or left unrecognized), as they will figure more and more prominently as the bigger issues are tackled.
- geekmaster (Protege):
"davidwyandgg" wrote:
Greetings Rift Peeps!
As you move your eye beyond the center of the Rift's lens you can see the telltale fringe of chromatic aberration. It actually starts to appear a lot sooner than I would have thought or like.
Is there anything that can be done to correct for the distortion? Some sort of colour channel scaling performed before, during, or after the barrel distortion? Or is this just something we'll have to live with?
Thanks!
- Dave
The "correct" way to perform chromatic aberration correction is to apply the classic "Brown's model" distortion correction formula to each color channel independently, using slightly different distortion coefficients for each channel.
The distortion correction formula used in the OculusVR SDK supports the K1 and K2 coefficients for radial distortion correction, but it omits the P coefficients needed for tangential distortion correction, which become much more important as the lens offset increases due to the difference between the viewer's IPD and the Rift's IPD (64mm).
To compensate for a viewing position that is NOT through the optical axis of the lens, it is important to add tangential correction (P1, P2, and P3 coefficients) to the distortion correction pre-warp used for the Rift Dev Kit. According to the literature, for adequate image quality this distortion correction should be applied to a 2x image, using different distortion coefficients for each color channel. The results are then downsampled to half size and cropped (with matching IPD offset) to form the Rift's left and right images.
Remember that tangential correction is asymmetrical, so each eye will need to be processed differently (perhaps using tangential distortion coefficients that have been mirrored on the vertical axis). Also remember that vertical offset (which some viewers may use to shift their view of the available vertical image) will change the offset angle needed for tangential correction.
There are alternative models for lens distortion (e.g. Zernike model), but Brown's model seems to be the most popular.
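As a sketch of the per-channel approach described above, the following applies Brown's model (radial K1/K2 terms plus tangential P1/P2 terms) to a view-space point once per color channel. The coefficients are hypothetical placeholders for illustration, not fitted Rift values:

```python
def brown_distort(x, y, k1, k2, p1, p2):
    """Brown's model: radial terms (k1, k2) plus tangential terms (p1, p2).
    (x, y) is a normalized point relative to the distortion centre."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# Hypothetical per-channel coefficients: slightly stronger radial
# distortion for blue than red models the lens bending blue light more.
CHANNELS = {
    "r": dict(k1=0.21, k2=0.23, p1=0.001, p2=0.0005),
    "g": dict(k1=0.22, k2=0.24, p1=0.001, p2=0.0005),
    "b": dict(k1=0.23, k2=0.25, p1=0.001, p2=0.0005),
}

def warp_point(x, y):
    """Pre-warp one view-space point, once per color channel."""
    return {c: brown_distort(x, y, **k) for c, k in CHANNELS.items()}

warped = warp_point(0.5, 0.5)
```

In a real renderer this runs per fragment in the pre-warp shader, sampling the scene texture at three slightly different coordinates; real coefficients must be fitted per lens and per channel.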
References:
WikiPedia: Distortion (optics)
Centi-pixel accurate real-time inverse distortion correction
Decentering Distortion of Lenses
Lens Distortion for Close Range Photogrammetry
Correction of Radially Asymmetric Lens Distortion with a Closed form Solution and Inverse Function
Algorithms for coplanar camera calibration
- jdewitt (Honored Guest): Nice post, geekmaster. I couldn't have said it better myself... very thorough!
I've run into this problem with a DIY stereoscopic viewer. For reference, this is what it looks like in my DIY thing: https://dl.dropboxusercontent.com/u/2709615/hmd/chromatic_aberration.jpg.
I've messed with OpenCV's distortion functions (which use Brown's model) to try to compensate for just radial distortion, with mixed results. The main problem I've run into is low pixel density making it impossible to do fine enough correction. For example, if you want to shift the red component of a given pixel horizontally by 0.3 pixels, there's a green component there (or blue), not a red one. The result is that the integer shifts possible with the usual subpixel arrangement of LCD panels end up not being small enough. The problem is similar with vertical offsets, but perhaps even worse, as subpixel elements are usually packed horizontally.
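One way around the integer-shift limit is to resample a channel at a fractional offset with linear interpolation rather than moving whole pixels. A minimal sketch (the 0.3-pixel figure echoes the example above; edge pixels wrap for brevity):

```python
import numpy as np

def shift_channel(channel, dx):
    """Shift a single colour channel horizontally by a fractional number
    of pixels using linear interpolation between neighbouring samples.
    channel: 2D float array; dx: shift in pixels (e.g. 0.3).
    Note: np.roll wraps at the edges, which a real warp would clamp."""
    i = int(np.floor(dx))
    f = dx - i
    a = np.roll(channel, i, axis=1)
    b = np.roll(channel, i + 1, axis=1)
    return (1 - f) * a + f * b

# A horizontal ramp shifted right by 0.3 pixels: each interior output
# value lands 30% of the way toward its left neighbour's value.
img = np.tile(np.arange(8.0), (4, 1))
shifted = shift_channel(img, 0.3)
```

This trades the integer-shift problem for a little blur from the interpolation, which is usually the better deal; GPU texture samplers do the same bilinear filtering for free when the pre-warp shader samples at fractional coordinates.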
I imagine it's still worth doing correction with the devkit, but we'll get much smoother (better) results with a higher effective pixel density panel (the lens plays a role in effective pixel density), as it allows for finer distortion correction.