Forum Discussion
firagabird
8 years ago · Adventurer
How does the Gear VR runtime handle lens distortion & chromatic aberration?
From a high level perspective, I know that scene geometry is first rendered to an eye buffer, which a VR runtime then applies lens distortion & chromatic aberration correction. How does Gear VR specifically apply this process, though? Also, what optimizations are in use (or planned) to take advantage of the tiled nature of mobile GPUs?
4 Replies
- firagabird (Adventurer): Brian Kehrer at Google discussed how Cardboard (and, I presume, Daydream) uses a technique called "vertex displacement" to handle distortion correction (I'm not sure about chromatic aberration, though): https://www.gamasutra.com/blogs/BrianKehrer/20160125/264161/VR_Distortion_Correction_using_Vertex_Displacement.php
  It's frankly a brilliant innovation: incredibly low overhead, and perfectly suited to mobile GPUs. Since everything is done in the shader, the eye buffers wouldn't need to be rendered to a texture, saving a very costly resolve from GMEM out to system memory.
- deftware (Expert Protege): I assumed the distortion correction was a simple function that calculates where to sample the eye textures in order to achieve the correction.
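Whether applied per fragment (warping the sample location) or per vertex (displacing a warp mesh), both approaches boil down to the same radial warp. A minimal sketch in Python, with made-up polynomial coefficients; real lens profiles ship with the headset's runtime, so `K1`/`K2` here are purely illustrative:

```python
# Sketch of a radial (barrel) distortion warp. K1/K2 are hypothetical
# coefficients chosen for illustration, not real Gear VR lens values.
K1, K2 = 0.22, 0.24

def distort_uv(u, v):
    """Warp lens-centered coordinates ((0,0) = lens center) radially outward."""
    r2 = u * u + v * v                    # squared distance from lens center
    scale = 1.0 + K1 * r2 + K2 * r2 * r2  # polynomial radial scale factor
    return u * scale, v * scale

# The same function serves either path: evaluate it per fragment to pick the
# eye-texture sample location, or per vertex to displace the warp mesh.
```

Points near the lens center are barely moved, while points near the edge are pushed progressively further outward, which is what cancels the lenses' pincushion distortion.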
Chromatic aberration correction can be done one of two ways: low-quality/cheap or high-quality/expensive.
The cheap correction involves offsetting where the red and blue channels are sampled from in the eye textures. The result is that the 'corrected' image shows distinct separation of the red/green/blue channels rather than a smooth spectrum of colors; real chromatic aberration spreads light into a smooth ROYGBIV spectrum, not just three discrete colors. This is also exactly how games and demoscene productions achieve a fast, cheap chromatic aberration effect in post-processing, but it's simply not accurate. With the cheap correction, you will still see some separation of the colors through the lenses no matter how much the 'correction' is tweaked.
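The cheap three-channel offset can be sketched as computing one sample coordinate per channel; the per-channel scale factors below are assumptions for illustration, not values from any real runtime:

```python
# Cheap chromatic-aberration correction: sample red and blue at slightly
# different radial scales than green. Scale factors are hypothetical.
CA_RED, CA_BLUE = 0.994, 1.006

def channel_uvs(u, v):
    """Return ((ur,vr), (ug,vg), (ub,vb)) lens-centered sample coordinates."""
    return ((u * CA_RED, v * CA_RED),    # red pulled slightly toward center
            (u, v),                      # green left in place
            (u * CA_BLUE, v * CA_BLUE))  # blue pushed slightly outward

# A shader would fetch the eye texture three times, once per channel,
# which is exactly why only three discrete color fringes survive.
```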
The correct way to generate, or correct for, chromatic aberration is to take many samples around each pixel along the vector the colors spread (i.e., toward/away from the center of the image/lens). This more closely models the spectral continuum of hue that refraction imparts. Also, hello from reddit :)
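A rough sketch of that multi-sample idea, assuming a hypothetical `fetch(u, v)` texture-lookup callback and made-up tap counts, spread, and channel weights:

```python
def spectral_ca(fetch, u, v, n=8, spread=0.01):
    """Average n taps along the radial direction, weighting red toward the
    inner taps, blue toward the outer taps, and green toward the middle,
    to approximate a smooth spectral spread. fetch(u, v) -> (r, g, b)."""
    r = g = b = 0.0
    wr = wg = wb = 0.0
    for i in range(n):
        t = i / (n - 1)                     # position 0..1 along the spread
        s = 1.0 + spread * (2.0 * t - 1.0)  # radial scale for this tap
        sr, sg, sb = fetch(u * s, v * s)
        # Triangle-ish weights per channel across the spread
        w_r, w_g, w_b = 1.0 - t, 1.0 - abs(2.0 * t - 1.0), t
        r += sr * w_r; g += sg * w_g; b += sb * w_b
        wr += w_r; wg += w_g; wb += w_b
    return r / wr, g / wg, b / wb
```

Note the cost: n texture fetches per pixel instead of three, which is why this is the high-quality/expensive option.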
https://reshade.me/forum/shader-presentation/1133-yaca-yet-another-chromatic-aberration
- deftware (Expert Protege): Also, I'm surprised if they're drawing a mesh of triangles and displacing vertices to correct for the lenses' pincushion distortion. I assumed they were just warping the sample location in the fragment shader that renders a fullscreen quad to the actual display framebuffer. I can't imagine it's any slower than rendering a bunch of triangles. Then again, letting the texturing hardware map the image across the triangles could be faster than loading a sample location from another texture or computing it per fragment, since the warp is implicit in the vertex positions. It reminds me of the old Winamp visualization plugins from the late '90s, which drew a fullscreen mesh of the previous frame with vertices displaced according to a Fourier-transform spectral analysis of the audio.
  EDIT: Ah, I see, they're distorting the actual vertices in world space, not screen space. That's clever!
- deftware (Expert Protege): I'm curious when they'll simplify the lenses and use concave displays instead. We've been seeing flexible OLEDs at trade shows for a decade now; what gives?