Forum Discussion
guysherman
12 years ago · Honored Guest
Why must the warping for the optics be done in a pixel shader?
Ok, so please forgive the n00b question, I haven't had a chance to take a good look at what the pixel shader is doing, but I have seen the thread about raytracing, where the author has factored the warping into the rendering. Is it not possible to create a projection matrix which factors the warp into the rasterization process? You wouldn't need all the over-draw then.
5 Replies
- tlopes · Honored Guest
In order to widen the field of view of the device, the Rift's lenses are curved (specifically, with a curve that imparts a pincushion distortion to images displayed on the screen). A pincushion distortion makes a straight grid look sort of like this:
[image: grid with pincushion distortion]

If we were to display normal game images, they too would appear bent or curved wrongly. So in order to counteract the bend of the pincushion distortion, we need to apply a barrel warp distortion in the opposite direction. Barrel distortions look like this:
[image: grid with barrel distortion]
When you combine a barrel distortion and an equivalent pincushion distortion, they should "cancel out" and you should see a flat straight image.
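The cancellation can be sketched numerically. This is a minimal illustration, not the real warp: the coefficients `K1`/`K2` are made up (the actual SDK supplies measured per-lens values), and the real shader works on texture coordinates rather than abstract points.

```python
# Hypothetical distortion coefficients, for illustration only --
# the real SDK ships measured per-lens values.
K1, K2 = 0.22, 0.24

def barrel(x, y):
    """Barrel-warp a lens-centered point: pull it toward the center
    by a factor that grows with distance from the lens axis, as the
    post-process pixel shader does per pixel."""
    r2 = x * x + y * y
    scale = 1.0 + K1 * r2 + K2 * r2 * r2
    return x / scale, y / scale

def pincushion(x, y):
    """Approximate the lens's pincushion distortion -- the inverse
    of barrel() -- via a few fixed-point iterations."""
    gx, gy = x, y
    for _ in range(30):
        bx, by = barrel(gx, gy)
        gx, gy = gx + (x - bx), gy + (y - by)
    return gx, gy

# The lens's pincushion applied to the shader's barrel output
# returns (approximately) the original, undistorted point:
px, py = pincushion(*barrel(0.5, 0.3))
```

Running the two warps back to back recovers the original point to within numerical error, which is exactly the "cancel out" above.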
Now to get back to your original question:
Why do we need to apply the optics warp in the pixel shader?
Even if you apply a warp projection to all vertices to get them in the right place in a warped image (which is quite possible), you still have the problem that the triangle edges remain straight lines, which produces ultra-jagged edge artifacts when viewed through the Rift's optics. Essentially, the blocker for skipping the post-process optics warp is that modern video hardware cannot rasterize curved triangles - it can only rasterize straight-edged ones. There are alternative techniques, such as raytracing or high-degree polygon tessellation (possibly using tessellation shaders to get nearly pixel-sized triangles), that can work around this limitation.
- geekmaster · Protege
"tlopes" wrote:
... Even if you apply a warp projection to all vertices to get them in the right place in a warped image (which is quite possible), you still have the problem that the triangle edges remain straight lines, which produces ultra-jagged edge artifacts when viewed through the Rift's optics. Essentially, the blocker for skipping the post-process optics warp is that modern video hardware cannot rasterize curved triangles - it can only rasterize straight-edged ones. There are alternative techniques, such as raytracing or high-degree polygon tessellation (possibly using tessellation shaders to get nearly pixel-sized triangles), that can work around this limitation.
Thanks for this information. You said it in a clear and simple way that even I can understand! :D
- guysherman · Honored Guest
Thanks, that's a very clear explanation, and I now understand why ray-tracing works, but standard GPU rasterisation doesn't.
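The reason raytracing sidesteps the problem can be sketched in a few lines: a raytracer picks a direction per pixel, so the lens correction can be folded into ray generation instead of being a post-process image warp. Everything here is illustrative - `FOV_SCALE` and `K1` are made-up parameters, not real Rift values.

```python
import math

# Made-up camera/lens parameters, for illustration only.
FOV_SCALE = 1.0   # half-width of the image plane at z = 1
K1 = 0.22         # hypothetical barrel distortion coefficient

def ray_direction(u, v):
    """Primary-ray direction for normalized screen coords in [-1, 1].

    The barrel warp is applied to the sample position before building
    the ray, so the correction happens per pixel at ray-generation
    time. No straight triangle edges are ever rasterized, hence no
    curved-edge artifacts and no post-process warp pass.
    """
    r2 = u * u + v * v
    warp = 1.0 / (1.0 + K1 * r2)  # barrel: pull samples toward center
    x, y, z = u * warp * FOV_SCALE, v * warp * FOV_SCALE, 1.0
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)
```

The center ray is unchanged, while rays near the screen edge are bent inward by exactly the warp the pixel shader would otherwise apply to the finished image.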
- tlopes · Honored Guest
"guysherman" wrote:
Thanks, that's a very clear explanation, and I now understand why ray-tracing works, but standard GPU rasterisation doesn't.
It's a real shame that it doesn't, because modern hardware rasterization is extremely fast. The problem is that modern GPUs are built around so many internal optimizations that assume straight (not curved) triangles that even implementing GPU rasterization in a compute shader ends up very slow (this paper from NVidia in 2011 cites a 2x-8x slowdown from not using the hardware the way it was meant to be used). Graphics card vendors could change this in the future and add efficient curved hardware rasterization; the benefits would be higher framerates for Rift games, as well as antialiasing and rendering at the native framebuffer size. That seems far off right now, though - perhaps it'll happen as VR rises in popularity.
- jchernobieff · Honored Guest
I haven't tested tessellation on the Rift, but I have implemented algorithms with similar issues, such as a logarithmic z-buffer (to support huge draw distances with fine detail). If the edges are sufficiently small in screenspace, the error introduced by linear interpolation should be small enough to be insignificant. So using on-demand/dynamic tessellation, based on screenspace edge length, should allow for proper geometric distortion rather than screenspace distortion. I suspect triangles can be several pixels in size (my intuition says around 4-8 pixels, though testing is required, of course). Many objects in the scene would be fine as-is until viewed very closely (like character models), as would distant objects.
In addition, you can take advantage of the extra work by using the tessellation to support displacement maps and/or subdivision surfaces. There are several benefits to this approach:
* Post-process distortion is no longer needed; this should result in a clearer image and alleviate the need for supersampling. If supersampling is still used, the result should be higher-quality, more consistent AA. In other words, lower fill-rate demands while generating a higher-quality image (note this is offset by higher demands elsewhere; see below).
* This uses the standard rasterizer hardware and should, at least with the scenes I tend to work with, be much faster than using compute to rasterize.
* Proper Anisotropic filtering throughout the image (i.e. no distortion).
* Higher quality surface geometry via subdivision surfaces/displacement mapping.
Of course there are also disadvantages:
* Potentially higher system requirements (depending on the game/scenes and methods used)
* Higher geometry processing demands (whether this is a problem depends on the game/scenes)
* Higher demands related to triangle setup (rarely an issue but this depends on the game).
* More difficult to implement (could be MUCH more difficult if using an existing engine)
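The "tessellate until edges are a few pixels long" idea above can be sketched as follows. This mirrors what a hull/tessellation-control shader would compute per edge; the screen size, focal length, and the 6-pixel target are all made-up example values (the post above suggests somewhere in the 4-8 pixel range).

```python
import math

# Hypothetical parameters, for illustration only.
SCREEN_W, SCREEN_H = 1280, 800  # example per-eye framebuffer size
TARGET_EDGE_PX = 6.0            # aim for edges in the suggested 4-8 px range
MAX_TESS = 64                   # typical hardware tessellation limit

def project(p, focal=1.0):
    """Trivial perspective projection of a view-space point to pixels."""
    x, y, z = p
    sx = (x * focal / z * 0.5 + 0.5) * SCREEN_W
    sy = (y * focal / z * 0.5 + 0.5) * SCREEN_H
    return sx, sy

def tess_factor(p0, p1):
    """Tessellation level for one triangle edge, chosen so the
    subdivided sub-edges land near TARGET_EDGE_PX on screen."""
    (x0, y0), (x1, y1) = project(p0), project(p1)
    edge_px = math.hypot(x1 - x0, y1 - y0)
    return max(1.0, min(MAX_TESS, edge_px / TARGET_EDGE_PX))
```

A nearby edge that spans hundreds of pixels gets heavily subdivided (clamped at the hardware limit), while a distant edge that already projects to a few pixels gets a factor near 1 - which is why distant objects and rarely-approached geometry are "fine as-is".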