Antialiasing/Scene Rendering Question

Halopend
Explorer
Ok, so first of all, let me say I'm not a developer, so if this is obvious or really misguided (aka WRONG), don't bite my head off. I feel like I'm about to sound like one of those crazy people I see on the forum all the time who have no idea what they're talking about (probably because I have no idea what I'm about to talk about).

With the distortion filter the Rift uses, is antialiasing applied before or after the distortion, and could the techniques used be adjusted to get a crisper picture or at least better performance?

Actually, could a scene be rendered directly through the distortion filter? The current (and much easier) way to render a scene is to render at a higher resolution than the Rift's actual screen and then apply the distortion filter to that, so the "stretched" portions of the image stay crisp. The problem with this is that a ton of rendering is wasted on the periphery, as the distortion filter "throws out" a good portion of what's rendered there.
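To put a rough number on that waste, here's a small sketch. It uses a toy barrel-distortion profile in the spirit of the Rift's warp shader (a screen pixel at radius r samples the eye buffer at radius r' = r(1 + k1·r² + k2·r⁴)); the k coefficients are invented for illustration, not taken from any SDK. The local area magnification of that sampling map tells you how many rendered texels collapse into one displayed pixel:

```python
import math

# Hypothetical distortion coefficients (made up for illustration).
K1, K2 = 0.22, 0.24

def src_radius(r):
    # Radius in the eye buffer sampled by a screen pixel at radius r.
    return r * (1.0 + K1 * r * r + K2 * r ** 4)

def texels_per_pixel(r, eps=1e-4):
    # Local area magnification of the sampling map: radial stretch times
    # tangential stretch. Values > 1 mean the eye buffer is minified there,
    # i.e. several rendered texels get averaged into one displayed pixel.
    radial = (src_radius(r + eps) - src_radius(r - eps)) / (2 * eps)
    tangential = src_radius(r) / r
    return radial * tangential

print(f"texels per pixel at center: {texels_per_pixel(0.01):.2f}")
print(f"texels per pixel at edge:   {texels_per_pixel(1.0):.2f}")
```

With these made-up coefficients the center samples the eye buffer roughly 1:1, while at the edge of the view about four rendered texels feed each displayed pixel, which is exactly the "rendered then thrown out" periphery cost described above.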

Would it be possible to tweak the rendering engine to just render the scene directly through the distortion, so you don't waste processing time rendering details that just get thrown out, or is that too fundamental a shift?

As a side note, does the subpixel arrangement of the rifts screen have an effect on antialiasing?

volgaksoy
Expert Protege
Rendering the scene directly into a distorted (or fisheye-lens-style) frame is not possible without exposing other issues, due to how real-time rendering fundamentally works on GPUs. If you have a projection matrix that applies what is called a curvilinear projection, you'll run into edge cracks due to mismatches between low-res and high-res models. Think of a simple case where a straight edge becomes curved at each vertex. If two separate models share the same edge, and that edge has more vertices on one model than on the other, then once each vertex is curved, the curved edge will no longer overlap between the two models. Similarly, because you are distorting the model at each vertex, the triangle resolution of the model determines how smooth the curved edges will look. A gigantic quad rendered with only two triangles, for example, won't be able to curve at all.
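The crack can be shown with a tiny numeric sketch. It uses a made-up curvilinear warp (each vertex is pulled toward the origin by an amount that grows with distance; the 0.2 coefficient is arbitrary) and compares two models that share one edge but tessellate it differently, mimicking how a GPU distorts only vertices and rasterizes straight lines between them:

```python
# Toy curvilinear "projection" standing in for a fisheye-style warp.
# The 0.2 coefficient is arbitrary, chosen just to make the gap visible.
def distort(p):
    x, y = p
    s = 1.0 - 0.2 * (x * x + y * y)
    return (x * s, y * s)

def lerp(a, b, t):
    return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

# Two models share the edge (-1,1)-(1,1). The coarse model has no vertex
# in the middle; the fine model has an extra vertex at (0,1).
a, b, mid_fine = (-1.0, 1.0), (1.0, 1.0), (0.0, 1.0)

# The coarse edge's midpoint is the straight-line interpolation of its
# distorted endpoints; the fine model distorts its midpoint vertex directly.
coarse_mid = lerp(distort(a), distort(b), 0.5)
fine_mid = distort(mid_fine)

gap = abs(fine_mid[1] - coarse_mid[1])
print(f"crack width: {gap:.2f}")  # prints "crack width: 0.20"
```

The two models agree on the edge before the warp, but after per-vertex distortion the coarse edge sags below the fine one, leaving a visible gap — exactly the edge crack described above.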

As for better scene antialiasing, we will introduce a high-quality distortion option into our OWD sample. This does not address the wasted periphery issue you mentioned, but it will make the distortion results look even better when combined with regular 4xMSAA.

As for avoiding wasted periphery rendering, you can actually use some advanced rendering tricks, like a geometry shader that renders the geo to 5 separate planes (flat + 4 slanted up, down, left, and right; or just 3, where you render into the "corner of a box"). As far as wasted frame buffer space goes, this would let you render FOVs even higher than 160 degrees fairly efficiently. That said, geometry shaders are not cheap, so there's a good chance that using this on all geo rendered in a VR scene could waste a lot of cycles. So all in all, it'd be a balancing act between wasted periphery rendering cost and vertex processing cost in the geometry shader.
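The frame-buffer savings from splitting the view across planes can be sketched in one dimension. A single planar projection covering a horizontal FOV θ needs a width proportional to tan(θ/2), which explodes as θ approaches 180 degrees; splitting the same FOV across planes that each face the middle of their own angular slice keeps every tan() small. The 90 + 35 + 35 degree split below is an arbitrary choice for illustration, not a scheme from any SDK:

```python
import math

def plane_width(fov_deg):
    # Width of a planar projection covering fov_deg, at unit focal length.
    return 2.0 * math.tan(math.radians(fov_deg) / 2.0)

# One plane covering 160 degrees vs. a center plane covering 90 degrees
# plus two slanted side planes covering 35 degrees each.
single = plane_width(160.0)
split = plane_width(90.0) + 2 * plane_width(35.0)

print(f"single 160-degree plane width: {single:.2f}")
print(f"90 + 2x35 degree split width:  {split:.2f}")
```

With these numbers the single plane needs roughly 3.5x the frame-buffer width of the split (about 11.3 vs. 3.3 at unit focal length), and the single plane's extra area is concentrated in exactly the periphery that the distortion later throws away; the trade-off is the per-triangle geometry-shader cost mentioned above.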

Hope that helps answer some of your questions.

Halopend
Explorer
It definitely does answer them (though I'll admit I only understand it at a basic level).

If you REALLY wanted to push the boundaries of efficiency, you could probably get away with only antialiasing the inner 2/3rds of the image and just consider the periphery as already having "SSAA". I need my FPS, dammit!!