Forum Discussion
deftware
8 years ago · Expert Protege
ovrLayerFlag_TextureOriginAtBottomLeft causing funky distortion of eye textures?
Working with the PC SDK, I managed to get everything looking fine in my API test/practice app without the ovrLayerFlag_TextureOriginAtBottomLeft flag on the ovrLayerEyeFov layer, after inverting everything on the Y axis; that worked perfectly. But now I'm trying to put VR rendering into my existing engine, and I can't (easily) go in and flip the Y axis on everything and in all the shaders, so I would like to use this flag instead. The problem is that it causes really funky distortion of the eye textures when looking around. The OculusRoomTiny(GL) sample doesn't seem to be doing anything in particular to prevent this strange warping.
It has something to do with how the compositor draws the eye textures according to the RenderPose I set on the EyeFov layer. The pose works fine when I don't use the bottom-left texture origin flag, as long as I also flip everything about the rendering upside down; then there's absolutely no distortion or warping of the perspective.
I can't find anything anywhere saying the flag requires a differently situated pose to be passed into the EyeFov layer, but somehow it seems to be a big deal. Everything else lines up fine: wherever I look, the compositor draws the eye textures right where they should be in my view based on the pose I pass in, but they're distorted wrong for the perspective, which isn't the case when I don't use the origin-at-bottom-left flag.
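For context, the flag in question is just a bit set on the layer header before the layer is submitted to the compositor. A minimal sketch of that setup, using stand-in types and placeholder numeric values rather than the real definitions from OVR_CAPI.h:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Minimal stand-ins for the LibOVR layer types. The real definitions
 * (ovrLayerEyeFov, ovrLayerType_EyeFov, ovrLayerFlag_TextureOriginAtBottomLeft)
 * live in OVR_CAPI.h; the numeric values below are placeholders. */
enum { LayerType_EyeFov = 1 };                       /* placeholder value */
enum { LayerFlag_TextureOriginAtBottomLeft = 0x02 }; /* placeholder value */

typedef struct { int Type; uint32_t Flags; } LayerHeader;
typedef struct {
    LayerHeader Header;
    /* ColorTexture, Fov, Viewport, RenderPose, etc. omitted */
} LayerEyeFov;

/* Fill out an EyeFov layer header so the compositor treats the submitted
 * swapchain textures as having their origin at the bottom-left, which is
 * the natural orientation for textures rendered with OpenGL. */
static LayerEyeFov make_gl_layer(void)
{
    LayerEyeFov layer;
    memset(&layer, 0, sizeof layer);
    layer.Header.Type  = LayerType_EyeFov;
    layer.Header.Flags = LayerFlag_TextureOriginAtBottomLeft;
    return layer;
}
```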
Help!
UPDATE: As an experiment I removed the flag from the layer and instead blitted the rendered scene upside-down into the swapchain textures, and the problem persists. It looks like the root cause is simply that the rendered perspective view is flipped upside-down in the first place, which would make the flag offered for OpenGL useless. Is the only recourse to literally flip everything upside down in my engine so that the rendered perspective actually matches? Am I missing something about flipping my projection upside down?
UPDATE 2: AHA! It was the code I ripped from the SDK's ovrMatrix4f projection helper, which builds a projection matrix from the ovrFovPort's tan()-based definition. I just had to swap the top/bottom values to generate an inverted Y offset, so the projection being used to render the world is what was actually incorrect. That was a 3-day adventure to figure out. Hopefully this will help someone in the future trying to build their own LibOVR-based VR renderer.
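To make the fix concrete, here is a sketch of a tangent-based asymmetric projection modeled on the SDK's construction (this is not the SDK's actual code, and the matrix layout is an assumption). Swapping the up/down tangents leaves the Y scale unchanged, since their sum is the same, but negates the Y offset, which is exactly the "inverted Y offset" described above:

```c
#include <assert.h>
#include <math.h>

/* Hypothetical stand-in for ovrFovPort: half-FOV edge tangents. */
typedef struct { float upTan, downTan, leftTan, rightTan; } FovPort;

typedef struct { float m[4][4]; } Mat4;

/* Build a right-handed asymmetric-FOV projection from edge tangents,
 * in the style of the SDK's ovrMatrix4f projection helper (a sketch,
 * not the SDK's exact code). When flip_y is set, the top and bottom
 * tangents are swapped: the Y scale is unchanged but the Y offset is
 * inverted, producing a vertically flipped perspective that matches a
 * bottom-left texture origin. */
Mat4 projection_from_fov(FovPort fov, float znear, float zfar, int flip_y)
{
    float up   = flip_y ? fov.downTan : fov.upTan;
    float down = flip_y ? fov.upTan   : fov.downTan;

    float xScale  = 2.0f / (fov.leftTan + fov.rightTan);
    float xOffset = (fov.leftTan - fov.rightTan) * xScale * 0.5f;
    float yScale  = 2.0f / (up + down);
    float yOffset = (up - down) * yScale * 0.5f;

    Mat4 p = {{{0.0f}}};
    p.m[0][0] = xScale;
    p.m[0][2] = -xOffset;
    p.m[1][1] = yScale;
    p.m[1][2] = yOffset;
    p.m[2][2] = -(zfar + znear) / (zfar - znear);
    p.m[2][3] = -2.0f * zfar * znear / (zfar - znear);
    p.m[3][2] = -1.0f;
    return p;
}
```

Note that this differs from simply flipping the finished image: a post-render flip worked out to the wrong perspective in the experiment above, whereas swapping the tangents only inverts the asymmetric offset inside the projection itself.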
No replies yet.