Forum Discussion
Anonymous
9 years ago
Non-symmetric frustum and custom FOV - how to encode this properly?
I want to implement a custom FOV setting in our code. The question is whether this is possible. From the camera setup parameters reported by the drivers (sum of the min/max arctangents), it is clear the CV1 requires approx. 79.5° FOV, the DK2 95° FOV. The SDK documentation shows how to visualize a lower FOV by using only a part of the HMD FOV (see the Oculus Home example for a demo), i.e. showing e.g. a 50° FOV tunnel inside the 79.5° FOV. In addition, the CV1 requires a strongly asymmetric camera placement (the lenses are not in the displays' centres).
The question is: can I implement a bigger FOV scene view than the one the drivers offer? It worked OK with the DK2. If I now modify the asymmetric frustum matrix to both keep infinity where it should be and give a bigger FOV, the scene after the modification looks warped and strange during horizontal head movement. I am not sure whether I am doing something wrong (I multiply the directional tangents by a constant to get a bigger FOV) or whether what I want is impossible due to the HW design. Any ideas?
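For intuition on the tangent scaling described above: the drivers report the FOV as tangents of half-angles, so multiplying the tangents by a constant does widen the frustum, but the angular FOV grows sub-linearly, while the compositor still distorts assuming the reported FOV. A minimal sketch (illustrative numbers, not the SDK's values):

```cpp
#include <cassert>
#include <cmath>

const double PI = 3.14159265358979323846;

// Convert a half-angle tangent (the form the drivers report) back to a
// half FOV in degrees. Scaling the tangent by k does NOT scale the angle by k.
double halfFovDegrees(double tanHalfAngle) {
    return std::atan(tanHalfAngle) * 180.0 / PI;
}
```

For example, a half-angle tangent of 1.0 corresponds to a 90° total FOV, but scaling it by 1.5 gives roughly 112.6°, not 135° - one reason a naive constant multiply can look "warped" once the compositor's distortion no longer matches the rendered projection.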
10 Replies
- galopin (Heroic Explorer): I did a test by increasing the FOV values for the eyes, doubling the angles. Even if my render follows, the distortion in the compositor obviously still follows the "real FOV" setting and just crops everything outside the reported FOV…
- joan (Protege): Trick the compositor by passing the unmodified FOV during submit? I would still expect timewarp to make a mess of it…
- cybereality (Grand Champion): What exactly are you trying to accomplish? This really just sounds like a bad idea to me.
- galopin (Heroic Explorer): I can imagine valid "mess with FOV" situations, for games and applications with binoculars.
Playing with FOV is also a way to give speed feedback in racing games.
Right now, like everything else, Oculus built a black box that works for 90% of cases. But they need to open it up: first, because they don't have 100% knowledge of what VR can achieve, and second, because they limit innovation by keeping it closed…
- Anonymous: What we want to accomplish: we develop an application for visualizing cube-mapped stereoscopic 360 video. With the DK2 it was possible to change the FOV and modify what part of the video the viewer sees. If I apply the CV1 viewing frustum as defined by the drivers, all works OK.
The following code works:
void OculusVR::SetProjection(int eyeIndex)
{
    glMatrixMode(GL_PROJECTION);
    ovrMatrix4f projection = ovrMatrix4f_Projection(m_hmdDesc.DefaultEyeFov[eyeIndex], 0.1f, 100.0f, ovrProjection_ClipRangeOpenGL);
    glLoadTransposeMatrixf(projection.M[0]);
}
I now want a wider FOV:
left = -m_hmdDesc.DefaultEyeFov[eyeIndex].LeftTan * clip_near; // etc. for right, bottom, top
scale > 1 ....
glFrustum(left*scale, right*scale, bottom*scale, top*scale, clip_near, clip_far);
The result is a deformed view ...
- galopin (Heroic Explorer): oh, a zoom :p i hate to be right :p
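For reference, here is what the glFrustum arithmetic from the snippet above reduces to when the plane distances are derived from the tangents (FovTan is a hypothetical stand-in for ovrFovPort; the entries are the x/y terms of the standard glFrustum matrix). The near distance cancels out, and uniformly scaling all four tangents changes the zoom term but leaves the off-centre (lens asymmetry) term untouched:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical stand-in for ovrFovPort: tangents of the four half-angles.
struct FovTan { double up, down, left, right; };

// x/y terms of the glFrustum matrix with l = -left*n, r = right*n, etc.
// The near distance n cancels, leaving pure tangent expressions.
struct ProjTerms { double m00, m02, m11, m12; };

ProjTerms projTermsFromTangents(const FovTan& f) {
    ProjTerms p;
    p.m00 = 2.0 / (f.left + f.right);                 // 2n/(r-l): horizontal zoom
    p.m02 = (f.right - f.left) / (f.left + f.right);  // (r+l)/(r-l): off-centre shift
    p.m11 = 2.0 / (f.up + f.down);                    // 2n/(t-b): vertical zoom
    p.m12 = (f.up - f.down) / (f.up + f.down);        // (t+b)/(t-b): vertical shift
    return p;
}
```

A symmetric port gives m02 = 0; an asymmetric one like the CV1's gives a nonzero shift, and scaling every tangent by the same constant shrinks m00 (zooms out) while preserving that shift.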
- Anonymous:
galopin: yes, a zoom. So you think this cannot be done due to:
1) improper processing by the "black box" part of the Rift code
2) the Rift's off-centre lenses and the related scene deformation
3) me doing something wrong?
- thewhiteambit (Adventurer): You simply don't - because of a badly designed black-box API. Of course you can render at any FOV, and you can tell the OVR API to assume a wider or narrower FOV, but the black box will mess with the distortion. The best guess would be to give the OVR API a wider FOV and much-too-high-resolution textures, and render the zoomed picture with the narrow FOV just into the centre of that huge texture. But this is of course a mess to handle, for a bad API...
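That workaround could be sketched roughly like this (all names hypothetical; the idea is just that the narrow-FOV zoom image occupies a centred sub-rectangle of the oversized texture, sized by the ratio of the half-angle tangents):

```cpp
#include <cassert>

struct Viewport { int x, y, w, h; };

// Place the narrow-FOV (zoomed) render in the centre of a texture that was
// allocated for the wider FOV reported to the API. The sub-rectangle's size
// scales with the ratio of the half-angle tangents.
Viewport centredViewport(int texW, int texH, double narrowTan, double wideTan) {
    double ratio = narrowTan / wideTan;
    int w = static_cast<int>(texW * ratio);
    int h = static_cast<int>(texH * ratio);
    return Viewport{ (texW - w) / 2, (texH - h) / 2, w, h };
}
```

The compositor would then distort the full texture as if it covered the wide FOV, so the centred narrow-FOV image ends up magnified - which is exactly the zoom effect, at the cost of wasted texture memory around the edges.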
- jherico (Adventurer):
> left = m_hmdDesc.DefaultEyeFov[eyeIndex].LeftTan * clip_near (-1); etc. for right, bottom, top
Why are you multiplying LeftTan by clip_near? If you want to increase the field of view, just construct an ovrFovPort of the FOV you want...
float up = PI / 3;
float down = PI / 3;
float left = PI / 3;
float right = PI / 3;
ovrFovPort myPort { tan(up), tan(down), tan(left), tan(right) };
> glFrustum(left*scale, right*scale, bottom*scale, top*scale, clip_near, clip_far);
Ugh... why are you using glFrustum?
ovrMatrix4f projection = ovrMatrix4f_Projection(myPort, clip_near, clip_far, ovrProjection_ClipRangeOpenGL);
Bam... a 4x4 matrix you can just upload to GL (though I think you need to transpose it first). Now, you can get the appropriate texture size with ovr_GetFovTextureSize. However, if you want to display it on the actual HMD, you're going to need to figure out what the viewports into that larger-than-necessary texture should be, by calling ovr_GetFovTextureSize for the actual correct FOV and then comparing the two sizes. That's left as an exercise for the reader.
- Anonymous:
jherico: You can check as an exercise that your approach and my approach give an identical projection matrix (by reading the matrix back). Setting glFrustum to the distances calculated from clip_near times the directional tangents generates the same matrix as ovrMatrix4f_Projection (it is only a bit more intuitive and easier to understand).
The issue is: if you do not use the SDK FOV, the scene on the CV1 does not look correct. Everything works on the DK2 only. Did you try to set your custom FOV with a CV1?
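The equivalence claimed here (glFrustum with plane distances of clip_near times the tangents, versus a tangent-based matrix like the one ovrMatrix4f_Projection builds) can be sanity-checked numerically: the near distance cancels out of the x/y terms, so they depend only on the tangents. A small sketch:

```cpp
#include <cassert>
#include <cmath>

// x-row terms of the glFrustum matrix, with l = -leftTan*n and r = rightTan*n.
// If the equivalence holds, these must come out independent of n.
void frustumXTerms(double leftTan, double rightTan, double n,
                   double& m00, double& m02) {
    double l = -leftTan * n, r = rightTan * n;
    m00 = 2.0 * n / (r - l);
    m02 = (r + l) / (r - l);
}
```

Evaluating these with two very different near distances and the same tangents yields identical terms, which supports reading back the same matrix from either construction.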