Forum Discussion
skynet
11 years ago · Honored Guest
0.4.3 ovrHmd_GetEyePoses usage
In SDK 0.4.3 ovrHmd_GetEyePose() got replaced by ovrHmd_GetEyePoses().
I like the way it now includes the necessary IPD translation. This was not very obvious in the previous version and is now handled automatically.
What I don't understand, though, is why the 0.4.3 API now suggests retrieving both eye poses at the same time.
0.4.2 encouraged you to call ovrHmd_GetEyePose() separately for each eye, right before culling/drawing that eye, so that each eye pose was sampled as close as possible to the time of per-eye rendering.
ovrHmd_GetEyePoses() suggests retrieving both poses at the same time, probably at the start of each frame. This moves the second eye pose 'further away' from the optimal pose you could use.
Is this intended? What are the recommendations for when to call ovrHmd_GetEyePoses() and use its results?
13 Replies
- rjoyce (Honored Guest): ovrHmd_GetEyePose() was actually replaced with ovrHmd_GetHmdPosePerEye().
Where do they suggest retrieving both eyes at the same time?
When they originally made the change to get poses per eye, I didn't understand why you would use different head poses for each eye, as it seems to me that could cause discomfort, but I trust Oculus looked into this.
- skynet (Honored Guest):
OVR_EXPORT void ovrHmd_GetEyePoses(ovrHmd hmd, unsigned int frameIndex, ovrVector3f hmdToEyeViewOffset[2], ovrPosef outEyePoses[2], ovrTrackingState* outHmdTrackingState);
outEyePoses[] returns both eye poses at once.
The comment above ovrHmd_GetHmdPosePerEye() clearly states:
/// DEPRECATED: Prefer using ovrHmd_GetEyePoses instead
/// Function was previously called ovrHmd_GetEyePose
which suggests that ovrHmd_GetEyePoses() is the way to go.
"rjoyce" wrote:
I didn't understand why you would use different head poses for each eye
The closer the sensor data is in time to the moment the present happens, the less correction timewarp needs to do (i.e. it should be more accurate).
- rjoyce (Honored Guest):
"skynet" wrote:
The comment above ovrHmd_GetHmdPosePerEye() clearly states:
/// DEPRECATED: Prefer using ovrHmd_GetEyePoses instead
/// Function was previously called ovrHmd_GetEyePose
which suggests that ovrHmd_GetEyePoses() is the way to go.
Wow, they deprecated and renamed it at the same time. Marvelous. (Oh, and no mention in the docs; the 0.4.3 dev guide still lists GetEyePose in the sample code.)
"skynet" wrote:
The closer the sensor data is in time to the moment the present happens, the less correction timewarp needs to do (i.e. it should be more accurate).
Oh that makes sense -- they don't actually render different poses in each eye, but rather the 2nd eye rendered has less shader/timewarp distortion due to being closer to the present.
I'm guessing internal testing indicated that it was added complexity to the API for almost zero gain, but with nothing more than this comment to go off of, who knows.
But on that note: why bother with outEyePoses being an array of two? If we poll at one point in the frame, there is only one best guess for where the head will be at render time.
- Fredz (Explorer): This new call was discussed in this Oculus Connect talk: http://static.oculus.com/connect/slides/OculusConnect_Mastering_the_SDK.pdf
Video: https://www.youtube.com/watch?v=PoqV112Pwrs#t=2518
- skynet (Honored Guest): Aha!
Important: Time warp doesn’t care about when and via what SDK call renderPose[2] was generated
That's exactly the missing piece of information.
- rjoyce (Honored Guest): Oh, makes sense. Timewarp doesn't need to know when you polled for the pose, because it will just say "ok, you were here when you rendered, but now I just polled a better new pose that says we should be here, so I'll just work my magic to distort from there to here".
Time management is only for dynamic prediction, i.e. how far ahead a call to ovrHmd_GetEyePoses() will predict.
I think the best approach would be to:
1. ovrHmd_BeginFrame()
2. poll for head position, use unionized camera for proper culling (see the video/slides for what I mean by that)
3. poll again, draw scene from each eye
4. ovrHmd_EndFrame()
But would there be much difference if 3 was expanded to:
3a. poll for left eye, draw left eye
3b. poll for right eye, draw right eye
That is something I didn't see touched on in the talk or in any of their docs.
- jherico (Adventurer):
"rjoyce" wrote:
I think the best approach would be to:
1. ovrHmd_BeginFrame()
2. poll for head position, use unionized camera for proper culling (see the video/slides for what I mean by that)
3. poll again, draw scene from each eye
4. ovrHmd_EndFrame()
Actually, the docs for GetEyePoses() explicitly call out that it is thread safe, which means it doesn't have the same requirement as GetEyePose() of being called between BeginFrame() and EndFrame(). This means that GetEyePoses() is much better suited to asynchronous rendering, where you put your frame generation on one thread and distortion and display to the rift on another thread.
Either they're gearing up to make SDK built-in async timewarp a reality, or they're trying to make it easier for client applications to use it.
- rjoyce (Honored Guest):
"jherico" wrote:
... where you put your frame generation on one thread and distortion and display to the rift on another thread.
Is this something that is possible/advantageous in OpenGL? I'm not very experienced in OpenGL, but I always thought only one thread can talk to the GPU, so they would have to wait for each other anyway.
I thought the reason they made it thread safe was so you could update your logic and/or cull on one thread (which a lot of engines do) and then frame generate on your render thread.
- jherico (Adventurer):
"rjoyce" wrote:
I thought the reason they made it thread safe was so you could update your logic and/or cull on one thread (which a lot of engines do) and then frame generate on your render thread.
No, if you're just interested in the head pose for that kind of thing, you've always been able to call GetTrackingState() on any thread at any time. It only returns a single pose and doesn't deal with things like per-eye offsets. GetEyePoses() is specifically for rendering.
"rjoyce" wrote:
Is this something that is possible/advantageous in OpenGL? I'm not very experienced in OpenGL, but I always thought only one thread can talk to the GPU, so they would have to wait for each other anyway.
Using OpenGL in a multithreaded fashion has some limitations, but it's certainly possible and advantageous in this circumstance. Basically an OpenGL context can only be used on one thread at a time, but you can have multiple OpenGL contexts (after all the driver might have multiple OpenGL programs running against it) and more importantly, you can have different contexts that share information.
If you think about it, the rendering thread only has one job, to render the scenes to offscreen textures, for which it can use framebuffers. Likewise, the distortion has only one job, to take textures containing the rendered scenes and distort them before putting on the physical display. The only point of contact between the two threads is the texture ID you rendered to and the eye pose you used to render it, and OpenGL only cares about the first part.
I have a whole video on it:
- rjoyce (Honored Guest): Thanks jherico. Awesome explanation as always. Great video too.