Forum Discussion

Teddy0k
Explorer
13 years ago

Idea - How to get fast head tracking with any frame rate

Here's an idea of how to get low latency head tracking with almost any variable frame rate.

Render the scene from the player's point of view with no distortion, but to a larger render target with a wider FOV. The increased FOV should be large enough to account for how far a user might turn their head in one frame (I figure ~8 degrees?).
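The FOV margin above follows from a simple rate-times-time estimate. A minimal sketch, where the peak head rotation speed is an assumption (the 240 deg/s figure below is chosen only to show how an ~8 degree margin could arise at 30 fps):

```python
# Sketch: extra FOV margin needed so the oversized render target still
# covers the view after a fast head turn within one frame.
# peak_head_speed_deg_s is an assumed worst-case rotation speed.
def fov_margin_deg(peak_head_speed_deg_s, frame_rate_hz):
    """Worst-case rotation (degrees) between two full scene renders."""
    return peak_head_speed_deg_s / frame_rate_hz

# A 240 deg/s turn at 30 fps covers 8 degrees in one frame,
# matching the ~8 degree margin suggested above.
print(fov_margin_deg(240.0, 30.0))  # → 8.0
```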



Then, at a faster update rate (60 or 120 times a second?), sample how much the HMD orientation has changed since the last full render and calculate how far you'd need to pan and rotate the image.
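Converting the orientation delta into a pan could look something like this. A sketch assuming a simple pinhole projection; real lens geometry would need the actual projection, and all names here (`focal_px`, `pan_offset_px`) are hypothetical:

```python
import math

# Sketch: turn the change in HMD yaw/pitch since the last full render
# into a pixel pan within the oversized render target.
def focal_px(target_width_px, fov_deg):
    """Focal length in pixels for a pinhole projection of the given FOV."""
    return (target_width_px / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

def pan_offset_px(delta_yaw_deg, delta_pitch_deg, target_width_px, fov_deg):
    """Pixel offset to pan the image by for a given orientation delta."""
    f = focal_px(target_width_px, fov_deg)
    dx = f * math.tan(math.radians(delta_yaw_deg))
    dy = f * math.tan(math.radians(delta_pitch_deg))
    return dx, dy
```

For small deltas (a few degrees, per the FOV margin above) the tangent is nearly linear, so a plain translation of the image is a reasonable approximation; roll would additionally need an in-plane rotation.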



You can then sample from the render within the highlighted region, run the distortion effect on the result, and send that to the Rift's screen. This would eliminate any variability in the head-tracking update rate for the user, and games would not need to run at a stable 60 fps to feel smooth. Note: the player's movement and the simulation of the world would still update at the normal frame rate.
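The two decoupled loops described above can be checked with a toy timeline simulation. This is a sketch under assumed numbers (30 fps renders, a 120 Hz warp loop, a constant 90 deg/s head turn), not an implementation; it just shows that panning the most recent full render by the accumulated yaw keeps the presented view tracking the head exactly, even though renders lag:

```python
# Sketch: simulate a slow render loop re-presented by a fast warp loop.
def simulate(render_hz=30, warp_hz=120, head_speed_deg_s=90.0, duration_s=0.5):
    """Return (max pan needed in degrees, max presented yaw error)."""
    ticks_per_render = warp_hz // render_hz  # assumes warp_hz is a multiple
    max_pan = 0.0
    max_err = 0.0
    for i in range(int(duration_s * warp_hz)):
        t_warp = i / warp_hz
        # timestamp of the most recent completed full scene render
        t_render = (i // ticks_per_render) * ticks_per_render / warp_hz
        yaw_now = head_speed_deg_s * t_warp
        yaw_rendered = head_speed_deg_s * t_render
        pan_deg = yaw_now - yaw_rendered    # pan applied to the old image
        presented = yaw_rendered + pan_deg  # yaw the user actually sees
        max_pan = max(max_pan, pan_deg)
        max_err = max(max_err, abs(presented - yaw_now))
    return max_pan, max_err
```

With these numbers the pan never exceeds 2.25 degrees, comfortably inside the ~8 degree FOV margin, and the presented yaw matches the true yaw at every 120 Hz tick.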

Ideally, this whole process could run in hardware on the Oculus Rift itself. That would require hardware able to apply the distortion and to accept a render target larger than the screen, along with the orientation that target was rendered at.

Alternatively, doing this on the GPU is quite tricky: running parallel updates alongside the normal scene render isn't possible on any GPUs I'm aware of, since it would require a separate render pipeline. A simpler application of the idea would be to apply this technique only at the distortion step of the current frame. That would reduce tracking latency by a little less than one frame (~16 ms-30 ms).
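The "a little less than one frame" figure works out as follows. A rough sketch; the 2 ms warp cost is an assumption standing in for however long the re-sample plus distortion pass actually takes:

```python
# Sketch: latency saved by re-sampling head orientation at the distortion
# pass (end of frame) rather than at the start of the frame.
def latency_saved_ms(frame_rate_hz, warp_cost_ms=2.0):
    """Frame time minus the (assumed) cost of the warp pass itself."""
    frame_ms = 1000.0 / frame_rate_hz
    return frame_ms - warp_cost_ms

print(round(latency_saved_ms(60), 1))  # ~14.7 ms at 60 fps
print(round(latency_saved_ms(30), 1))  # ~31.3 ms at 30 fps
```

That is, the slower the game runs, the more this trick recovers, which is exactly why it would decouple perceived head-tracking smoothness from the game's frame rate.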

Is my thinking correct here? Anyone see any reasons why this couldn't work?

22 Replies