Disabling Oculus tracking/rotation
Hey, I'm aware that this question has been asked quite a lot (https://forums.oculus.com/community/discussion/40288/is-it-possible-to-disable-all-tracking, https://forums.oculus.com/community/discussion/comment/242286), but everything I've found has been for Unity, whereas I'm working with the C++ API (using a CV1 and the newest SDK). Ideally I'd like some function (something like an ovr_DisableTracking) that I could call at initialization to stop the Rift from responding to gyroscope, accelerometer, and other sensor readings. Basically, I want to turn the Rift into a head-mounted dumb screen. I've tried a separate thread that calls ovr_SubmitFrame as frequently as possible, but the occasional frame drop or context switch makes the image get out of sync with the actual HMD orientation, producing black bars. I'd also prefer not to run an unnecessary thread. The reason I want this is that I'll have a camera mounted directly on the front of the headset piping into the screen, which makes any orientation readings redundant, since the camera moves with the user's head anyway. Any recommendations or code samples would be fantastic. Thanks.
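One possible approach, sketched here under the assumption of a LibOVR 1.x setup (untested against hardware): quad layers support the ovrLayerFlag_HeadLocked flag, which makes the layer's pose be interpreted relative to the HMD rather than the tracking origin. The image then stays fixed in front of the user's eyes regardless of head motion, which is effectively the "dumb screen" behavior without disabling the tracker. The names session, cameraSwapChain, camWidth, camHeight, and frameIndex are assumed to come from the usual SDK initialization and are not defined here.

```cpp
// Sketch: show the camera feed on a head-locked quad layer (LibOVR 1.x).
// ovrLayerFlag_HeadLocked makes QuadPoseCenter relative to the HMD, so the
// quad follows the head and tracking no longer affects what the user sees.
ovrLayerQuad layer = {};
layer.Header.Type  = ovrLayerType_Quad;
layer.Header.Flags = ovrLayerFlag_HeadLocked;        // pose relative to HMD
layer.ColorTexture = cameraSwapChain;                // chain holding the camera frame
layer.Viewport.Pos  = { 0, 0 };
layer.Viewport.Size = { camWidth, camHeight };
layer.QuadPoseCenter.Position    = { 0.0f, 0.0f, -1.0f };   // 1 m in front of the eyes
layer.QuadPoseCenter.Orientation = { 0.0f, 0.0f, 0.0f, 1.0f };
layer.QuadSize = { 1.6f, 0.9f };                     // metres; tune to the camera FOV

ovrLayerHeader* layers[] = { &layer.Header };
ovr_SubmitFrame(session, frameIndex, nullptr, layers, 1);
```

Since the layer is head-locked, late frames or dropped submits should no longer produce black bars at the edges; the compositor keeps the quad glued to the view.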
Reducing delays: AR use case
I've seen a number of posts about using the Rift for augmented reality projects, and I have an issue in this area as well. I'm concerned about end-to-end latency: from the moment a "photon" hits the video camera until the image is presented to the user in the Rift. In my measurements I see delays of 100-120 ms, which is far longer than acceptable. When I present the same camera image on a monitor, I measure ~50 ms of delay (which makes sense for the time it takes to grab an image, transfer it to the PC over USB, and process it), but I'm trying to eliminate the extra 50-70 ms added on the Oculus side. Any ideas? Is there a way to reduce the length of the swap chain created by ovr_CreateTextureSwapChainDX from the default 3 to 1? Maybe that would help.
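A hedged sketch of one pattern that may shave off queued frames: ovrTextureSwapChainDesc has no length field, so the chain depth isn't directly settable, but ovr_SubmitFrame uses the most recently committed texture in the chain. Copying the newest camera frame into the current buffer and committing as late as possible, right before submit, avoids queuing stale frames. The names session, swapChain, deviceContext, layers, and frameIndex are assumed from a standard D3D11 setup; this is untested.

```cpp
// Sketch: commit the camera frame as late as possible before ovr_SubmitFrame
// so the compositor consumes the newest image rather than a queued older one.
int index = 0;
ovr_GetTextureSwapChainCurrentIndex(session, swapChain, &index);

ID3D11Texture2D* dst = nullptr;
ovr_GetTextureSwapChainBufferDX(session, swapChain, index, IID_PPV_ARGS(&dst));
// Copy the latest camera frame into dst here, e.g.:
// deviceContext->CopyResource(dst, latestCameraFrameTexture);
dst->Release();

ovr_CommitTextureSwapChain(session, swapChain);   // commit immediately before submit
ovr_SubmitFrame(session, frameIndex, nullptr, layers, 1);
```

Profiling the gap between the camera-frame arrival and the ovr_CommitTextureSwapChain call would show how much of the extra 50-70 ms is queueing versus compositor overhead.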
How to implement correct aspect ratio camera passthrough in Oculus
Hi, I'm trying to implement camera passthrough, and I need it to be as close to natural eyesight as possible. I know I need webcams with roughly the same FOV as the Rift (~100 degrees), mounted as close to the user's eyes as possible. What confuses me is how far away in the VR world the captured webcam image should be rendered. If I render it too far away, the passthrough is just a small rectangle; if too close, the entire image doesn't fit in the FOV. How do I determine the right render distance so that the FOV of the webcam matches the FOV of the Rift and the passthrough looks natural to the eye? Is calibration the only solution? If so, can anyone point me to how such a calibration might work? Thank you.