Object Tracking with Oculus SDK
Hi, I asked myself whether it would be possible to track more objects than just the headset and the Oculus controllers with the SDK. I found that both have integrated infrared LEDs with a specific blinking pattern, which is then tracked by the cameras. Now my question is: is it possible to add LEDs with other blinking patterns to different devices, e.g. a tablet, and somehow teach the cameras those patterns, to enable object tracking and display the object in a Unity scene? And further, is there some way to do this via the Oculus SDK, or is it more of a black box, so that it has to be done via a video stream and your own image processing? Maybe somebody has an idea and could help me! Anyway, thank you in advance! :smile:
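
As far as I can tell, the PC SDK treats Constellation as a black box: there is no public API for registering new LED blink patterns or for reading the raw IR video, so a tablet would need your own camera and computer-vision pipeline, or a tracked device rigidly attached to it (e.g. a spare Touch controller whose pose you read from ovrTrackingState.HandPoses). What the SDK does expose about the cameras is their count, pose, and frustum. A minimal sketch of that, assuming LibOVR 1.x:

```cpp
// Sketch: enumerate the tracking sensors and their frusta via LibOVR 1.x.
// The raw IR camera stream and the LED blink patterns are not exposed here.
#include <OVR_CAPI.h>
#include <cstdio>

int main() {
    if (!OVR_SUCCESS(ovr_Initialize(nullptr))) return 1;
    ovrSession session; ovrGraphicsLuid luid;
    if (!OVR_SUCCESS(ovr_Create(&session, &luid))) { ovr_Shutdown(); return 1; }

    unsigned int count = ovr_GetTrackerCount(session);
    for (unsigned int i = 0; i < count; ++i) {
        ovrTrackerDesc desc = ovr_GetTrackerDesc(session, i);
        ovrTrackerPose pose = ovr_GetTrackerPose(session, i);
        std::printf("sensor %u: HFOV %.2f rad, range %.2f-%.2f m, pos (%.2f, %.2f, %.2f)\n",
                    i, desc.FrustumHFovInRadians,
                    desc.FrustumNearZInMeters, desc.FrustumFarZInMeters,
                    pose.Pose.Position.x, pose.Pose.Position.y, pose.Pose.Position.z);
    }

    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```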

What is the unit of HeadPose.ThePose.Position?
Hey, I have the tracking data from HeadPose.ThePose.Position and calculated the 3D distance from the x, y, and z components, then summed it up to get the distance a user covered in a fixed time interval. Can you tell me which unit I will get? It does not seem to be mm, but is it cm or meters? Thank you
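
For what it's worth, LibOVR reports positional tracking in meters, so the summed per-sample distances come out in meters. A minimal sketch of the accumulation, assuming the 1.x C API (HeadPose.ThePose.Position is the same field you are using):

```cpp
// Sketch: accumulate head travel distance. LibOVR positions are in meters.
#include <OVR_CAPI.h>
#include <chrono>
#include <cmath>
#include <cstdio>
#include <thread>

int main() {
    if (!OVR_SUCCESS(ovr_Initialize(nullptr))) return 1;
    ovrSession session; ovrGraphicsLuid luid;
    if (!OVR_SUCCESS(ovr_Create(&session, &luid))) { ovr_Shutdown(); return 1; }

    double total = 0.0;      // meters
    ovrVector3f prev = {};   // previous sample
    bool havePrev = false;

    for (int i = 0; i < 1000; ++i) {
        ovrTrackingState ts =
            ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrFalse);
        if (ts.StatusFlags & ovrStatus_PositionTracked) {
            ovrVector3f p = ts.HeadPose.ThePose.Position;
            if (havePrev) {
                double dx = p.x - prev.x, dy = p.y - prev.y, dz = p.z - prev.z;
                total += std::sqrt(dx * dx + dy * dy + dz * dz);
            }
            prev = p; havePrev = true;
        }
        // ~60 Hz here; in a real app, sample once per rendered frame instead.
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    std::printf("head travelled %.3f m\n", total);

    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```

One caveat: summing raw sample-to-sample deltas also integrates tracking jitter, so the total creeps upward even for a motionless user; a small minimum-movement threshold per step helps.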

Where's the headset and controller origin point?
What position, specifically, on the physical headset and Touch controllers is reported by ovr_GetTrackingState? From the docs: "The SDK reports a rough model of the user's head in space based on a set of points and vectors. The model is defined around an origin point, which should be centered approximately at the pivot point of the user's head and neck when they are sitting up in a comfortable position in front of the camera." However, the eye offsets provided by ovr_GetRenderDesc only have a nonzero offset in the x-direction, which doesn't jibe with the above description. What physical points do ovrTrackingState.HeadPose and ovrTrackingState.HandPoses actually correspond to on the devices?
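
One way to probe this is to dump the offsets yourself: x-only values are consistent with the head pose origin sitting at the midpoint between the two eyes rather than at a neck pivot (the neck-model wording appears to come from older, DK-era docs). A minimal sketch, assuming the 1.x C API; note the offset field is HmdToEyeOffset in some SDK versions and HmdToEyePose (an ovrPosef) in later ones, so adjust for your headers:

```cpp
// Sketch: print the per-eye offsets relative to the head pose origin.
// Field-name caveat: some SDK versions use HmdToEyeOffset (ovrVector3f),
// later ones HmdToEyePose (ovrPosef) -- check your OVR_CAPI.h.
#include <OVR_CAPI.h>
#include <cstdio>

int main() {
    if (!OVR_SUCCESS(ovr_Initialize(nullptr))) return 1;
    ovrSession session; ovrGraphicsLuid luid;
    if (!OVR_SUCCESS(ovr_Create(&session, &luid))) { ovr_Shutdown(); return 1; }

    ovrHmdDesc hmd = ovr_GetHmdDesc(session);
    for (int eye = 0; eye < 2; ++eye) {
        ovrEyeRenderDesc desc =
            ovr_GetRenderDesc(session, (ovrEyeType)eye, hmd.DefaultEyeFov[eye]);
        ovrVector3f o = desc.HmdToEyeOffset;  // HmdToEyePose.Position on newer SDKs
        std::printf("eye %d offset: (%.4f, %.4f, %.4f) m\n", eye, o.x, o.y, o.z);
    }

    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```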

Tracking the Rift's position and orientation about a fixed origin (relative to the camera)
I'm working on a project (in Unity 5) which requires absolute position and orientation tracking relative to the camera. This is in combination with Leap VR hand tracking and some physical equipment (fixed relative to the camera) which the user interacts with while in VR. The runtime clearly knows the information I need - is there any way of extracting the transformation from camera-tracking space to the tracking space presented by the runtime? What I've got at the moment is a kind of reverse calibration which 'corrects' the tracking transformation so that the camera position and orientation are fixed in the scene, but that seems archaic and doesn't work properly when the camera is pitched up or down.
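
The LibOVR C API exposes exactly this transform: ovr_GetTrackerPose returns the sensor's pose in the runtime's tracking space, so inverting it re-expresses any tracked pose relative to the camera. A minimal sketch, assuming SDK 1.x (the Unity integration surfaces the same data, but the math is clearer in the C API); the quaternion helpers are hand-rolled here to stay self-contained:

```cpp
// Sketch: re-express the headset pose in the tracking camera's frame:
//   relativePose = inverse(cameraPose) * headPose
#include <OVR_CAPI.h>

// Hamilton product a * b (ovrQuatf fields are x, y, z, w).
static ovrQuatf Mul(const ovrQuatf& a, const ovrQuatf& b) {
    return { a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
             a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
             a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
             a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z };
}

// Rotate v by unit quaternion q: v' = v + w*t + cross(q.xyz, t), t = 2*cross(q.xyz, v).
static ovrVector3f Rotate(const ovrQuatf& q, const ovrVector3f& v) {
    ovrVector3f t = { 2.f * (q.y * v.z - q.z * v.y),
                      2.f * (q.z * v.x - q.x * v.z),
                      2.f * (q.x * v.y - q.y * v.x) };
    return { v.x + q.w * t.x + q.y * t.z - q.z * t.y,
             v.y + q.w * t.y + q.z * t.x - q.x * t.z,
             v.z + q.w * t.z + q.x * t.y - q.y * t.x };
}

// Inverse of a rigid pose: (q, p)^-1 = (q^-1, -(q^-1 * p)).
static ovrPosef Inverse(const ovrPosef& p) {
    ovrPosef inv;
    inv.Orientation = { -p.Orientation.x, -p.Orientation.y,
                        -p.Orientation.z,  p.Orientation.w };
    ovrVector3f t = Rotate(inv.Orientation, p.Position);
    inv.Position = { -t.x, -t.y, -t.z };
    return inv;
}

// Compose a * b: apply b in a's frame.
static ovrPosef Compose(const ovrPosef& a, const ovrPosef& b) {
    ovrPosef out;
    out.Orientation = Mul(a.Orientation, b.Orientation);
    ovrVector3f t = Rotate(a.Orientation, b.Position);
    out.Position = { a.Position.x + t.x, a.Position.y + t.y, a.Position.z + t.z };
    return out;
}

// Head pose expressed relative to sensor 0 (assumed to be your fixed camera).
ovrPosef HeadRelativeToCamera(ovrSession session) {
    ovrTrackerPose camera = ovr_GetTrackerPose(session, 0);
    ovrTrackingState ts =
        ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrFalse);
    return Compose(Inverse(camera.Pose), ts.HeadPose.ThePose);
}
```

Because the result no longer depends on where the runtime happens to place its tracking origin, equipment that is physically fixed relative to the camera stays fixed in this frame, including when the camera is pitched up or down.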

Is there a way of accessing camera tracking data on the CV1?
Sorry if this has been asked before. I searched, but didn't quite find the answer I needed. Okay, so, I made a decal that covers the entire front of my Rift, and the material is definitely translucent enough that at least *some* of the Constellation IR is getting through, as I'm still being tracked with it on. I did, however, notice myself getting a little dizzy in an app I don't normally notice any problems with (I don't remember which one now), but didn't really see any visible problems. I took the decal back off, and I *think* the problems went away (if they were even there to begin with). I can test whether the camera can still see the front Constellation LEDs through the decal by just putting my hands in front of the Rift: hands in front, no positional tracking (rotational-only); no hands in front, even with the decal on, positional and rotational tracking. Anyway, I'd like to find some way of quantifying the tracking data for testing: brightness of the Constellation LEDs, whether the camera is losing track of any of them (or whether lower brightness is causing dropped tracking frames), etc. Any advice?
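
Per-LED brightness isn't exposed by the public SDK as far as I know, but you can quantify dropouts by polling the tracking status flags and counting transitions, once with the decal on and once with it off. A minimal sketch, assuming the 1.x C API:

```cpp
// Sketch: log positional-tracking dropouts over a fixed test window.
#include <OVR_CAPI.h>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    if (!OVR_SUCCESS(ovr_Initialize(nullptr))) return 1;
    ovrSession session; ovrGraphicsLuid luid;
    if (!OVR_SUCCESS(ovr_Create(&session, &luid))) { ovr_Shutdown(); return 1; }

    int samples = 0, lostSamples = 0, dropouts = 0;
    bool wasTracked = true;
    double start = ovr_GetTimeInSeconds();

    while (ovr_GetTimeInSeconds() - start < 60.0) {   // 60 s test window
        ovrTrackingState ts =
            ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrFalse);
        bool tracked = (ts.StatusFlags & ovrStatus_PositionTracked) != 0;
        ++samples;
        if (!tracked) ++lostSamples;
        if (wasTracked && !tracked) ++dropouts;       // count each loss event once
        wasTracked = tracked;
        std::this_thread::sleep_for(std::chrono::milliseconds(10));  // ~100 Hz
    }
    std::printf("%d/%d samples lost position tracking, %d dropout events\n",
                lostSamples, samples, dropouts);

    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```

Stretches where ovrStatus_OrientationTracked is set but ovrStatus_PositionTracked is clear correspond to the rotational-only fallback you're describing, so comparing the dropout counts between the two runs should tell you whether the decal is attenuating the LEDs enough to matter.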