Forum Discussion
molton (Explorer)
11 years ago
How do I access sensor fusion data in 0.4.3?
I can't figure out how to use sensor fusion in the new SDK. (It was GetOrientation(quaternion orientation) and GetPredictedOrientation(quaternion orientation) in 0.2.5, and GetCameraPositionOrientation(vector3 position, quaternion orientation) in 0.3.2 and 0.4.2.)
Does anybody know the 0.4.3 equivalent to this?
edit: I think I found it in the Developer guide
ovrHmd_ConfigureTracking(hmd, ovrTrackingCap_Orientation | ovrTrackingCap_MagYawCorrection | ovrTrackingCap_Position, 0);
ovrTrackingState ts = ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());
and
Posef pose = ts.HeadPose.ThePose;
float yaw, eyePitch, eyeRoll;
pose.Orientation.GetEulerAngles<Axis_Y, Axis_X, Axis_Z>(&yaw, &eyePitch, &eyeRoll);
should do the trick for orientation, here's a bit on how it works on the next page
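For reference, the Y-X-Z extraction that GetEulerAngles performs can be written out by hand. A minimal self-contained sketch (the Quat struct here is a hypothetical stand-in for the SDK's Quatf, assuming a unit quaternion):

```cpp
#include <cmath>

// Stand-in for the SDK's Quatf (w + xi + yj + zk).
struct Quat { float w, x, y, z; };

// Yaw (about Y), then pitch (about X), then roll (about Z),
// mirroring GetEulerAngles<Axis_Y, Axis_X, Axis_Z> for a unit quaternion.
void GetYawPitchRoll(const Quat& q, float* yaw, float* pitch, float* roll) {
    *yaw = std::atan2(2.0f * (q.x * q.z + q.w * q.y),
                      1.0f - 2.0f * (q.x * q.x + q.y * q.y));
    // Clamp before asin to guard against float drift outside [-1, 1].
    float sp = 2.0f * (q.w * q.x - q.y * q.z);
    sp = sp > 1.0f ? 1.0f : (sp < -1.0f ? -1.0f : sp);
    *pitch = std::asin(sp);
    *roll = std::atan2(2.0f * (q.x * q.y + q.w * q.z),
                       1.0f - 2.0f * (q.x * q.x + q.z * q.z));
}
```

For example, a 90-degree rotation about Y, i.e. Quat{0.7071f, 0, 0.7071f, 0}, should come back as yaw ≈ pi/2 with pitch and roll ≈ 0.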
This can be modified using the API call ovrHmd_RecenterPose, which resets the tracking origin to the headset's current location and sets the yaw origin to the current headset yaw value. Note that the tracking origin is set on a per-application basis, so switching focus between different VR apps will also switch the tracking origin. Determining the head pose is done by calling ovrHmd_GetTrackingState. The returned struct ovrTrackingState contains several items relevant to position tracking:
1) HeadPose includes both head position and orientation.
2) CameraPose is the pose of the camera relative to the tracking origin.
3) LeveledCameraPose is the pose of the camera relative to the tracking origin, but with roll and pitch zeroed out. It can be used as a reference point to render real-world objects in the correct place.
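Since a recenter just resets the tracking origin, its effect can be imitated in application code as well. A minimal sketch of the idea, assuming a position-plus-yaw-only recenter (pitch and roll stay gravity-referenced); the Vec3 and Recenterer types are hypothetical stand-ins, not SDK types:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// On Recenter(), capture the current position and yaw; afterwards,
// Apply() reports poses relative to that captured origin.
struct Recenterer {
    Vec3  originPos{0.0f, 0.0f, 0.0f};
    float originYaw = 0.0f;

    void Recenter(const Vec3& pos, float yaw) {
        originPos = pos;
        originYaw = yaw;
    }

    // Subtract the origin position, rotate the delta by -originYaw
    // about the Y axis, and subtract the yaw offset.
    void Apply(const Vec3& pos, float yaw, Vec3* outPos, float* outYaw) const {
        float dx = pos.x - originPos.x;
        float dy = pos.y - originPos.y;
        float dz = pos.z - originPos.z;
        float c = std::cos(-originYaw), s = std::sin(-originYaw);
        *outPos = { c * dx + s * dz, dy, -s * dx + c * dz };
        *outYaw = yaw - originYaw;
    }
};
```

Recentering at the current pose and then applying that same pose yields a zero position and zero yaw, which is the behavior the guide describes for ovrHmd_RecenterPose.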
The StatusFlags variable contains three status bits relating to position tracking. ovrStatus_PositionConnected is set when the position tracking camera is connected and functioning properly. The ovrStatus_PositionTracked flag is set only when the headset is being actively tracked. ovrStatus_CameraPoseTracked is set after the initial camera calibration has taken place.
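Those bits are plain bitmask flags, so testing them is a single AND. A hedged sketch; the numeric values below are assumptions based on the 0.4-era OVR_CAPI.h, so verify them against your SDK headers:

```cpp
// Assumed flag values for illustration; the real constants live in OVR_CAPI.h.
enum {
    ovrStatus_OrientationTracked = 0x0001,
    ovrStatus_PositionTracked    = 0x0002,
    ovrStatus_CameraPoseTracked  = 0x0004,
    ovrStatus_PositionConnected  = 0x0020,
};

// True only when the camera is plugged in AND actively tracking the headset,
// i.e. when positional data is worth trusting.
bool HasReliablePosition(unsigned int statusFlags) {
    const unsigned int required = ovrStatus_PositionConnected | ovrStatus_PositionTracked;
    return (statusFlags & required) == required;
}
```

This is the kind of check discussed later in the thread: when the camera loses sight of the headset, PositionConnected stays set but PositionTracked drops, so a combined test lets an app fall back to orientation-only data.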
5 Replies
- molton (Explorer):
In Unity there's OVRPose. Does anybody know a C# version of the C++ Posef function they mention, or another way to extract that yaw/pitch data from OVRPose? Thanks.
edit: OK, I think I got it now:
OVRPose BirdPose = OVRManager.tracker.GetPose(0.0f);
Quaternion birdOrientation = BirdPose.orientation;
I'm pretty sure that's how it's done now.
edit: No, this doesn't quite work either. It runs, though, so I guess OVRManager.tracker.GetPose(0.0f) is wrong. I just noticed the GetEyePose function in the PlayerController script; that looks like what I'm looking for.
- sh0v0r (Protege):
"molton" wrote:
In Unity there's OVRPose, does anybody know a c# version of the c++ posef function they mention or another way to extract that yaw, pitch data from OVRPose? Thanks.
edit: ok i think i got it now
OVRPose BirdPose = OVRManager.tracker.GetPose(0.0f);
quaternion birdOrientation = BirdPose.orientation;
I'm pretty sure that's how it's done now
edit:no this doesn't quite work either, it runs though, I guess OVRManager.tracker.GetPose(0.0f); is wrong
i just noticed the geteyepose function in the playercontroller script, that looks like what I'm looking for
I want to know how to get the absolute camera position now, since there used to be a function for that. I'm guessing I should debug-log the pose info...
- molton (Explorer):
"sh0v0r" wrote:
I want to know how to get the absolute camera position now since there used to be a function for that. I'm guessing I should debug log the pose info...
Yeah, I guess that would be better than simply using the sensor fusion data, since if the camera loses tracking the numbers are usually off while the head-tracking motion is still good. This little hiccup will be good for my project, as it now forces me to go this route, assuming that is still possible. Is there a way to just grab the camera rig's rotational data, or is that probably a waste of processing compared to the old method?
- vrdaveb (Oculus Staff):
There are a few functions you may be interested in:
1) OVRManager.display.GetEyePose(eye): Returns the render pose for the given eye, predicted to the time of the next scan-out.
2) OVRManager.display.GetHeadPose(predictionTime): Returns the head pose. This is the sensor fusion data. It comes from the pure C# wrapper OVRManager.capiHmd.GetTrackingState, which you can also use.
3) OVRManager.tracker.GetPose(predictionTime): Returns the pose of the infrared tracking camera relative to the OVRCameraRig.
- molton (Explorer):
"vrdaveb" wrote:
...2) OVRManager.display.GetHeadPose(predictionTime): Returns the head pose. This is the sensor fusion data. It comes from the pure C# wrapper OVRManager.capiHmd.GetTrackingState, which you can also use...
That's the one I was looking for, thanks.