Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
BananaHavoc
Explorer
9 years ago

Is Toggling Positional Tracking Broken Again?

Hey all,

I've upgraded our VR project to run on the 1.3.2 Oculus Utilities, but it appears that using an Oculus camera without positional tracking is broken again. Is this a known bug?

Steps to reproduce:
  1. Open a new Unity project (I used 5.3.5f1 as that is the current suggested version)
  2. Import the Oculus Utilities 1.3.2
  3. Add the "OVRCameraRig" to the scene, along with a cube and/or plane near the camera
  4. Disable "UsePositionalTracking" in OVRManager
  5. Enable "Virtual Reality Supported" in Unity's player settings

When you build and run the scene, you'll see that the camera is in a different spot than it was in the editor. If you tilt your head, your eyes won't be able to converge on any object, which is very uncomfortable for the viewer. With positional tracking enabled, your eyes converge correctly.

Does anyone know of any workarounds, or things I might be doing wrong? I'd really like to support the CV1.
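(For reference, I'm unchecking the flag in the inspector, but the same thing can be done from script. A minimal sketch, assuming OVRManager's public usePositionalTracking field from Utilities 1.3.x:)

  using UnityEngine;

  // Repro helper: disables positional tracking at startup,
  // mirroring step 4 above but from code instead of the inspector.
  public class DisablePositionalTracking : MonoBehaviour
  {
      void Start()
      {
          if (OVRManager.instance != null)
              OVRManager.instance.usePositionalTracking = false;
      }
  }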

6 Replies

Replies have been turned off for this discussion
  • vrdaveb
    Oculus Staff
    As of 1.3 (Unity 5.3.4p5 and up), OVRManager.usePositionalTracking disables all positional tracking. In the past, it left the head model and IPD intact, which would screw up 360 video and other use cases where you really need the camera to stop translating with head motion. If you want to restore the old behavior, you can write a script like the one below, which adjusts the tracking space each frame to achieve the desired head pose.

    using UnityEngine;
    using System.Collections;

    public class FakeTracking : MonoBehaviour
    {
        public OVRPose centerEyePose = OVRPose.identity;
        public OVRPose leftEyePose = OVRPose.identity;
        public OVRPose rightEyePose = OVRPose.identity;
        public OVRPose leftHandPose = OVRPose.identity;
        public OVRPose rightHandPose = OVRPose.identity;
        public OVRPose trackerPose = OVRPose.identity;

        void Awake()
        {
            OVRCameraRig rig = GameObject.FindObjectOfType<OVRCameraRig>();

            if (rig != null)
                rig.UpdatedAnchors += OnUpdatedAnchors;
        }

        void OnUpdatedAnchors(OVRCameraRig rig)
        {
            if (!enabled)
                return;

            // This doesn't work because VR camera poses are read-only:
            // rig.centerEyeAnchor.FromOVRPose(OVRPose.identity);

            // Instead, invert out the current pose and multiply in the desired pose.
            OVRPose pose = rig.centerEyeAnchor.ToOVRPose(true).Inverse();
            pose = centerEyePose * pose;
            rig.trackingSpace.FromOVRPose(pose, true);

            // The rest of the nodes are updated by OVRCameraRig, not Unity, so they're easy.
            rig.leftEyeAnchor.FromOVRPose(leftEyePose);
            rig.rightEyeAnchor.FromOVRPose(rightEyePose);
            rig.leftHandAnchor.FromOVRPose(leftHandPose);
            rig.rightHandAnchor.FromOVRPose(rightHandPose);
            rig.trackerAnchor.FromOVRPose(trackerPose);
        }
    }

  • Thanks for the reply Dave, I really appreciate it.

    I modified that script you provided to maintain the orientation and zero out the position, which allows my eyes to correctly converge on objects. This works, but rotations feel a bit "off", as if my eyes were not rotating about the correct axis. I'm planning on looking through one of the old SDKs to see how I should modify the position to correspond to rotation.

    Do you have any suggestions as to how to modify the position with rotation to properly emulate a head model? Any advice you have would be a huge help.
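    (For context, here's roughly what my modification looks like — a sketch placed at the top of OnUpdatedAnchors, keeping only the rotation; the OVRPose position/orientation fields are from the Utilities package:)

    // Keep the head's current rotation but cancel out its translation.
    var rotation = rig.centerEyeAnchor.rotation;
    centerEyePose = new OVRPose { position = Vector3.zero, orientation = rotation };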
  • vrdaveb
    Oculus Staff
    Write an Update function in one of your scripts or add something like the following to the beginning of FakeTracking.OnUpdatedAnchors:

    float OVR_DEFAULT_NECK_TO_EYE_HORIZONTAL = 0.0805f;
    float OVR_DEFAULT_NECK_TO_EYE_VERTICAL = 0.075f;
    var position = new Vector3(0f, OVR_DEFAULT_NECK_TO_EYE_VERTICAL, OVR_DEFAULT_NECK_TO_EYE_HORIZONTAL);
    var rotation = rig.centerEyeAnchor.rotation;
    centerEyePose = new OVRPose { position = rotation * position, orientation = rotation };

  • Yea, I realize that.
    I'm thinking that to properly handle head rotations, there needs to be some deviation from (0,0,0) during a rotation (correct me if this is wrong). Do you have any suggestions for what the relationship between rotation and position should be?
  • vrdaveb
    Oculus Staff
    Sorry, the forum ate my previous post. Please see the new, edited version.
  • I think that did the trick!
    Thank you so much for taking time out of your day to help with this.