Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
aiqingyuyan
Honored Guest
10 years ago

Is it possible to get the orientation relative to the north?

Hi, I am wondering whether it's possible to get orientation data relative to north from the API. I went through the C API and got somewhat lost.

Does anyone know if it's possible to pull out that data? Or is there another way to do it, such as using the raw sensor data to recompute the heading relative to north?

Could anyone help me with this?

Many thanks!

2 Replies

  • lamour42
    Expert Protege
    Hi,

    After reading the sensor data and adapting it to your local coordinate system, you have a transformation matrix. You can then apply this transformation to any vector you want; just use a vector pointing to whatever north means to you.

    For example, I define north to be looking straight down the negative z-axis (I use a left-handed coordinate system), so my 'north' vector is (0, 0, -1).

    You may want to look at the relevant method of my camera class. Some details concerning other classes of my framework are missing, but I think you can get the idea. Look at the calculation of 'finalForward' in the code.

    void Camera::recalcOVR(XApp &xapp) {
        // Now do it the Oculus way:
        // get both eye poses simultaneously, with IPD offset already included.
        ovrVector3f useHmdToEyeViewOffset[2] = { xapp.EyeRenderDesc[0].HmdToEyeViewOffset,
                                                 xapp.EyeRenderDesc[1].HmdToEyeViewOffset };
        ovrPosef temp_EyeRenderPose[2];
        ovrHmd_GetEyePoses(xapp.hmd, 0, useHmdToEyeViewOffset, temp_EyeRenderPose, NULL);

        for (int eye = 0; eye < 2; eye++)
        {
            ovrPosef *useEyePose = &xapp.EyeRenderPose[eye];
            float *useYaw = &xapp.YawAtRender[eye];
            float Yaw = XM_PI;
            *useEyePose = temp_EyeRenderPose[eye];
            *useYaw = Yaw;

            // Get view and projection matrices (note near Z to reduce eye strain)
            Matrix4f rollPitchYaw = Matrix4f::RotationY(Yaw);
            Matrix4f finalRollPitchYaw = rollPitchYaw * Matrix4f(useEyePose->Orientation);
            // Fix finalRollPitchYaw for the left-handed coordinate system:
            Matrix4f s = Matrix4f::Scaling(1.0f, -1.0f, -1.0f);
            finalRollPitchYaw = s * finalRollPitchYaw * s;

            Vector3f finalUp = finalRollPitchYaw.Transform(Vector3f(0, 1, 0));
            Vector3f finalForward = finalRollPitchYaw.Transform(Vector3f(0, 0, -1));
            Vector3f Posf;
            Posf.x = pos.x;
            Posf.y = pos.y;
            Posf.z = pos.z;
            Vector3f diff = rollPitchYaw.Transform(useEyePose->Position);
            Vector3f shiftedEyePos;
            shiftedEyePos.x = Posf.x - diff.x;
            shiftedEyePos.y = Posf.y + diff.y;
            shiftedEyePos.z = Posf.z + diff.z;
            look.x = finalForward.x;
            look.y = finalForward.y;
            look.z = finalForward.z;

            Matrix4f view = Matrix4f::LookAtLH(shiftedEyePos, shiftedEyePos + finalForward, finalUp);
            Matrix4f projO = ovrMatrix4f_Projection(xapp.EyeRenderDesc[eye].Fov, 0.2f, 2000.0f, false);
            Matrix4fToXM(this->viewOVR[eye], view.Transposed());
            Matrix4fToXM(this->projOVR[eye], projO.Transposed());
        }
    }

  • If you are referring to compass directions, as in Earth's magnetic north, then no: the SDK doesn't supply that type of sensor fusion.