Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Jossos2
Honored Guest
11 years ago

Need help with SDK with opengl c++

Hello. I'm trying to build my own game engine and implement Oculus Rift support. I have rotations and translations coming in, which is awesome, but I'm still having a bit of trouble with a few things, which I will list below. Any help is greatly appreciated.

1.
I want to create the distorted eye mesh that the image is mapped onto. How do I do this? I actually want to write the loop that creates it. I looked through the source code but couldn't find where the vertices are defined. I did see something that seems to specify the UV coordinates for each RGB value - is this right? Does each vertex have three UV pairs, one per colour channel? Also, what is the function that generates the vertex positions? I assume the UV coords will just be 1.0 - (currentVert / vertsAcross), and then shifted for each value. I want to generate this myself because the source code appears to regenerate the mesh every frame. I could be wrong, but that's what it looked like, and I'd rather generate the mesh once at startup and reuse it.

2.
I'm having some trouble with rotations. On the x axis, the headset tracks fine until it hits a certain angle, and then the rotation reverses back. The y and z axes also seem a bit off, probably because my converter is wrong.

I'm using this to get the angles - it might be what's wrong, but it does give some workable results:

void VRDATA::QuaternionToDegrees(const ovrQuatf& q, VECTOR3f& rotData)
{
    const float radToDegree = 57.2957795f;

    rotData.y = atan2(2.0f * (q.x * q.w + q.y * q.z), 1.0f - 2.0f * (q.z * q.z + q.w * q.w)) * radToDegree;
    // Note: asin only ever returns values in [-90, 90] degrees, so this
    // component will appear to "reverse" once the head rotates past that range.
    rotData.x = asin(2.0f * (q.x * q.z - q.w * q.y)) * radToDegree;
    rotData.z = atan2(2.0f * (q.x * q.y + q.z * q.w), 1.0f - 2.0f * (q.y * q.y + q.z * q.z)) * radToDegree;
}

And I use it like so:

VECTOR3f rotation;
ovrTrackingState trackingState = ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());
QuaternionToDegrees(trackingState.HeadPose.ThePose.Orientation, rotation);

3.
This is more of a non-Rift problem, but I figure I'll ask anyway.
I have rotate and translate functions (not legacy OpenGL; I rewrote them for the modern pipeline), and I'm wondering what order I should apply them in. I do all the rotations first and the translations last. Is this right?

4.
How do I get a float value for the FOV? I'm having trouble with this one for some reason.


Overall, I'm really just wanting to use the sdk as a means of grabbing the positions/rotations, and then having the rest be my own code, for reasons.

Thanks!!!

(Sorry, 'code' button not working?)

6 Replies

  • 1. The function to get the distortion mesh vertices is ovrHmd_CreateDistortionMesh. If you really want to create your own distortion mesh instead of letting the SDK do the distortion, you should look at the Oculus Room Tiny sample, which includes code for both SDK-side and client-side distortion.
    2. Using Euler angles for maintaining rotations is filled with peril. Use quaternions or matrices to store and compose rotations.
    3. If you want to compose a 4x4 matrix out of a rotation and a translation, then yes, the order in which they are applied depends on how your values should be interpreted. If you have a rotation that turns you left and a translation that moves you back along the +Z axis, then applying the rotation and then the translation will leave you looking at the origin of the scene from the +X axis. More typically, you apply the translation first and then the rotation.
    4. Fetch the DefaultEyeFov values out of the ovrHmd structure.


    Also you could buy my book on Oculus Rift development: http://manning.com/bdavis
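For the plain grid part of (1), a uniform vertex/UV grid over the eye quad can be generated once at startup and reused. This is only an illustrative sketch (the struct and function names are made up); the three per-channel distorted UV pairs the SDK uses for chromatic aberration correction still have to come from ovrHmd_CreateDistortionMesh, not from a simple linear formula:

```cpp
#include <vector>
#include <cstddef>

// Hypothetical vertex layout for this sketch only.
struct GridVert { float x, y; float u, v; };

// Build a (vertsAcross x vertsDown) grid covering clip space [-1, 1]^2,
// with UVs in [0, 1].  The SDK's real distortion mesh replaces the single
// (u, v) with three distorted UV pairs, one per colour channel.
std::vector<GridVert> makeGrid(int vertsAcross, int vertsDown)
{
    std::vector<GridVert> verts;
    verts.reserve(static_cast<std::size_t>(vertsAcross) * vertsDown);
    for (int j = 0; j < vertsDown; ++j) {
        for (int i = 0; i < vertsAcross; ++i) {
            GridVert vert;
            vert.u = float(i) / float(vertsAcross - 1);
            vert.v = float(j) / float(vertsDown - 1);
            vert.x = vert.u * 2.0f - 1.0f;   // map [0, 1] -> [-1, 1]
            vert.y = vert.v * 2.0f - 1.0f;
            verts.push_back(vert);
        }
    }
    return verts;
}
```

Building this once and uploading it to a vertex buffer avoids regenerating anything per frame, which is what the original poster wanted.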
  • 1) The SDK should give you a distortion mesh at the beginning. You would just need to create a vertex buffer from the mesh and then map your render target onto it. Or you can go with the SDK-rendered option, which means you just pass in an undistorted render target and the SDK handles the rest.

    2) The SDK comes with functions to get Euler angles from a quaternion. Like this:

    // ts is the ovrTrackingState returned by ovrHmd_GetTrackingState
    Posef pose = ts.HeadPose.ThePose;
    float yaw, pitch, roll;
    pose.Rotation.GetEulerAngles<Axis_Y, Axis_X, Axis_Z>(&yaw, &pitch, &roll);

    cout << "yaw: " << RadToDegree(yaw) << endl;
    cout << "pitch: " << RadToDegree(pitch) << endl;
    cout << "roll: " << RadToDegree(roll) << endl;


    3) I believe you do the rotation first (at least that's usually the order with DirectX).

    4) You can get the fov (per eye) like this:

    hmd->DefaultEyeFov[0]


    See the included PDFs with the SDK for more details.
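The order question in (3) can be sanity-checked numerically without any graphics API. A minimal 2D sketch (purely illustrative, not SDK code) showing that translate-then-rotate and rotate-then-translate give different results:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Rotate a point about the origin by `radians`, counter-clockwise.
Vec2 rotate(Vec2 p, float radians)
{
    float c = std::cos(radians), s = std::sin(radians);
    return { c * p.x - s * p.y, s * p.x + c * p.y };
}

Vec2 translate(Vec2 p, Vec2 t) { return { p.x + t.x, p.y + t.y }; }

// Applying translate-then-rotate versus rotate-then-translate to the same
// point gives different results for any non-trivial rotation, which is why
// the composition order has to match how the values are meant to be read.
```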
    Jossos2
    Honored Guest
    Thanks.

    A few problems:

    The program doesn't recognize Pose<float> or Posef as a type, despite having '#include <Kernel\OVR_Math.h>' a few lines above it.

    And for getting the perspective I get:
    error C2440: 'type cast' : cannot convert from 'const ovrFovPort' to 'float'

    EDIT: I see there are 4 different values in this variable. I have a function similar to the deprecated gluPerspective function that takes a single perspective value in degrees. What kind of values are these? They're all 1.something.
  • "Jossos2" wrote:
    The program doesn't recognize Pose<float> or Posef as a type, despite having '#include <Kernel\OVR_Math.h>' a few lines above it.


    All of the types in the SDK are in the namespace OVR. So try OVR::Posef.


    "Jossos2" wrote:

    And for getting the perspective I get:
    error C2440: 'type cast' : cannot convert from 'const ovrFovPort' to 'float'

    EDIT: I see there are 4 different values in this variable. I have a function similar to the deprecated gluPerspective function that takes a single perspective value in degrees. What kind of values are these? They're all 1.something.


    You should be getting the full perspective matrix using the OVR C API ovrMatrix4f_Projection, which takes as input an ovrFovPort, along with a near and far plane. It will return the exact projection matrix you should be using.
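If a single vertical FOV angle in degrees is still wanted for a gluPerspective-style helper, the FovPort values (tangents of half-angles, hence the "1.something" readings) can be converted back with atan. A sketch, using a local struct shaped like ovrFovPort rather than the SDK type:

```cpp
#include <cmath>

// Mirrors the layout of ovrFovPort: each field is tan(half-angle),
// not an angle.  (This struct is local to the sketch.)
struct FovPort { float UpTan, DownTan, LeftTan, RightTan; };

// Total vertical FOV in degrees: recover the two half-angles with atan
// and add them.  Do NOT add the tangents themselves.
float verticalFovDegrees(const FovPort& fov)
{
    const float radToDeg = 57.29577951f;
    return (std::atan(fov.UpTan) + std::atan(fov.DownTan)) * radToDeg;
}
```

Note this collapses an asymmetric frustum into one number, so it only approximates what the Rift actually needs; the full ovrMatrix4f_Projection matrix is still the correct thing to render with.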
    Jossos2
    Honored Guest
    EDIT: Figured it out lol

    "nuclear" wrote:
    "Jossos2" wrote:
    Just to clarify - UpTan and DownTan are radians


    No, they are the tangents of the half-angles, as their names imply.

    "Jossos2" wrote:
    and I add them together to get the yFov? for PerspectiveLH(T yfov, T aspect, T znear, T zfar)


    It's not that simple. Your typical gluPerspective-like functions construct symmetric frustum projection matrices. In the Rift, the projection is asymmetric, because the lens centers are not centered over their half of the screen. You can of course create a symmetric projection matrix and then use a translation to shift it into the right shape.

    I don't really know why you insist on creating the projection matrix yourself. As Jherico said, the SDK just gives you the correct matrix when you call ovrMatrix4f_Projection. However, if you insist, you should look in the SDK code to see how it generates the projection matrix. Specifically, ovrMatrix4f_Projection calls CreateProjection in OVR_Stereo.cpp, which in turn relies on CreateNDCScaleAndOffsetFromFov in the same file.


    Yeah sorry, I figured it out as you were writing your response. I'm not super great at this
  • "Jossos2" wrote:
    Just to clarify - UpTan and DownTan are radians


    No, they are the tangents of the half-angles, as their names imply.

    "Jossos2" wrote:
    and I add them together to get the yFov? for PerspectiveLH(T yfov, T aspect, T znear, T zfar)


    It's not that simple. Your typical gluPerspective-like functions construct symmetric frustum projection matrices. In the Rift, the projection is asymmetric, because the lens centers are not centered over their half of the screen. You can of course create a symmetric projection matrix and then use a translation to shift it into the right shape.

    I don't really know why you insist on creating the projection matrix yourself. As Jherico said, the SDK just gives you the correct matrix when you call ovrMatrix4f_Projection. However, if you insist, you should look in the SDK code to see how it generates the projection matrix. Specifically, ovrMatrix4f_Projection calls CreateProjection in OVR_Stereo.cpp, which in turn relies on CreateNDCScaleAndOffsetFromFov in the same file.
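For reference, an asymmetric projection of the kind described above can be built directly from the four tangents. This is a from-scratch sketch of the same idea, not a copy of the SDK's CreateProjection; the sign and depth-range conventions here are one possible choice:

```cpp
#include <cmath>

// Local stand-in for ovrFovPort: each field is tan(half-angle).
struct FovPort { float UpTan, DownTan, LeftTan, RightTan; };

// Fill a row-major 4x4 off-axis perspective matrix.  When the four tangents
// are equal this reduces to the usual symmetric gluPerspective-style frustum;
// unequal tangents shift the projection centre, which is what the Rift's
// off-centre lenses need.
void projectionFromFov(const FovPort& fov, float zNear, float zFar, float m[16])
{
    float xScale  = 2.0f / (fov.LeftTan + fov.RightTan);
    float xOffset = (fov.LeftTan - fov.RightTan) * xScale * 0.5f;
    float yScale  = 2.0f / (fov.UpTan + fov.DownTan);
    float yOffset = (fov.UpTan - fov.DownTan) * yScale * 0.5f;

    for (int i = 0; i < 16; ++i) m[i] = 0.0f;
    m[0]  = xScale;
    m[2]  = -xOffset;                  // horizontal off-axis shift
    m[5]  = yScale;
    m[6]  = yOffset;                   // vertical off-axis shift
    m[10] = zFar / (zNear - zFar);     // depth mapped to [0, 1]
    m[11] = zFar * zNear / (zNear - zFar);
    m[14] = -1.0f;                     // right-handed: camera looks down -Z
}
```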