Forum Discussion
Seyeco
12 years ago · Honored Guest
Help with rotation and gluLookAt()
Hello. I'm messing around with the Oculus in C++ with OpenGL 1.x or 2.x (whatever Win7 ships with). I've been able to muscle through the rendering setup, and I can view my test scene just fine. However, I'm pretty lost with the rotation stuff. Euler angles and quaternions are a new concept for me.
So the surrounding code for my camera is roughly this:
void Camera( int i ) // i is the eye index :)
{
    ovrTrackingState TrackingState = ovrHmd_GetTrackingState( Hmd, ovr_GetTimeInSeconds() );
    if( TrackingState.StatusFlags & ovrStatus_OrientationTracked )
    {
        ovrPosef pos = ovrHmd_GetEyePose( Hmd, ovrEyeType( i ) );
        float headPitch = 0.0f, headYaw = 0.0f, headRoll = 0.0f, pitch = 0.0f, yaw = 0.0f, roll = 0.0f;
        OVR::Quatf headOrientation = TrackingState.HeadPose.ThePose.Orientation;
        headOrientation.GetEulerAngles< OVR::Axis_X, OVR::Axis_Y, OVR::Axis_Z >( &headPitch, &headYaw, &headRoll );
        OVR::Quatf orientation = pos.Orientation;
        orientation.GetEulerAngles< OVR::Axis_X, OVR::Axis_Y, OVR::Axis_Z >( &pitch, &yaw, &roll );
        pitch += headPitch;
        yaw   += headYaw;
        roll  += headRoll;
        GLfloat lx = sin( -yaw ) * cos( pitch ),
                ly = cos( yaw ) * sin( pitch ),
                lz = -1 * cos( pitch ) * cos( yaw ),
                ux = cos( pitch ) * sin( -roll ),
                uy = cos( pitch ) * cos( roll ),
                uz = sin( pitch ) * cos( roll );
        static GLfloat x = 0, y = 0, z = 0;
        if( TrackingState.StatusFlags & ovrStatus_PositionTracked )
        {
            x = pos.Position.x;
            y = pos.Position.y;
            z = pos.Position.z;
        }
        gluLookAt( x, y, z,
                   x + lx, y + ly, z + lz,
                   ux, uy, uz );
    }
}
My first assumption was that the Pitch, Yaw, and Roll independently range from -Pi (pitch down, yaw left, roll left) to Pi (pitch up, yaw right, roll right). It works well until yaw reaches Pi/2 or -Pi/2, when the pitch begins to go crazy. It seems like at that point, the Oculus treats the orientation as if it's upside down or something.
My second approach was to try using the Orientation.x/y/z values. They seemed to range from roughly -1 to 1, and I saw somewhere that this could be the angle calculation sin( theta / 2 ). So I tried the following:
void Camera( int i )
{
    ovrTrackingState TrackingState = ovrHmd_GetTrackingState( Hmd, ovr_GetTimeInSeconds() );
    if( TrackingState.StatusFlags & ovrStatus_OrientationTracked )
    {
        ovrPosef pos = ovrHmd_GetEyePose( Hmd, ovrEyeType( i ) );
        float headPitch = TrackingState.HeadPose.ThePose.Orientation.x,
              headYaw   = TrackingState.HeadPose.ThePose.Orientation.y,
              headRoll  = TrackingState.HeadPose.ThePose.Orientation.z;
        float pitch = pos.Orientation.x,
              yaw   = pos.Orientation.y,
              roll  = pos.Orientation.z;
        pitch += headPitch;
        yaw   += headYaw;
        roll  += headRoll;
        GLfloat lx = -yaw * cos( asin( pitch ) ),
                ly = cos( asin( yaw ) ) * pitch,
                lz = -1 * cos( asin( pitch ) ) * cos( asin( yaw ) ),
                ux = cos( asin( pitch ) ) * -roll,
                uy = cos( asin( pitch ) ) * cos( asin( roll ) ),
                uz = pitch * cos( asin( roll ) );
        static GLfloat x = 0, y = 0, z = 0;
        if( TrackingState.StatusFlags & ovrStatus_PositionTracked )
        {
            x = pos.Position.x;
            y = pos.Position.y;
            z = pos.Position.z;
        }
        gluLookAt( x, y, z,
                   x + lx, y + ly, z + lz,
                   ux, uy, uz );
    }
}
I've tried both approaches above using eye angles only, head angles only, and eye angles + head angles. There always seems to be a weird issue at yaw == ±90 degrees.
Can someone help me to understand where I'm derailing?
Thanks very much!!
2 Replies
- jherico · Adventurer
"Seyeco" wrote:
My first assumption was that the Pitch, Yaw, and Roll independently range from -Pi (pitch down, yaw left, roll left) to Pi (pitch up, yaw right, roll right). It works well until Yaw reaches Pi/2 or -Pi/2...when the pitch begins to go crazy... It seems like at that point, the oculus treats the orientation as if it's upside down or something.
Manipulating Euler angles is tricky, because for any given orientation there are really an infinite number of pitch, roll, and yaw combinations that result in that orientation. Basically, Euler angles are (sometimes) useful if you want to print out pitch, roll, and yaw for debugging, or if you're working with a system that's constrained on multiple axes, so you really only have to manipulate one of them. Otherwise, Euler angles should be avoided unless you already have a heavy investment in them.
"Seyeco" wrote:
My second approach was to try using the Orientation.x/y/z values. They seemed to range from roughly -1 to 1, and I saw somewhere that this could be the angle calculation sin( theta / 2 ).
Quaternions are much better for representing orientations and combining them together, but unless you're really into transformation math, they're something of a black box. The X Y and Z values don't have any clear intuitive relationship with roll, pitch and yaw.
I would suggest that you avoid attempting to use gluLookAt to set the modelview matrix. Converting from a head pose to a particular set of gluLookAt values is really the hard way of doing things.
As an alternative, you can represent your modelview matrix directly and apply the head pose to it using matrix multiplication. I have code here that shows how to convert from the Oculus math types to GLM (a popular C++ math library) types. Instead of using gluLookAt, you can take a composed 4x4 matrix and apply it to OpenGL using glLoadMatrix.
My book goes into this in somewhat more detail and has an appendix devoted to getting familiar with matrix transformations (not sure if the appendix is in the released early-access version yet, though).
- Seyeco · Honored Guest
Thanks for the reply! I appreciate the information, suggestions, and code you've provided. And I look forward to throwing my money at your publisher when the hard copy of your book is available :D
For now, I'm still looking through your code and reading up on Quaternion and Euler angles (and Gimbal Lock). Once I understand these concepts a little better, I'll migrate my code more towards matrix manipulation.
But, for now... I thought I would share my recent success with gluLookAt() in my camera function (for any curious visitor to this post):
void Camera( int i )
{
    ovrTrackingState TrackingState = ovrHmd_GetTrackingState( Hmd, ovr_GetTimeInSeconds() );
    if( TrackingState.StatusFlags & ovrStatus_OrientationTracked )
    {
        ovrPosef pos = ovrHmd_GetEyePose( Hmd, ovrEyeType( i ) );
        OVR::Quatf Orientation = pos.Orientation;
        Orientation.Normalize();
        ovrVector3f up   = { 0, 1,  0 };
        ovrVector3f look = { 0, 0, -1 };
        up   = Orientation.Rotate( up );
        look = Orientation.Rotate( look );
        static GLfloat x = 0, y = 0, z = 0;
        if( TrackingState.StatusFlags & ovrStatus_PositionTracked )
        {
            x = pos.Position.x;
            y = pos.Position.y;
            z = pos.Position.z;
        }
        gluLookAt( x, y, z,
                   x + look.x, y + look.y, z + look.z,
                   up.x, up.y, up.z );
    }
}