
Using Oculus on motion platform impair head tracking

Vrally
Level 4
I am trying to use the Rift on a motion platform (a 6-DoF Stewart platform). The trouble I am having is that the accelerations from the motion platform get picked up by the accelerometers in the Rift. This results in the sensor fusion thinking there is movement even though the user is keeping his head completely still inside the cockpit. The Oculus camera detects this aggressive drift and corrects for it, but at a much slower rate than the drift occurs, which results in extreme "judder". The rotational drift correction in particular becomes very apparent. One user described it as if the frame rate were reduced to ~2 Hz.

So how can this be solved? Since the motion platform is already fitted with its own accelerometers, we know the acceleration (both linear and rotational) of the platform. If this information could somehow be fed into the Oculus sensor fusion, the platform's accelerations could be subtracted from the accelerations measured by the HMD, eliminating the drift.
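To make the idea concrete, here is a minimal sketch in Python of the subtraction step. This is not Oculus SDK code; the function name and the assumption that both IMUs report in a common reference frame are illustrative (in practice the platform readings would first have to be rotated into the HMD sensor frame and offset by the lever arm to the head position).

```python
import numpy as np

def compensate_imu(hmd_accel, hmd_gyro, platform_accel, platform_gyro):
    """Subtract the platform's inertial readings from the HMD's, so that
    only the head's motion relative to the cockpit remains.

    All inputs are 3-vectors. Assumption (for illustration only): both
    sensors report in the same reference frame and at the same location;
    a real implementation would transform frames and account for the
    lever arm between the platform IMU and the head.
    """
    corrected_accel = np.asarray(hmd_accel, dtype=float) - np.asarray(platform_accel, dtype=float)
    corrected_gyro = np.asarray(hmd_gyro, dtype=float) - np.asarray(platform_gyro, dtype=float)
    return corrected_accel, corrected_gyro
```

If the head is still relative to the cockpit while the platform heaves, the corrected acceleration should come out near zero, which is exactly what the sensor fusion would need to see.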

Is there a way to supply the sensor fusion with this type of corrections?
13 REPLIES

RiftFlyer
Level 4
I gave up on my plans for a 6-DoF flight sim platform for this very reason. I'm now working on a non-moving G-Seat instead.

Work has previously been done on your issue, although I think it was with the DK1. Have a look here: viewtopic.php?f=29&t=7933

Vrally
Level 4
"Frusheen" wrote:
I gave up on my plans for a 6dof flight sim platform for this very reason. I'm now working on a non moving G-Seat instead.

Work has previously been done on your issue. Although I think it was with DK1. Have a look here viewtopic.php?f=29&t=7933


Well, the solution in that thread uses one additional external sensor and a patched version of the Oculus SDK. I would prefer not to hack the SDK, but if that is what's needed I am open to it as well. I am currently looking at the few exposed parts of the SDK concerning tracking, but I can't really tell whether enough of the tracking internals are exposed to correct for this type of drift.

I would love it if the final version of the Rift included an accelerometer inside the camera to correct for these kinds of external motions.

Vrally
Level 4
I have been pondering whether it would be feasible to hack the calcPredictedPose() function inside LibOVR/Src/Tracking/Tracking_SensorStateReader.cpp.

But as I see it, any correction made here will be overwritten by data from the closed-source part of the SDK when the camera intermittently corrects its orientation and position.

Any ideas are appreciated...

MrZofka
Level 2
Hi,

how did you get it working while driving the vehicle, as posted in your blog?

Best regards,

Marc

Vrally
Level 4
"MrZofka" wrote:
how did you get it working while driving the vehicle, as posted in your blog?


Well, the short answer is that I didn't.

A longer answer is that the DK1 lacked the camera, so yaw corrections depended on the magnetometer. This in turn made it possible to perform maneuvers with quite high rotational accelerations.

So the maneuvers we performed were limited by the magnetometer, i.e. no big changes in compass direction (roughly ±45 degrees was OK).

The camera in the DK2 corrects much faster, which the user experiences as quite extreme stuttering in orientation.

RiftFlyer
Level 4
You could try contacting vectionvr. I read they have had some success with LFS and Prepar3d.

http://www.vectionvr.com

Imeleon
Level 5
I use a 2-DoF motion sim with the DK2 and I don't have any issues with that at all. I can crank the motion right up and it all stays smooth. I don't, though, as you only need smaller movements in VR. I did turn the bump simulation down a bit as it shook the screen too much.
I've been DK2 and motion driving in iRacing for months now.

Vrally
Level 4
"Imeleon" wrote:
I use a 2dof motion sim with the dk2 and I don't have any issues with that at all. I can crank the motion right up and it all stays smooth. I don't though as you only need smaller movements in vr. I did turn the bump simulation down a bit as it shook the screen too much.
I've been dk2 and motion driving in iracing for months now.


Well, I can easily provoke a really clear stutter by simply moving the camera and Rift together in one DoF. I made the following simple setup, where I fixed both the Rift and the camera to a wooden board (the blue thing to the right is an inertial sensor):


I exerted some simple yaw motions (rotating the board back and forth) and logged the orientation from the Oculus SDK. Plotting the yaw signal gives the following:


Note that since the camera and Rift are fixed to the board, the yaw signal should be zero. And we can clearly see when the camera decides to correct the error in the estimated yaw angle.

But I have found a solution that works decently. Since I know the external inertial forces, I can compensate for them in my draw loop. I integrate the external rotational rates and subtract the resulting external rotation from the orientation signal reported by the Rift. This integrated signal grows at the same rate as the error in the Rift SDK's estimate, so it works well until the camera makes its correction. That correction can be detected by looking at the derivative of the angle from the Rift SDK: when I detect a camera correction I reset the integrator, since we can assume the Rift camera has eliminated the accumulated error.
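The integrate-and-reset scheme described above can be sketched in a few lines of Python. This is an illustration, not the author's actual code: the class name, the yaw-only treatment, and the jump threshold used to detect a camera correction are all assumptions.

```python
class DriftCompensator:
    """Integrates the platform's externally measured yaw rate and subtracts
    the accumulated angle from the yaw reported by the HMD SDK. A large
    jump in the SDK yaw between frames is taken as a camera correction,
    at which point the integrator is reset.

    The 1-DoF (yaw-only) treatment and the threshold value are
    illustrative assumptions, not part of any real SDK.
    """

    def __init__(self, correction_threshold_deg=2.0):
        self.threshold = correction_threshold_deg
        self.accumulated_yaw = 0.0  # integrated platform yaw, degrees
        self.prev_sdk_yaw = None

    def update(self, sdk_yaw_deg, platform_yaw_rate_deg_s, dt):
        # Detect a camera correction via the discrete derivative of SDK yaw.
        if self.prev_sdk_yaw is not None:
            if abs(sdk_yaw_deg - self.prev_sdk_yaw) > self.threshold:
                # The camera snapped the orientation back; assume the
                # accumulated platform-induced error has been eliminated.
                self.accumulated_yaw = 0.0
        self.prev_sdk_yaw = sdk_yaw_deg

        # Integrate the externally measured platform rotation.
        self.accumulated_yaw += platform_yaw_rate_deg_s * dt

        # Subtract the platform's contribution from the SDK orientation.
        return sdk_yaw_deg - self.accumulated_yaw
```

Between camera corrections the subtraction cancels the platform-induced drift; at each detected correction the integrator starts over from zero, matching the reset described above.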

This strategy works surprisingly well, although I have to disable "Timewarp", since the timewarp algorithm does its calculations inside the SDK and is unaware of the data from my correction algorithm.

eichhorn
Level 2
Hi! Interesting to see a solution to the problem. So how well does it work? Do you still see small jumps whenever the correction happens, especially if you move (translate) your head around?