Forum Discussion
Sebbi
13 years ago · Honored Guest
Magnetic yaw correction
So I downloaded the SDK and looked over what you guys are doing regarding sensor fusion. It looks really basic (having seen what several multicopter projects do to calculate orientation), and the magnetometer doesn't get used at all.
Now I don't have a Rift yet, but I imagine it drifts in the yaw axis and that this should become visible after some time. Any plans to use the magnetometer, or are you open to suggestions? ;-)
P.S.: If EnableGravity is set to false, there doesn't seem to be any normalisation of Q.
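For reference, the fix is cheap: renormalise Q periodically so rounding error from the repeated incremental products can't skew the rotation. A minimal standalone sketch of what that amounts to, using a stand-in Quatf rather than the SDK's type:

#include <cmath>

struct Quatf { float x, y, z, w; };

// Repeated Q = Q * dQ products accumulate floating-point error, so |Q|
// slowly drifts away from 1 and the rotation derived from it gets skewed.
Quatf Normalized(const Quatf& q) {
    const float n = std::sqrt(q.x*q.x + q.y*q.y + q.z*q.z + q.w*q.w);
    return { q.x / n, q.y / n, q.z / n, q.w / n };
}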
35 Replies
- edzieba (Honored Guest)
I believe the magnetometer is already in use as part of the sensor fusion and motion prediction that goes on in the SDK.
If the local magnetic field is not stable enough (e.g. you have a lot of electric motors running nearby), you might be able to put a big Helmholtz coil pair on opposite walls (or go full Maxwell coil if you want a really uniform field) and sit within the relatively uniform and stable field set up between them. You don't need a strong field, so you could run them off a USB PSU or a battery.
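For a rough sense of scale, the field at the midpoint of a Helmholtz pair is B = (4/5)^(3/2) · µ0 · n · I / R. A minimal sketch; the turn count, current, and radius below are illustrative assumptions, not a recommendation:

#include <cmath>
#include <cstdio>

int main() {
    const double mu0 = 1.25663706e-6; // vacuum permeability, T*m/A
    const double n   = 100.0;         // turns per coil (assumed)
    const double I   = 0.5;           // coil current in amperes (assumed)
    const double R   = 1.5;           // coil radius in metres (assumed, roughly wall-sized)
    const double B   = std::pow(4.0 / 5.0, 1.5) * mu0 * n * I / R;
    std::printf("field at centre: %.0f microtesla\n", B * 1e6); // ~30 microtesla here
    // Earth's field is roughly 25-65 microtesla, so modest turns and
    // current are enough to dominate it -- a USB PSU is plausible.
    return 0;
}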
::EDIT::
Whoops, fusion of the magnetometer data is not happening quite yet:
"Magnetometer fused drift correction did not quite make it into this release, so no drift correction was being demonstrated at GDC, but I can tell you that generally magnetometer fusion does a very good job of stabilizing yaw. It's currently listed as an Upcoming Feature of the SDK and is a high priority."
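For the curious, a common way to fuse a magnetometer for yaw only looks roughly like this. A minimal sketch, not the SDK's actual implementation; axis conventions, names, and the gain value are all assumptions:

#include <cmath>

// Tilt-compensate the magnetometer reading using pitch/roll (available from
// the gravity-based correction), then extract a heading from the horizontal
// components of the field.
float MagHeading(float mx, float my, float mz, float pitch, float roll) {
    float xh = mx * std::cos(pitch) + mz * std::sin(pitch);
    float yh = mx * std::sin(roll) * std::sin(pitch) + my * std::cos(roll)
             - mz * std::sin(roll) * std::cos(pitch);
    return std::atan2(yh, xh); // heading relative to magnetic north
}

// Complementary correction: pull the gyro-integrated yaw slowly toward the
// magnetometer heading. The gain trades correction speed against sensitivity
// to magnetic disturbances (the value here is an assumption).
float CorrectYaw(float gyroYaw, float magYaw, float dt) {
    const float gain = 0.02f; // fraction of the error removed per second
    float err = std::atan2(std::sin(magYaw - gyroYaw), std::cos(magYaw - gyroYaw));
    return gyroYaw + gain * err * dt;
}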
- Robert (Honored Guest)
I would like to know what algorithm is used to combine the raw sensor data from the gyroscope, the accelerometer and the magnetometer to measure orientation?
- MrGeddings (Explorer)
E=MC2?
Kidding :-)
- bteitler (Honored Guest)
I don't have my developer kit yet, but it would be nice if someone could precisely measure the yaw drift encountered over time for various types of motion patterns (no movement, slow movement, crazy random movements, etc.). I will do it as soon as I get mine if no one else does. This is a very important topic to me, as I will be trying to build a virtual reality "room" where your position is tracked, which means drift must be kept within 1 or 2 degrees for virtual movement to coincide with reality. I'm guessing that right now the solution is to hit a button to reset the angle (i.e. look forward and hit the button).
I'll be interested to see how their yaw correction performs in my apartment when it is implemented. However, I have very realistic expectations as all magnetometer correction implementations I have tried have performed unacceptably. I have had similar problems to those described here (viewtopic.php?f=20&t=77).
For instance, SpacePoint sent me one of their 9-axis sensor models a while back (I believe it is an early prototype of this: http://pnicorp.com/gaming/SpacePoint_Scout). They make pretty bold claims about tracking accuracy, but it simply did not work. I walked around in a 12x12 foot space and wiggled it around for a few seconds and it was already off by multiple degrees.
I have also done thorough testing of the Core Motion APIs on iOS, specifically with the iPad 2. Sampling rates are somewhat low (up to 120Hz), and the magnetometer correction again does not perform well. With or without it, the drift is around 0.2 degrees per second in the average case in my apartment with random motion patterns and locations. I'm sure the Rift will perform much better than iOS, since it supports 1000Hz sampling according to the docs.
Regardless of how the Oculus performs in this area, people wishing for the least drift over long periods will need an external tracking solution to "pull" the yaw back to its true starting reference. A few people have mentioned camera-based tracking such as "FreeTrack", which should correct the issue for most use cases: yaw is simply pulled towards the external angle measurement at a very slow rate so as not to interfere with actual movements. You don't even need to measure the angle consistently, since drift is slow, so the solution does not need to be robust (one frame every few seconds is sufficient).
The simplest setup is to stick an AR (augmented reality) type marker on the outside of the Rift and compute the orientation in real time using any camera and OpenCV (or whatever image processing library you prefer). This is the solution I use in my own projects. The Oculus team is full of pretty smart people, so I'd be really surprised if they didn't ship the consumer version with some sort of external tracking solution.
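The "pull" itself is only a few lines. A sketch under the assumptions above (names and the rate constant are illustrative):

#include <cmath>

struct YawEstimator {
    float yaw = 0.0f; // fused yaw in radians

    // High-rate gyro integration; drifts slowly over time.
    void IntegrateGyro(float yawRate, float dt) { yaw += yawRate * dt; }

    // Called whenever the camera delivers a fix -- even one frame every few
    // seconds is enough, because gyro drift is slow.
    void ApplyExternalFix(float cameraYaw) {
        const float pull = 0.05f; // fraction of the error removed per fix (assumed)
        float err = std::atan2(std::sin(cameraYaw - yaw), std::cos(cameraYaw - yaw));
        yaw += pull * err; // small enough to be imperceptible during normal movement
    }
};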
"Robert" wrote:
I would like to know what algorithm is used to combine the raw sensor data from the gyroscope, the accelerometer and the magnetometer to measure orientation?
The comments in the SDK mention that they derive a gain for the gyro correction from the accelerometer, so it's certainly not a naive implementation. This 'gain' is most likely the kind one talks about in conjunction with Kalman filters, the "Kalman gain".
I'm very far from an expert in the field, but to my understanding Kalman filters are quite old hat: matrix math for an abstract case where the 'physical model' doesn't have to resemble the Newtonian mechanics we learned in school. In the code there is a gravity constant defined, a measurement, which implies a great leap from the realm of pure mathematics to the realm of physics.
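To make "Kalman gain" concrete, here is a minimal one-dimensional example, purely illustrative and unrelated to the SDK's code; the noise values are assumptions. The gain k weighs the measurement against the prediction in proportion to their uncertainties:

// Scalar Kalman filter for a single angle.
struct Kalman1D {
    float x = 0.0f;   // state estimate (the angle)
    float p = 1.0f;   // variance of the estimate
    float q = 0.001f; // process noise per step (assumed)
    float r = 0.1f;   // measurement noise (assumed)

    float Update(float rate, float dt, float measurement) {
        x += rate * dt;              // predict: integrate the gyro rate
        p += q;                      // prediction grows the uncertainty
        float k = p / (p + r);       // Kalman gain: how much to trust the measurement
        x += k * (measurement - x);  // correct toward the measurement
        p *= (1.0f - k);             // correction shrinks the uncertainty
        return x;
    }
};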
I look forward to studying the code in detail. Good times!
- Sebbi (Honored Guest)
From my observations ...
This is their code to integrate the gyro rates:
Vector3f dV = AngV * msg.TimeDelta;  // incremental rotation vector for this timestep
const float angle = dV.Length();     // rotation angle (radians) this step
float halfa = angle * 0.5f;
float sina = sin(halfa) / angle;     // note: no guard against angle == 0 here
Quatf dQ(dV.x*sina, dV.y*sina, dV.z*sina, cos(halfa)); // axis-angle -> quaternion
Q = Q * dQ;                          // accumulate into the orientation
Straightforward and pretty basic. There is no Kalman filtering going on, but they use the accelerometer data (if within a certain range of 1G) to correct the current orientation quaternion. I have never seen this method used before, and it obviously only works for pitch and roll drift correction, since it only uses the accelerometer.
A = msg.Acceleration * msg.TimeDelta;   // accelerometer sample, scaled by the timestep
Vector3f yUp(0,1,0);                    // world up axis
Vector3f aw = Q.Rotate(A);              // measured acceleration in world coordinates
Quatf qfeedback(-aw.z * Gain, 0, aw.x * Gain, 1); // small tilt nudge about the horizontal axes
Quatf q1 = (qfeedback * Q).Normalized();
float angle0 = yUp.Angle(aw);           // tilt error before the nudge
float angle1 = yUp.Angle(q1.Rotate(A)); // tilt error after the nudge
if (angle1 < angle0)
{
    Q = q1; // the nudge reduced the error, keep it
}
else
{
    // Otherwise try nudging in the opposite direction.
    Quatf qfeedback2(aw.z * Gain, 0, -aw.x * Gain, 1);
    Quatf q2 = (qfeedback2 * Q).Normalized();
    float angle2 = yUp.Angle(q2.Rotate(A));
    if (angle2 < angle0)
    {
        Q = q2;
    }
}
External yaw drift correction would be nice, especially with an AR marker or a Nintendo Wii controller-type camera, since it would also allow limited positional tracking in front of your PC. The accelerometers aren't as good as they would need to be (no MEMS accelerometer is) to compute position by themselves, but in combination with a camera ... why not?
However, I come from the multicopter world, and the algorithms used in those copters are pretty good, stable, and open source. They are optimised for speed, because they have to run at 200+ Hz on an 8-bit µC, and the Rift should probably aim for accuracy instead, but inspiration can come from this space (some use Kalman filters, some use an AHRS solution, others use the gyro rates to rotate the gravity vector, etc.). And not only with regard to algorithms: there are copters that use modified mouse cameras for optical flow measurements to hold their position in space (the AR.Drone is a commercial example) ;-)
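To give a flavour of the gravity-vector approach just mentioned, here is a minimal complementary-filter sketch (not any particular firmware's code; the blending weight is an assumption):

#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate the estimated gravity vector by the gyro rates (first-order
// approximation g' = g - (omega*dt) x g), then blend it gently toward the
// accelerometer reading. Pitch and roll follow from g afterwards,
// e.g. pitch = atan2(g.x, g.z).
void UpdateGravity(Vec3& g, const Vec3& gyro, const Vec3& accel, float dt) {
    Vec3 w = { gyro.x * dt, gyro.y * dt, gyro.z * dt };
    Vec3 r = { g.x - (w.y * g.z - w.z * g.y),
               g.y - (w.z * g.x - w.x * g.z),
               g.z - (w.x * g.y - w.y * g.x) };
    const float k = 1.0f / 250.0f; // accelerometer weight (assumed)
    g.x = r.x * (1.0f - k) + accel.x * k;
    g.y = r.y * (1.0f - k) + accel.y * k;
    g.z = r.z * (1.0f - k) + accel.z * k;
}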
By "stable" I mean that the compass drift compensation in those copters works if properly calibrated. Even in front of my desk and with running motors around it. That shouldn't be a problem. - bteitlerHonored Guest
"Sebbi" wrote:
By "stable" I mean that the compass drift compensation in those copters works if properly calibrated. Even in front of my desk and with running motors around it. That shouldn't be a problem.
You may simply be lucky enough to have a consistent magnetic field in your workspace. I think most people who experiment at home (myself included) find this is not the case. I would also challenge you to quantify exactly what you mean by "works". To me, that means you can fly the device around in a variety of positions and orientations with less than 2 degrees of yaw error at any time over periods of 5 minutes or longer (and no external tracking allowed, such as optical flow). If you could demonstrate this on a copter with cheap hardware, I'd be really interested to see what your room looks like, as well as the exact algorithm.
- Sebbi (Honored Guest)
Maybe I am lucky, I don't know. I don't have any changing magnetic fields in front of my PC, compasses show true north quite well, and it's definitely usable to correct for the yaw drift of the gyros. I mainly use MultiWii, which suffers from gimbal lock but has zero drift when properly calibrated. The heading toggles, but that's because they round it to full integers. So judging from that, it's accurate to less than 2 degrees.
Here is a Github mirror of their repository:
https://github.com/multiwii/multiwii-firmware
(Attitude estimation is going on in IMU.ino)
P.S.: Does the rift have a calibration function? There has to be something for the gyros at least, since they will have varying bias with temperature. - bteitlerHonored GuestIf you get up and walk 10 feet somewhere else, does the compass still read within 2 degrees of where it was before? Did you test the drift over long periods of time (5, 10 minutes)? I looked at the code, and they aren't doing anything I haven't seen before. They trust gyro 250 times more than compass (GYR_CMPFM_FACTOR) in the complementary filter, but the true convergence rate will likely depend on how often that attitude function is called.
My guess is that either the gyros just have low drift during whatever period you measured (you should test with the mag both off and on to see if it makes any difference), or you truly do have a magic room, which I'd like to rent. The top-rated Android compass app reports differences of 120 degrees when I move my phone one foot over on my desk.
- Sebbi (Honored Guest)
Walking around and measuring the accuracy would be a problem for me, since I find it very hard to tell whether the copter still faces the same direction as before. If you mean I should walk around and place the copter on the same spot as before (again, not easy to do within 1 degree), then yes, the heading is the same.
On an Arduino the function is called roughly 300 times per second (500+ times on 32-bit hardware), so it converges pretty fast.
I let it run for the past 15 minutes and nope, the heading isn't moving (the raw magnetometer readings are pretty stable, too). I am using a board called FreeIMU, which has the MPU6050 gyro/accelerometer and an HMC5883 magnetometer onboard. Unfortunately I can't test with the mag off, since MultiWii doesn't calculate a heading without it (that wouldn't make sense for a copter).
Maybe I have a magic room. Does your Android compass work outside? Mine (Nexus 4) jumps ±3 degrees on my desk ... I guess Android's sensor fusion isn't the best (the magnetometer practically overrides whatever the gyro says is happening), but 120-degree jumps? Are you working under a high-voltage power line? ;-)
Anyway, for most people magnetic yaw drift correction should work OK. But I am looking forward to the external solutions developers will come up with (attach a Razer Hydra to the Rift?), because they can also be used for positional tracking.
Update:
20 minutes later, still no perceivable drift.