Forum Discussion

Crabat
Honored Guest
11 years ago

Oculus DK2: Augmented Reality project

Hello Oculus dev community!

I've been lurking these forums for some time now and finally got to the point where I'm able to "contribute" some questions of my own.

I'm currently working on a project which is all about using the DK2 for an augmented reality application. For this, I am using a DK2 with an OVRVision mounted on the front, and a third-party electromagnetic tracking solution for objects. The connecting framework is Unity 5.

The idea mainly consists of having the DK2 camera in a fixed position relative to the second tracker and using this to sync the two coordinate systems. We then place objects, which are 1:1 virtual representations of the physical objects to be tracked, in front of the OVRVision camera feed textures to achieve a projection-like effect where the real object gets overlapped by its virtual counterpart.
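To make the syncing part concrete, below is a stripped-down Unity (C#) sketch of what we are doing. It's only a sketch of the idea: GetTrackerPose() is a placeholder for the third-party tracker's API, and the calibration values stand in for the transform we measure from the fixed camera/tracker mounting.

    using UnityEngine;

    // Maps poses from the electromagnetic tracker's coordinate system into
    // the Oculus/Unity world, using a fixed calibration transform measured
    // once from the known mounting of the DK2 camera relative to the tracker.
    public class TrackerToWorld : MonoBehaviour
    {
        // Rigid transform from tracker space to Unity world space,
        // determined during the sync step (placeholder values here).
        public Vector3 calibrationPosition = Vector3.zero;
        public Quaternion calibrationRotation = Quaternion.identity;

        // Placeholder for the third-party tracking API; returns the tracked
        // object's pose in tracker coordinates.
        private void GetTrackerPose(out Vector3 pos, out Quaternion rot)
        {
            pos = Vector3.zero;
            rot = Quaternion.identity;
        }

        void Update()
        {
            Vector3 trackerPos;
            Quaternion trackerRot;
            GetTrackerPose(out trackerPos, out trackerRot);

            // Apply the fixed calibration: rotate and translate the
            // tracker-space pose into world space, then drive the 1:1
            // virtual object with it.
            transform.position = calibrationPosition + calibrationRotation * trackerPos;
            transform.rotation = calibrationRotation * trackerRot;
        }
    }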

The project is quite young and we are currently still in a heavy prototyping phase. Being new to all this VR stuff, some questions came up during our tests that I thought would be good to have answered before we advance any further.

1. How stable is the absolute tracking point the camera uses to determine the position of the HMD? We are noticing that the measured position of the HMD tends to drift upwards during runtime, and the offset persists even after re-running the simulation. Only after completely restarting the service does the offset get reset. Has anybody else run into this problem?

2. Where exactly is the point on the HMD that gets measured by the positional tracking, and is it also the pivot point for the orientation? After taking a look at the code, I understand that we get a pose for each eye. Where exactly are these virtual eyes located on the HMD?

--OVRVision related--
3. I'm still looking for a way to measure how "good" a calibration is. Does anybody have any tips?

I tried searching the forums for augmented-reality-based OVR projects but didn't get lucky. If somebody has more resources or could point the way, it would be really appreciated!
Thanks in advance, guys!

6 Replies

  • 1. Sorry, no idea; I have experienced no such drift in my AR projects. Although, they never tend to run for longer than 10 minutes xD.

    2. If you look at the DK2 through a camera (while a Rift app is running) you can clearly see all the IR LEDs that the IR camera is tracking. There are 40 LEDs, I believe. This article explains how the IR camera and LEDs work: http://doc-ok.org/?p=1095. Make sure to read parts 2, 3, and 4 as well; they're all very enlightening.

    Page 16 of Oculus_Getting_Started_Guide_0.5.0.pdf, under advanced settings.
    The positions of the virtual cameras are defined by the user settings in the config tool. Specifically, the IPD setting determines the horizontal distance between the cameras, and the 'Measure' button calculates how close the eyes are to the lenses/screen.
    The height is assumed to be exactly halfway up the lenses.

    I'm not sure whether the cameras are placed at the center of the user's eyeballs or on the pupils; I assume whichever is more correct.
    But be aware that these measurements differ from person to person. If you want to check the actual eye poses yourself, see the Unity sketch at the end of this reply.

    3. I don't understand what a 'calibration' is.
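
    As promised above, here is a quick Unity sketch for seeing where the SDK actually puts the virtual eyes at runtime. I'm assuming the standard OVRCameraRig from the Oculus Unity integration; its anchors are driven by the SDK every frame.

        using UnityEngine;

        // Logs the per-eye poses the SDK derives from the user profile
        // (IPD, eye relief), relative to the rig's tracking origin.
        // Assumes an OVRCameraRig from the Oculus Unity integration.
        public class EyePoseLogger : MonoBehaviour
        {
            public OVRCameraRig rig;

            void Update()
            {
                // The local positions show where the virtual eyes sit
                // relative to the rig's origin.
                Debug.Log("Left eye:  " + rig.leftEyeAnchor.localPosition);
                Debug.Log("Right eye: " + rig.rightEyeAnchor.localPosition);
                Debug.Log("Center:    " + rig.centerEyeAnchor.localPosition);
            }
        }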
  • "Crabat" wrote:

    3. I'm still looking for a way to measure how "good" a calibration is. Does anybody have any tips?


    If you mean camera calibration, you could read:

    A Flexible New Technique for Camera Calibration - Zhengyou Zhang
    http://research.microsoft.com/en-us/um/ ... R98-71.pdf
  • Crabat
    Honored Guest
    "PhyterJet" wrote:
    1. Sorry, no idea; I have experienced no such drift in my AR projects. Although, they never tend to run for longer than 10 minutes xD.


    Yeah, in the beginning we thought it was caused by errors in our Unity integration but, like I said, restarting the Oculus service fixes it.

    "PhyterJet" wrote:

    2. If you look at the DK2 through a camera (while a Rift app is running) you can clearly see all the IR LEDs that the IR camera is tracking. There are 40 LEDs, I believe. This article explains how the IR camera and LEDs work: http://doc-ok.org/?p=1095. Make sure to read parts 2, 3, and 4 as well; they're all very enlightening.


    Thank you very much, will do!

    "PhyterJet" wrote:

    Page 16 of Oculus_Getting_Started_Guide_0.5.0.pdf, under advanced settings.
    The positions of the virtual cameras are defined by the user settings in the config tool. Specifically, the IPD setting determines the horizontal distance between the cameras, and the 'Measure' button calculates how close the eyes are to the lenses/screen.
    The height is assumed to be exactly halfway up the lenses.

    I'm not sure whether the cameras are placed at the center of the user's eyeballs or on the pupils; I assume whichever is more correct.
    But be aware that these measurements differ from person to person.


    Looks like I missed this somehow, thanks! Since we are working with the OVRVision, we tried to place the virtual cameras with a positional offset matching their physical location on the HMD.
    Playing around with the parameters during runtime, we managed to get a quite solid overlap... but sadly only from that one fixed view. Any kind of movement messes it up again. We currently think that either we are not handling the orientation correctly or our camera calibration (to unwarp the view) is off, which leads me to the follow-up question.

    "PhyterJet" wrote:

    3. I don't understand what a 'calibration' is.


    Before you can get to work with the OVRVision, you need to run a calibration process to unwarp the camera view, which consists of taking 25 pictures of a chessboard pattern on your monitor. This produces an XML file with the values needed to get the OVRVision running in Unity. The results of this calibration process can vary wildly, so I thought I'd ask how you can measure the "unwarpty-ness" of the resulting calibration settings (see the sketch at the end of this post for what I've come up with so far).

    "pixelminer" wrote:
    "Crabat" wrote:

    3. I'm still looking for a way to measure how "good" a calibration is. Does anybody have any tips?


    If you mean camera calibration, you could read:

    A Flexible New Technique for Camera Calibration - Zhengyou Zhang
    http://research.microsoft.com/en-us/um/ ... R98-71.pdf


    Thank you kind sir, I'll check it out!
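
    Update, in case it helps somebody else: the standard measure from that paper is the reprojection error. You project the known chessboard corners back through the calibrated camera model and check how many pixels off they land from the corners that were actually detected. Below is a rough, self-contained C# sketch of the idea; it assumes a simple pinhole model with two radial distortion coefficients, with the intrinsics taken from the OVRVision calibration XML.

        using System;
        using System.Collections.Generic;

        // Measures calibration quality as the RMS reprojection error:
        // project the known chessboard corners through the calibrated
        // camera model and check how far the projections land from the
        // corners actually detected in the images.
        static class CalibrationQuality
        {
            public static double RmsReprojectionError(
                List<double[]> cornersCameraSpace, // known 3D corners (camera coords)
                List<double[]> cornersDetected,    // matching detected 2D pixels
                double fx, double fy,              // focal lengths in pixels
                double cx, double cy,              // principal point
                double k1, double k2)              // radial distortion coefficients
            {
                double sumSquared = 0.0;
                for (int i = 0; i < cornersCameraSpace.Count; i++)
                {
                    double[] p = cornersCameraSpace[i];

                    // Perspective division onto the normalized image plane.
                    double x = p[0] / p[2];
                    double y = p[1] / p[2];

                    // Radial distortion, as in Zhang's camera model.
                    double r2 = x * x + y * y;
                    double distortion = 1.0 + k1 * r2 + k2 * r2 * r2;

                    // Back to pixel coordinates.
                    double u = fx * x * distortion + cx;
                    double v = fy * y * distortion + cy;

                    double du = u - cornersDetected[i][0];
                    double dv = v - cornersDetected[i][1];
                    sumSquared += du * du + dv * dv;
                }
                return Math.Sqrt(sumSquared / cornersCameraSpace.Count);
            }
        }

    If the calibration tool is built on OpenCV, cv::calibrateCamera already returns exactly this RMS value, so you may be able to read it off directly. An RMS around or below one pixel is usually considered a decent calibration.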
  • "Crabat" wrote:

    Since we are working with the OVRVision, we tried to place the virtual cameras with a positional offset matching their physical location on the HMD.

    Playing around with the parameters during runtime, we managed to get a quite solid overlap... but sadly only from that one fixed view. Any kind of movement messes it up again. We currently think that either we are not handling the orientation correctly or our camera calibration (to unwarp the view) is off, which leads me to the follow-up question.


    Well, I have seen this myself. The problem seems to be that the tracking origin gets modified during runtime. Normally this should only happen if you trigger the RecenterPose() function in the Oculus SDK, but it seems to happen spontaneously sometimes, which will lead to a messed-up calibration for any AR application trying to use the tracking data from the Oculus. I have posted a question to the guys at Oculus about this, but still no answer:
    viewtopic.php?f=20&t=20721
  • Crabat
    Honored Guest
    "pixelminer" wrote:

    Well, I have seen this myself. The problem seems to be that the tracking origin gets modified during runtime. Normally this should only happen if you trigger the RecenterPose() function in the Oculus SDK, but it seems to happen spontaneously sometimes, which will lead to a messed-up calibration for any AR application trying to use the tracking data from the Oculus. I have posted a question to the guys at Oculus about this, but still no answer:
    viewtopic.php?f=20&t=20721


    Interesting. Our current solution for this is attaching a sensor from the second tracking device at a measured position on the HMD. We then compare the positional and orientational information from this sensor with the data we get from the Oculus and move the virtual default tracking point accordingly to remove the offset (a sketch of that re-sync is below). The thing is, we aren't too sure where the physical "center" of the Oculus is and whether it's always the same between simulations. What we've seen so far: the more we move the Oculus away from the position where we synced, the bigger the offset gets again... Not as easy as I thought at first! :D

    I'll keep an eye on your thread though; hopefully we get an answer for that soon!
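
    For reference, here is roughly what our re-sync looks like in Unity. It's a minimal sketch: GetExternalSensorWorldPosition() stands in for our second tracking system's API, and sensorMountOffset is the position we measured for the sensor on the headset.

        using UnityEngine;

        // Continuously compares the head pose reported by the Oculus
        // tracking with the pose of an external sensor rigidly mounted on
        // the headset, and applies the difference as a corrective offset
        // to the scene's tracking space.
        public class TrackingResync : MonoBehaviour
        {
            public Transform trackingSpace;   // parent of the camera rig
            public Transform centerEyeAnchor; // Oculus-reported head pose
            public Vector3 sensorMountOffset; // measured sensor position on
                                              // the HMD, in head coordinates

            // Placeholder: HMD-mounted sensor pose from the second tracking
            // system, already mapped into Unity world coordinates.
            private Vector3 GetExternalSensorWorldPosition()
            {
                return Vector3.zero;
            }

            void LateUpdate()
            {
                // Where the Oculus thinks the sensor is, based on its own
                // head pose.
                Vector3 predicted = centerEyeAnchor.TransformPoint(sensorMountOffset);

                // The difference between the two systems is the accumulated
                // drift/offset.
                Vector3 error = GetExternalSensorWorldPosition() - predicted;

                // Shift the tracking space so the two systems agree again.
                trackingSpace.position += error;
            }
        }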
  • callard
    Honored Guest

    Hi Crabat,

    I am working with an Oculus CV1 and the OvrVision Pro too.

    It seems I am following your approach to build a mixed reality application.

    I was wondering if you have found a way to get the real position of the camera on the Oculus?

    My experience shows the video drifting relative to my virtual objects. I assume this comes from the delta between the real camera position and the Oculus position, right?

    Thanks for your help.
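
    In case it is useful, this is the kind of correction I am experimenting with at the moment: a minimal Unity sketch that anchors the camera-feed quad at the measured physical lens position instead of at the virtual eye. The lensOffset value is a placeholder I still have to measure on the headset.

        using UnityEngine;

        // Keeps the camera-feed quad locked to the eye anchor, offset by
        // the measured physical distance between the user's eye and the
        // OvrVision lens, to reduce parallax drift between the video and
        // the overlaid virtual objects.
        public class VideoFeedAnchor : MonoBehaviour
        {
            public Transform eyeAnchor;  // left or right eye anchor from the rig
            public Transform videoQuad;  // quad showing that eye's camera feed
            public Vector3 lensOffset = new Vector3(0.0f, 0.0f, 0.08f); // placeholder
            public float quadDistance = 1.0f; // virtual distance of the feed plane

            void LateUpdate()
            {
                // Position the feed as if it were seen from the physical
                // lens, not from the virtual eye.
                Vector3 lensWorld = eyeAnchor.TransformPoint(lensOffset);
                videoQuad.position = lensWorld + eyeAnchor.forward * quadDistance;
                videoQuad.rotation = eyeAnchor.rotation;
            }
        }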