Forum Discussion

HWiese
12 years ago

[edited] Parallel cameras - question about convergence?

Hi folks,

Meanwhile I've managed to couple two webcams with the Rift using Willow Garage's ROS, the oculus_driver by OTL (GitHub project), and the usb_cam node... but before I get lost in details, let's just say I've managed to present appropriately distorted pictures from the webcams on the Rift. It works. Well, somehow. There's one issue I'd like to discuss and need help with - otherwise I wouldn't have to write this post (though I could have anyway, just to keep you informed about my progress)...

I'm babbling, so long story kind of short...

The two cameras are mostly identical. ROS rectifies the raw images from the two cameras and then distorts the rectified images for the Rift. But there's a problem: the pictures don't overlap. It's like squinting outward... reverse squinting... call it whatever you like. The cameras are mounted to the carrier unit so that their fixation points on the horizon are the same distance apart as the cameras themselves; their lines of sight are parallel and never cross.

I can solve the problem of non-overlapping images by turning the cameras towards each other so that they fixate on a specific point, which then overlaps perfectly in the two images and appears in perfect 3D. But that only works for objects at that fixation point.

So, the actual question is: how do games solve this? Only a point at a certain distance from the player (where the lines of sight of the two cameras cross) overlaps in a way our brains can fuse into 3D; objects between the player and the fixation point, or between the fixation point and the horizon, appear doubled. The Rift does not provide any information about eye movement, and I cannot mechanically move the cameras to always fixate on whatever object the user is currently looking at.

I hope I've explained my problem understandably. It's a bit difficult... for illustration purposes I've added two pictures (hopefully) showing what I mean...

Can anybody tell me what I'd have to do with the pictures of the cameras to solve this issue?

//edit 1: I've found out that the actual term for this is apparently convergence. To me it looks like the two distorted pictures are simply too far apart - I seem to have some kind of horizontal offset. Can this be caused by a difference between the cameras' FOV and the FOV of the Oculus Rift? The two cameras that I use definitely have a different FOV.
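For what it's worth, the horizontal-offset idea can be tried out purely in software before touching the camera mounts. Here is a minimal sketch (assuming the rectified frames arrive as NumPy arrays, e.g. via cv_bridge; the `offset` value is a hypothetical placeholder to tune by eye): nudging each eye's image toward the nose pulls the virtual convergence point closer.

```python
import numpy as np

def shift_horizontal(image, pixels):
    """Shift an image horizontally by `pixels` columns
    (positive = right), filling the vacated columns with black."""
    shifted = np.zeros_like(image)
    if pixels > 0:
        shifted[:, pixels:] = image[:, :-pixels]
    elif pixels < 0:
        shifted[:, :pixels] = image[:, -pixels:]
    else:
        shifted[:] = image
    return shifted

# Dummy stand-ins for the rectified camera frames.
h, w = 480, 640
left_image = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
right_image = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)

# Hypothetical offset in pixels: shift the left image right and the
# right image left to move the convergence point closer.
offset = 20
left_eye = shift_horizontal(left_image, offset)
right_eye = shift_horizontal(right_image, -offset)
```

A negative offset would push the convergence point further away instead; the right value depends on the camera baseline and FOV, so it has to be tuned experimentally.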

Thanks a lot!

Cheers,
Hendrik

2 Replies

  • Just to answer this myself... I've figured out what was wrong with my setup: the two cameras were swapped. Strangely so, because I know I had both cameras on the correct USB ports and - through Linux udev - also the correct /dev files. In any case, they were swapped; I switched them and now everything is working. Well... aside from another problem caused by the distance between the two cameras: they are too far apart, so things very close and things very far away are still seen double. But that's just a matter of adjustment.

    It's working! I can see myself from a distance! It's a strange feeling, kind of like an out-of-body experience... :mrgreen: :mrgreen: :mrgreen:
  • Depth is perceived through parallax, and that separation between the images is the parallax. The fact that you are seeing two images means there is too much parallax and your eyes cannot fuse them. Even if the images are swapped, they should still fuse. Are you sure the cameras are aligned properly? Be certain your cameras are parallel - they could have been diverged (both pointing away from each other). Try taking them outside and aligning them to a faraway building; that will fix them parallel.
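To put rough numbers on the parallax point above: for two parallel cameras, the on-sensor disparity of a point shrinks with distance, which is why a parallel rig fuses at the horizon but doubles up close. A back-of-the-envelope sketch (all figures are illustrative assumptions, not values from this thread):

```python
def disparity_px(focal_px, baseline_m, depth_m):
    """On-sensor disparity (in pixels) of a point at depth_m metres,
    seen by two parallel cameras baseline_m apart, with focal length
    focal_px expressed in pixels: d = f * B / Z."""
    return focal_px * baseline_m / depth_m

# Illustrative numbers: a 640 px wide sensor with ~60 degrees of
# horizontal FOV has a focal length of roughly 554 px
# (f = (w/2) / tan(FOV/2)); baseline 6.5 cm, about eye distance.
f_px = 554.0
baseline = 0.065

near = disparity_px(f_px, baseline, 0.5)    # large: hard to fuse
far = disparity_px(f_px, baseline, 100.0)   # near zero: fuses easily
```

With these assumed numbers a point half a metre away lands about 72 px apart in the two images, while a building 100 m away is well under a pixel apart - so a wider-than-eyes camera baseline makes close objects double first, which matches the experience in the first reply.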