Forum Discussion
racerx3
11 years ago · Honored Guest
3D Camera Lens pictures viewable in Rift?
Sorry for the new thread, but my searches are returning ambiguous answers.
Very simply: If I use a 3D camera lens to take a photograph, can I use VR player (or anything) to view the file in realistic 3D? OR, are the lenses too close to one another to get the correct perspective difference between the left/right eyes? Is the IPD a function of software adjustment, hardware, or both?
Example lens: http://www.amazon.com/Panasonic-12-5mm-3D-Interchangeable-Cameras/product-reviews/B0043VE292/ref=dp_top_cm_cr_acr_txt?showViewpoints=1
Thanks, guys.
28 Replies
- LegacyTooling (Honored Guest)
A 3D camera lens like the one linked is unlikely to give you a satisfactory 3D effect, because the lenses are too close together. IPD is the distance between the eyes; for a realistic 3D effect, the lenses need to be approximately the same distance apart as our eyes are. My understanding is that if the distance between the lenses is less than the distance between our eyes, the 3D effect is reduced, and if the lenses are further apart than our eyes, the 3D effect is magnified, but it also makes things look smaller than they actually are, so it appears less realistic.
As far as using the Rift to view 3D images, I'm sure it can be done, but I don't know the details of the programs and file formats. Getting a 3D image with a high enough FOV to justify using the Rift is non-trivial; I think GoPros with wide-angle lenses have been used successfully.
- geekmaster (Protege)
"LegacyTooling" wrote:
A 3d camera lens like the one linked is unlikely to give you a satisfactory 3d effect because the lenses are too close together.
The 3D effect depends on subject matter. Using lenses with small IPD works great for photographing scale models, making them appear life-size depending on scale factor. Scale models are cheaper to build if your budget cannot afford full-sized props, and a small-IPD camera is excellent for such purposes.
Even full-sized small scenes taken as closeups, such as photos of flowers and insects or other close-in subject matter, can provide amazing 3D with a small IPD camera.
Beware that the small IPD makes things appear larger than normal, which itself provides an interesting novel effect.
More to the point, the camera IPD does not change the quality of the perceptual "3D effect", though it does change the perceived size of the world being photographed.
- racerx3 (Honored Guest)
"geekmaster" wrote:
The 3D effect depends on subject matter. Using lenses with small IPD works great for photographing scale models, making them appear life-size depending on scale factor. Scale models are cheaper to build if your budget cannot afford full-sized props, and a small-IPD camera is excellent for such purposes.
Even full-sized small scenes taken as closeups, such as photos of flowers and insects or other close-in subject matter, can provide amazing 3D with a small IPD camera.
Beware that the small IPD makes things appear larger than normal, which itself provides an interesting novel effect.
More to the point, the camera IPD does not change the quality of the perceptual "3D effect", though it does change the perceived size of the world being photographed.
Thanks very much for the replies.
In theory, could the perceived scale be adjusted using only software? If I take a stereo image with small lenses and just 20mm spacing, could I then increase the distance with software IPD separation another ~40mm and get an accurate 3D effect and 1:1 scale?
- geekmaster (Protege)
"racerx3" wrote:
In theory, could the perceived scale be adjusted using only software? If I take a stereo image with small lenses and just 20mm spacing, could I then increase the distance with software IPD separation another ~40mm and get an accurate 3D effect and 1:1 scale?
You would need to extract a depth map from the stereoscopic image pair, then change the virtual IPD. The problem areas in the picture are where you need to synthesize pixels to fill the background areas obstructed in the original IPD, which become visible in the new altered IPD. This can be resolved by using more cameras. A good 360-degree 3D rig can be made with 12 outward-facing cameras, where stereoscopic panoramas may be extracted from overlapping portions of different cameras.
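The depth-map extraction step described above can be sketched with naive block matching: for each patch in the left image, find the horizontal shift that best matches the right image. This is a minimal, illustrative sketch (function name, window size, and search range are my own choices, not from the thread; real tools use far more robust stereo matchers):

```python
import numpy as np

def disparity_map(left, right, max_disp=16, win=5):
    """Naive SAD block matching: for each pixel, find the horizontal
    shift of the right image that best matches the left patch."""
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1]
            best, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y-half:y+half+1, x-d-half:x-d+half+1]
                sad = np.abs(patch - cand).sum()  # sum of absolute differences
                if sad < best:
                    best, best_d = sad, d
            disp[y, x] = best_d
    return disp

# Synthetic stereo pair: the right image is the left shifted 4 px leftward,
# as the whole scene would appear from a camera offset to the right.
rng = np.random.default_rng(0)
left = rng.random((40, 60))
right = np.roll(left, -4, axis=1)
d = disparity_map(left, right)
print(d[20, 30])  # recovered shift (4)
```

With a disparity (inverse depth) map in hand, each pixel can be re-projected for a new virtual IPD; the holes that appear are exactly the disoccluded regions discussed in this reply.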
Full high-accuracy object-digitizing image capture is often done with 64 inner-facing cameras, and there is software to create a depth map or even a full object mesh from a bunch of images like that provided by these cameras.
As we get better AI object-perception algorithms, we will be able to use lots of image views to build a 3D model of the photographed subject matter, much like how our brains build a visual model from the extremely low-FoV, time-sequential, foveated saccadic views from our eyes.
- racerx3 (Honored Guest)
"geekmaster" wrote:
The problem areas in the picture are where you need to synthesize pixels to fill the background areas obstructed in the original IPD, which become visible in the new altered IPD. This can be resolved by using more cameras.
OK, thanks for walking me through this. So, if I understand correctly, I've illustrated your point with the first two examples in the graphic I sent (20mm sep. / 40° lenses VS 40mm sep. / 40° lenses). The viewable face of the object is clearly larger with the 40mm separation.
It seems to me, however, that the same effect can be achieved by increasing the angle of the lens? Increasing to 45° seems to show more of the face of the object (see overlays)?
- geekmaster (Protege)
"racerx3" wrote:
It seems to me, however, that that same effect can be achieved by increasing the angle of the lens? Increasing to 45° seems to show more of the face of the object (see overlays)?
Not really. Think about small foreground objects like branches and twigs on trees. Moving the IPD inward or outward will reveal previously obstructed objects. You cannot easily fill in missing details from the other camera, because mid-ground and background objects also vary by parallax from the other camera's viewpoint.
Some pixel synthesis algorithms merely stretch and blend border pixels over the missing details, which is better than nothing, but looks wrong when visually inspecting such areas of the 3D image set. More cameras is the solution, but moving the IPD outside their boundary can cause the same problem on a larger scale.
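The "stretch border pixels over the hole" fallback can be sketched in one dimension: fill each disoccluded pixel with its nearest valid neighbour on the scanline. A hypothetical minimal version (real inpainting is far more sophisticated):

```python
import numpy as np

def fill_holes(row, mask):
    """Fill masked (disoccluded) entries of a scanline by repeating the
    nearest valid neighbour to the left, falling back to the right.
    Crude, which is why such fills look wrong on close inspection."""
    out = row.astype(float).copy()
    valid = np.where(~mask)[0]          # indices of known pixels
    for x in np.where(mask)[0]:         # indices of hole pixels
        left = valid[valid < x]
        right = valid[valid > x]
        out[x] = out[left[-1]] if left.size else out[right[0]]
    return out

row = np.array([5.0, 7.0, 0.0, 0.0, 9.0])
mask = np.array([False, False, True, True, False])
print(fill_holes(row, mask))  # holes take the value 7.0 from the left
```

The smeared values are plausible at a glance but carry no real parallax information, which is why extra camera viewpoints are the better fix.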
Sometimes you DO want to use an extra-wide FoV, making real world scenes appear as miniature sets (a classic example is the opening scenes for the 3D version of the Jack Black "Gulliver's Travels" movie, where a real world sequence is viewed as though through the eyes of a giant). Try watching that movie in 3D to see the "eyes of a giant" 3D perspective effect I mentioned above. This "extreme parallax" effect is completely lost when viewing a monoscopic (not 3D) film.
Different IPDs have their uses, depending on artistic intent.
"LegacyTooling" wrote:
A 3d camera lens like the one linked is unlikely to give you a satisfactory 3d effect because the lenses are too close together. IPD is the distance between the eyes and for a realistic 3d effect, the lenses need to be approximately the same distance apart as our eyes are.
The lenses do not need to match our eyes, and a smaller camera IPD is critically important when using scale models to represent full-sized objects, as I mentioned previously. Also, for close-up macro shots, a very small IPD is very important: it saves you from having to look painfully cross-eyed at tiny objects, and instead lets you view a virtually scaled-up model at a comfortable gazing distance (with a "more realistic 3D effect"). Viewing something tiny that is sitting on your nose cannot look good in 3D -- much better to view a larger version at arm's length, in my own personal experience.
"LegacyTooling" wrote:
My understanding is that if the distance between the lenses is less than the distance between our eyes, the 3d effect is reduced and if the lenses are further apart than our eyes, the 3d effect is magnified, but it also makes things look smaller than they actually are - and it appears less realistic.
Quite contrarily, it can provide a totally realistic "3D effect", though it does change your perception of relative sizes of your own body and/or subject matter, as I described in recent posts.
Sorry, but in my own literal logical linguistics experience, making a blanket statement like "unlikely to give you a satisfactory 3d effect because the lenses are too close together" expresses a weak attempt at research or imaginative investigation. The world of possibility is much larger than any of us can perceive, and it can be even more restricting when we limit our research to what we already know. IMHO, of course...
- racerx3 (Honored Guest)
OK, I believe I get the concept of why this won't work well for close-up objects. If using the larger separation, you're going to see further around the sides of an object you're staring directly at. That said, it seems to me that, by that logic, at some calculable distance the effect becomes negligible? If there was nothing closer than "x" inches to the lenses, could an accurate stereo image be captured? Wouldn't missing pixel information only occur for objects closer than "x"?
EDIT: Isn't this fundamentally a VR version of focal length?
- geekmaster (Protege)
"racerx3" wrote:
OK, I believe I get the concept for why this won't work well for close-up objects. ... EDIT: Isn't this fundamentally a VR version of focal length?
Huh? I said a small camera IPD is "critically important" for close-up objects, not that it "won't work well". Go back and read my posts above more carefully, taking my literal literary style into account.
IPD-related parallax and focal length are two DIFFERENT ways of perceiving depth. Not the same thing. Some people who cannot perceive stereoscopic (parallax based) 3D can perceive 3D depth based on how they focus their eyes.
Or did I misinterpret what you said? :o
- racerx3 (Honored Guest)
"geekmaster" wrote:
Not really. Think about small foreground objects like branches and twigs on trees. Moving the IPD inward or outward will reveal previously-obstructed objects. You cannot easily fill-in missing details from the other camera because mid-ground and background objects also vary by parallax from the other camera viewpoint.
Sorry, I guess I misunderstood this. I interpreted it as: by moving the lenses closer together, one sees less mid- and background detail, due to the larger apparent size (and therefore obstruction) of the foreground object; and inversely, by increasing the distance between the lenses (assuming the object in front doesn't fill your entire FOV), one sees more mid- and background detail, while the perceived size of the object decreases. EDIT: It's important to remember that I'm trying to achieve a 1:1 scale "feeling" with this photograph.
I was only making a comparison to focal length because, in my limited knowledge of (monoscopic) photography, the correct focal length can be determined by calculating the working distance between you and a subject.
- racerx3 (Honored Guest)
"racerx3" wrote:
If using the larger separation, you're going to see further around the sides of an object you're staring directly at. That said, it seems to me that, by that logic, at some calculable distance the effect becomes negligible? If there was nothing closer than "x" inches to the lenses, could an accurate stereo image be captured? Wouldn't missing pixel information only occur for objects closer than "x"?
EDIT: Isn't this fundamentally a VR version of focal length?
I'm still unclear about this after reading through this and other threads.
If no objects being photographed are closer than "x" distance to the camera, does the difference in lens separation remain noticeable in terms of scale of the objects? I get that for photographing scale models the smaller separation is desirable, but again, I want to maintain a feeling of 1:1 scale.
I guess I need to figure out a calculation by which I may determine how IPD and lens angle together affect perception of scale, and then ask whether the scale distortion, if any, can be processed on the software side. Does it sound like I'm asking the right question here?
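A starting point for that calculation, with parallel lenses: on-screen disparity of a point at depth Z is roughly f·B/Z (f = focal length in pixels, B = lens separation), and to first order the perceived scale is the ratio of the viewer's eye IPD to the camera baseline, so a 20 mm baseline viewed with 64 mm eyes makes the world feel about 3.2x larger. A hedged sketch (function names, the 1-pixel threshold, and the numbers are illustrative assumptions, not from the thread):

```python
def perceived_scale(eye_ipd_mm, baseline_mm):
    """First-order hypostereo/hyperstereo rule: the world appears
    roughly eye_ipd / baseline times its true size."""
    return eye_ipd_mm / baseline_mm

def disparity_px(focal_px, baseline_mm, depth_mm):
    """On-screen disparity of a point at depth_mm for a parallel rig."""
    return focal_px * baseline_mm / depth_mm

def negligible_depth_mm(focal_px, baseline_mm, min_disp_px=1.0):
    """Depth beyond which disparity drops below min_disp_px,
    i.e. the stereo effect becomes effectively flat ('x' in the thread)."""
    return focal_px * baseline_mm / min_disp_px

print(perceived_scale(64, 20))               # 20 mm baseline, 64 mm eyes
print(negligible_depth_mm(1000, 20) / 1000)  # metres, assuming a 1000 px focal length
```

By this rule, getting a 1:1 scale feeling means matching the baseline to the viewer's IPD, which is why software re-projection (synthesizing the wider-baseline views) is needed when the lenses are closer together than the eyes.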