Forum Discussion
mastasky (Explorer) · 12 years ago
Displaying pre-distorted webcam images
Hi all,
This topic has been discussed at MTBS3D before, but there doesn't seem to be a conclusion yet.
The idea is to use a webcam (or 2) to capture stereoscopic images and pipe them through to the Rift. Obviously the Rift needs images with barrel distortion applied, so one way of achieving that is distorting the webcam images in software.
A conceptually simpler and latency-saving approach could be to place lenses in front of the webcam that have the exact inverse properties of the Oculus lenses. That would distort the pictures without any software transformation and result in essentially zero lag. I realise that once the Oculus consumer version comes out, new lenses may have to be used; in that sense a software transformation is easier to adapt, but I'd still like to look into the hardware solution.
Has someone experimented with the Oculus lenses, or does anyone have the specs for them? Once my dev kit arrives I could measure the lenses and purchase/produce "inverse" lenses to achieve the effect, but I was wondering if anyone already had that kind of information. My hope is that Oculus uses lenses with standard properties that can simply be purchased from an optics provider.
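For comparison, the software route can be sketched in a few lines. This is a minimal, illustrative warp using the common polynomial model r' = r(1 + k1 r² + k2 r⁴) with nearest-neighbour sampling; the k1/k2 values here are made up for the example, not official Rift coefficients.

```python
import numpy as np

def barrel_distort(img, k1=0.22, k2=0.24):
    """Apply barrel distortion to one eye's image (illustrative only).

    For each destination pixel, sample the source further from the centre,
    which compresses the displayed content toward the middle (barrel warp).
    """
    h, w = img.shape[:2]
    # normalized coordinates in [-1, 1], centred on the image middle
    y, x = np.mgrid[0:h, 0:w]
    nx = (x - w / 2) / (w / 2)
    ny = (y - h / 2) / (h / 2)
    r2 = nx * nx + ny * ny
    scale = 1 + k1 * r2 + k2 * r2 * r2
    # map back to pixel coordinates, clamped to the image bounds
    sx = np.clip((nx * scale * (w / 2) + w / 2).astype(int), 0, w - 1)
    sy = np.clip((ny * scale * (h / 2) + h / 2).astype(int), 0, h - 1)
    return img[sy, sx]
```

A real implementation would precompute the remap grid once and use bilinear interpolation, but even this version shows the per-frame cost is one full-resolution resampling pass.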
Cheers
8 Replies
- Justin (Honored Guest)
Wouldn't it be easier and cheaper to just distort the images with software? The work has already been done. The only real advantage you would get from distorting the image with a lens is an unnoticeable decrease in latency.
- mastasky (Explorer)
Currently exploring both options. A lens is a very low-tech approach and we could then just pipe the images through, so I think it's worth exploring.
- KuraIthys (Honored Guest)
"Justin" wrote:
Wouldn't it be easier and cheaper to just distort the images with software? The work has already been done. The only real advantage you would get from distorting the image with a lens is an unnoticeable decrease in latency.
I wouldn't presume that 0 latency would be a bad thing...
The image distortion in software might be quick, but a lens is effectively instant (the only delay it introduces is governed by the speed of light in its material, still well under a millionth of a second).
If the warping code adds 1 ms or less of latency it might not matter, admittedly, but given that current best-case latency is 25-55 ms or more, anything that can reduce it would definitely help.
- lazydodo (Honored Guest)
I'd say measure how long the warping shader takes; there's no need to sink a pile of time and money into custom lenses if you're only saving yourself a few microseconds. If it runs in the milliseconds, sure, it's worth a shot, but I'd be really surprised if that were the case.
- KuraIthys (Honored Guest)
"lazydodo" wrote:
I'd say measure how long the warping shader takes; there's no need to sink a pile of time and money into custom lenses if you're only saving yourself a few microseconds. If it runs in the milliseconds, sure, it's worth a shot, but I'd be really surprised if that were the case.
Well, you might say that, but on my (slow) laptop, turning on the image warp caused a drop of 2-3 fps (out of something that was about 15-21 to start with.)
That means compared to running without warp, 20 fps became about 17.
20 fps = 50 milliseconds per frame
17 fps = 58.8 milliseconds per frame.
My system is way too slow to be representative, but an extra 8 milliseconds of rendering time (or about 15%) is pretty big.
Now, rendering time isn't equal to latency, but unless your rendering engine has some creative workarounds in it, the overall latency consists of:
input hardware latency + input processing latency + Rendering latency + Screen output latency.
So... an extra 15% rendering time is a definite, and substantial increase in latency.
(of course, if other hardware is more efficient at calculating the warping it might be less than that, and also if you render at 60 fps, it would imply you could render at 69 fps without the warping, but also that the warp process itself adds about 2.5 milliseconds to the rendering of each frame.)
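The frame-time arithmetic above can be checked directly (numbers taken from the post; the post's "69 fps" is a rounded approximation, but the ~2.5 ms warp cost at 60 fps follows from the same ratio):

```python
def frame_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

# figures measured on the poster's laptop
no_warp_ms = frame_ms(20)                  # 50.0 ms per frame
warped_ms = frame_ms(17)                   # ~58.8 ms per frame
warp_overhead_ms = warped_ms - no_warp_ms  # ~8.8 ms extra per frame

# assume the warp costs the same fraction of the frame at 60 fps
warp_fraction = warp_overhead_ms / warped_ms      # 0.15 exactly (= 1 - 17/20)
warp_cost_at_60fps = frame_ms(60) * warp_fraction # 2.5 ms per frame
```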
So... it would seem the image warp (unless it's vastly more efficient on newer hardware) would only start to become insignificant once you reach about 240 fps or so. But at those kinds of speeds, the entire rendering process itself starts to look like an insignificant fraction of overall latency.
- willsmith (Honored Guest)
I'm working on a telepresence project for the Oculus, and a lens mount that splits the image and applies the barrel distortion would be ideal. If I'm taking two separate video feeds, applying barrel distortion, combining them into a single feed, and transmitting that to the Oculus wirelessly, it's going to add a bunch of latency.
If you hear anything, mastasky, please keep us updated.
- cybereality (Grand Champion)
Getting special lenses (i.e. fisheye) could be a good solution to this problem. You can find them relatively cheap (the ones for phones), and they should get you halfway there. I mean, it will be difficult to get an exact match to the Rift, but it may be close enough to be useful.
- geekmaster (Protege)
You may be able to hack a pair of fisheye lenses into a stereoscopic fisheye lens as described here:
http://www.mtbs3d.com/phpBB/viewtopic.p ... 66#p103869
This is the result of the proposed lens hack: [image]
This proposed lens hack is modeled after an off-the-shelf commercial stereoscopic lens: [image]
You do not want the lens centers farther apart than your IPD (after expanding the full video output image to 6 inches wide for the Rift).
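That constraint translates into a maximum pixel separation between the two lens centres in the rendered image. The numbers below are assumptions for illustration (1280-pixel-wide DK1-era output and a typical 64 mm IPD), not figures from the thread:

```python
# "6-inch wide" full video output image, per the post
SCREEN_WIDTH_MM = 6 * 25.4      # 152.4 mm
PANEL_WIDTH_PX = 1280           # assumed horizontal resolution
IPD_MM = 64.0                   # assumed typical adult IPD

# lens centres must not map farther apart than the IPD on screen
max_center_sep_px = IPD_MM / SCREEN_WIDTH_MM * PANEL_WIDTH_PX  # ~538 px
```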