Forum Discussion
Harley
12 years ago (Honored Guest)
Raspberry Pi camera module - Could it be modded for Oculus?
The Raspberry Pi project's camera module goes on sale today. Could it be modded cheaply as an input for the Oculus Rift?
http://www.raspberrypi.org/archives/3890
Use two for stereoscopic 3D, and just mod the cameras with cheap fish-eye lenses to get an FOV closer to that of the Rift.
I guess you would still also require a separate microcontroller with a digital camera interface for the USB connection to your PC (rather than connecting a full Raspberry Pi to act as the microcontroller) if you wanted to mount them directly on or inside the Oculus Rift with as low latency as possible for good augmented reality use?
These are otherwise tiny modules at a very low price, now around $25, with great image quality and a high frame rate.
A five-megapixel fixed-focus shooter, the module measures 25 x 20 x 9 mm (the sensor package itself is only 8 x 8 x 5 mm), can snap 2,592 x 1,944-pixel stills, and captures video at 1080p (30 fps), 720p (60 fps), and VGA (60 or 90 fps).
http://www.raspberrypi.org/wp-content/uploads/2013/02/camerafront.jpg
http://www.raspberrypi.org/wp-content/uploads/2013/02/cameraback.jpg
You can buy this camera module from RS Components or from Premier Farnell/Element14:
http://uk.rs-online.com/web/generalDisplay.html?id=raspberrypi
http://www.element14.com/community/groups/raspberry-pi
14 Replies
- edzieba (Honored Guest)
"Harley" wrote:
I guess you would still require a separate microcontroller with digital camera interface for USB connection to your PC (without connecting a full Raspberry Pi to use it as the micro controller)
Why NOT use the Pi? It can stream the video over a network connection to the PC.
When my cameras arrive, I've got two projects lined up: outside-in marker tracking (with the Pis doing the blob-centre locating and just streaming coordinate data to a PC for processing) and gaze tracking, assuming the camera's FFC is manipulable enough.
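The blob-centre idea edzieba describes can be sketched in a few lines. This is a minimal illustration, not his actual code: the threshold value and the UDP host/port are made-up placeholders, and a real tracker would work on live camera frames rather than the synthetic frame built at the bottom (`cv2.moments` on a binary mask computes the same centroid).

```python
import socket
import struct

import numpy as np

def blob_centre(frame_gray, thresh=200):
    """Centroid (x, y) of all pixels brighter than `thresh`,
    or None if no pixel clears it."""
    ys, xs = np.nonzero(frame_gray > thresh)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

def send_centre(sock, addr, centre):
    """Pack the coordinate pair as two floats and send it to the
    PC over UDP -- a few bytes per frame instead of raw video."""
    sock.sendto(struct.pack("!ff", *centre), addr)

# Synthetic test frame: one bright 10x10 marker blob.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:110, 300:310] = 255
centre = blob_centre(frame)  # -> (304.5, 104.5)
```

The point of the design is that the Pi only ships a coordinate pair per frame over the network, so the limited uplink bandwidth and the PC-side decoding latency both drop out of the loop.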
Of course, the module is currently firmware-limited to 30 fps. The Foundation is working on getting 720p60 and 640x480 at 90 fps working, but the driver is closed-source, so it can only be worked on internally at Broadcom.
- Harley (Honored Guest)
"edzieba" wrote:
"Harley" wrote:
I guess you would still require a separate microcontroller with digital camera interface for USB connection to your PC (without connecting a full Raspberry Pi to use it as the micro controller)
Why NOT use the Pi? It can stream the video over a network connection to the PC.
Because of latency, which is very important for augmented reality. That is, unless you connect your Oculus Rift directly to the Raspberry Pi via HDMI of course, which would probably work fine for simple augmented reality and be good for mobile prototyping, but would not work for more advanced augmented reality applications and games, and would certainly not be capable of augmented reality on top of high-resolution, high-frame-rate gameplay.
I just can't imagine that using a Raspberry Pi as the microcontroller to capture the video data and send it as a stream over the network would have lower latency than using a USB-attached microcontroller for the camera to connect it directly to the PC, which in turn has the Oculus Rift connected to it via HDMI/DVI.
"edzieba" wrote:
Of course, the module is currently firmware limited to 30fps. The foundation is working on getting 720p60 and 640x480 90fps working, but the driver is closed-source so can only be worked on internally to Broadcom.
Broadcom is known to be one of the more open-source-friendly companies out there, and they have previously released many drivers, firmwares, and libraries under open source licenses.
I am sure that with enough 'pestering' from the Raspberry Pi community they could be convinced to release all the code for these camera modules too.
- Harley (Honored Guest)
Interesting article:
The Tantalizing Possibilities of an Oculus Rift Mounted Camera
http://www.roadtovr.com/2013/05/14/oculus-rift-camera-mod-lets-you-bring-the-outside-world-in-5819
- Harley (Honored Guest)
A camera mounted on the Oculus Rift would be cool to use as an intuitive input method, via hand tracking, to click on a HUD.
It will, however, require the computing power of a fast GPU in a modern PC, an area where the Raspberry Pi is lacking too.
The OpenNI API (a standard framework for 3D sensing) and this 3D hand-tracking library and framework use OpenCL or CUDA:
http://www.openni.org/files/3d-hand-tracking-library
http://cvrlcode.ics.forth.gr/handtracking
The zigfu development kit for Unity 3D works with OpenNI:
http://www.openni.org/files/zdk-for-unity3d
Or you could just use Unity 3D's WebCamTexture script to get textures onto which the live video input is rendered.
http://docs.unity3d.com/Documentation/ScriptReference/WebCamTexture.html
http://www.youtube.com/watch?v=N3ffgj1bBGw
http://www.youtube.com/watch?v=kK0BQjItqgw
- ZeRax (Honored Guest)
Hand-tracking with Oculus Rift. :D
This would free the hands for a pair of haptic gloves like this; being able to touch and feel in VR will be the ultimate experience.
http://www.neurovr.org/neurovr2/index.php?option=com_content&task=view&id=35&Itemid=46
"Harley" wrote:
OpenNI (standard framework for 3D sensing) API and 3D hand tracking library and framework uses OpenCL or CUDA:
http://www.openni.org/files/3d-hand-tracking-library
http://cvrlcode.ics.forth.gr/handtracking
zigfu development kit for Unity 3D works with OpenNI:
http://www.openni.org/files/zdk-for-unity3d
or you could just use Unity 3D WebCam Textures script to get textures onto which the live video input is rendered.
http://docs.unity3d.com/Documentation/ScriptReference/WebCamTexture.html
- edzieba (Honored Guest)
"Harley" wrote:
"edzieba" wrote:
"Harley" wrote:
I guess you would still require a separate microcontroller with digital camera interface for USB connection to your PC (without connecting a full Raspberry Pi to use it as the micro controller)
Why NOT use the Pi? It can stream the video over a network connection to the PC.
Because of latency, which is very important for Augmented Reality. I just can't imagine that using a Raspberry Pi as the microcontroller to capture the video data and sending it as a stream over the network would have lower latency than using a USB-attached microcontroller for the camera to connect it directly to the PC, which in turn has the Oculus Rift connected to it via HDMI/DVI.
This isn't a webcam, it's a MIPI CSI camera. The bandwidth of the raw pixel stream is an order of magnitude more than what USB could ever hope to carry. You can encode it to h.264 before transmission, though no microcontroller or embedded processor will be grunty enough to do so without a dedicated encoder block. Luckily, the Pi's SoC has just such a dedicated encoder. Have the Pi handle the encoding of the raw video stream, then send it to a PC over Ethernet instead of USB. That's not going to add any more latency than encoding on-board and sending it over USB (which you'd have to bodge on anyway, as the Pi can only act as a USB host).
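edzieba's claim that the raw CSI stream outstrips USB can be sanity-checked with some back-of-envelope arithmetic. The figures below are ballpark assumptions, not datasheet values: 10-bit raw Bayer output per pixel and roughly 280 Mbit/s of usable USB 2.0 throughput.

```python
def raw_mbit_per_s(width, height, fps, bits_per_pixel=10):
    """Uncompressed sensor bandwidth in Mbit/s
    (assumes raw Bayer data, 10 bits per pixel)."""
    return width * height * fps * bits_per_pixel / 1e6

USB2_USABLE_MBIT = 280  # rough real-world USB 2.0 throughput (assumption)

full_frame = raw_mbit_per_s(2592, 1944, 15)  # full 5 MP readout: ~756 Mbit/s
hd_1080p30 = raw_mbit_per_s(1920, 1080, 30)  # 1080p30 mode:      ~622 Mbit/s
```

Either mode comfortably exceeds what USB 2.0 can actually carry, while an h.264-encoded stream at a few Mbit/s fits with room to spare, which is exactly why the suggestion is to let the Pi's hardware encoder compress before transmission.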
As to whether displaying the video on-board the Pi will have lower latency: theoretically, yes. But the Pi can only handle a single camera, so you'd be stuck with mono video, and performing the warp shader on the ARM (or the ARM's GPU) might take significantly longer than if you did so on a desktop.
"Harley" wrote:
Broadcom is known to be one of the more open source friendly companies out there, and they have previously released many drivers, firmwares, and libraries under open source licenses.
Ideally, but Broadcom have so far refused to open source anything that operates on the GPU side of things. They've been pretty adamant in not allowing anyone enough access to write their own CSI drivers either. This is why using other camera modules with the Pi is vanishingly unlikely, and similarly why there aren't more barebones camera modules available for a similar cost.
"Harley" wrote:
I am sure that with enough 'pestering' from the Raspberry Pi community they could be convinced to release all the code for these camera modules too.
- hilderonny (Honored Guest)
What about using a 3D smartphone like the HTC Evo 3D or LG P920 Optimus 3D? They both are (were) Android phones which can stream a stereo camera preview to a PC.
Have a look here (sorry, it is a German website): http://www.htcinside.de/htc-evo-3d/
- nomad (Honored Guest)
Hi all,
this is my first visit...
I think it's possible.
At the moment I have a Raspberry Pi "wheezy" revision 2 board.
I know the resources of the RasPi are small, but for prototyping it's OK.
On this system I can run:
- Wiimote (Python)
- OpenGL 3D stuff
- OpenCV for face detection (colour blobs) with a USB webcam
- a Code Mercenaries JoyWarrior24 Force 8 3D gyroscope with a USB connection to my Linux box
- control via the SDL joystick interface, a very simple interface
Now I'm waiting on some hardware devices...
In the meantime I'm doing some OpenGL development.
Excuse my bad English.
Regards, nomad
- Jstsqzd (Honored Guest)
I'm not really a software guy, but here's my idea: a tactical vest with some lightweight batteries and a Pi. Then couldn't you just pass through the two video streams with a simple distortion filter? That would not require head tracking or heavy processing power. Then overlay a nice-looking HUD.
For espionage-quality night vision, integrate infrared filters and a bright IR flashlight.
Getting more complex, put optical (or even just digital) zoom on the cameras for sweet espionage missions.
Throw in walkie-talkie integration and I may just have to go back to being 10 years old again!
No joke, if anybody wants to work on a prototype I have mad fabrication skills...
- haberkern (Honored Guest)
Add the Razer Hydra and a quadrotor and you have my project ;) Good luck, let me know how it goes.