Forum Discussion
kojack
12 years ago · MVP
Ogre3D Integration
I grabbed the Oculus SDK earlier today and have been working on a sample of using it with the open-source 3D engine Ogre3D.
I've got the rendering working fairly well, and I wrote a parser to load the Tuscany scene from the SDK. Head tracking is almost implemented.
Since I don't have an actual Oculus Rift, I'll give out a test version soon to see if it works for anyone else.
http://www.ogre3d.org/forums/viewtopic.php?f=5&t=76970

Edited in Latest Status:
06/04/2013
Source (MIT licensed) released at https://bitbucket.org/rajetic/ogreoculus
08/04/2013
Latest binary demo: https://bitbucket.org/rajetic/ogreoculus/downloads/OgreOculusDemo1_v0.5.1.7z
38 Replies
- Marbas (Honored Guest): Wow! Ogre3D integration. Looking forward to trying this.
- ganzuul (Honored Guest): Sweet! Hope you get your Rift soon. =)
- Binary time.
First up, this is just a testing demo to see whether the headset tracking and distortion are working on the real device. I removed the Tuscany scene; it looks cool, but even zipped it took up over 60MB. So this one is just a couple of the sample meshes that come with Ogre.
I'll get something much better looking (and better controls) later.
There's no real purpose to this demo if you don't have an Oculus Rift yet. It will run, but it's pretty boring (no physics, not much to look at).
OgreOculus V0.2 (7MB)
http://www.mediafire.com/?avunf95hf78yk37 Edit: Added 2 April 2013
Controls:
wasd - movement
1 - walking mode (clamped to the ground)
2 - flying mode
r - reset pitch of your body
escape - exit
mouse - look around
There are two orientations in the demo: your body and your head. The body is controlled by the mouse; the head is controlled by the Oculus Rift (if it's detected). They combine to give the actual view you see. Movement is relative to your body orientation only, so looking around with just your head won't change the direction you're moving.
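The body/head split above can be sketched with plain quaternion math. This is a self-contained illustration using a tiny stand-in struct rather than Ogre::Quaternion, and the convention of composing body first and head second is an assumption about how the demo combines them, not something taken from its source:

```cpp
#include <cassert>
#include <cmath>

// Minimal quaternion (w, x, y, z) -- a stand-in for Ogre::Quaternion,
// just to illustrate how body and head orientations can combine.
struct Quat {
    float w, x, y, z;
};

// Hamilton product: mul(a, b) applies rotation b first, then a.
Quat mul(const Quat& a, const Quat& b) {
    return {
        a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
    };
}

// Yaw (rotation about +Y) as a quaternion.
Quat yaw(float radians) {
    return { std::cos(radians*0.5f), 0.0f, std::sin(radians*0.5f), 0.0f };
}

// Rotate a 3D vector by a unit quaternion (q * v * conjugate(q)).
void rotate(const Quat& q, const float v[3], float out[3]) {
    Quat p{0.0f, v[0], v[1], v[2]};
    Quat qc{q.w, -q.x, -q.y, -q.z};  // conjugate = inverse for unit quaternions
    Quat r = mul(mul(q, p), qc);
    out[0] = r.x; out[1] = r.y; out[2] = r.z;
}

// View orientation = body (mouse yaw) with the headset's orientation on top.
Quat viewOrientation(const Quat& body, const Quat& head) {
    return mul(body, head);
}
```

The key point is that the movement direction would be computed by rotating the local forward vector with the body quaternion only, while the camera uses the combined quaternion, so head-looking never changes where you walk.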
Both OpenGL and DirectX9 are available in this demo. Either will work, but I'd recommend DirectX9, because in Ogre it lets you specify which monitor to use, while OpenGL mode starts wherever it wants to.
The monitor is chosen (assuming you're using DirectX9) in the Ogre config dialog that appears when you start the demo. Look for the "Rendering Device" option; it gives you a list of display outputs, one of which should be your Oculus Rift. You can also turn vsync on or off here (on is recommended), enable fullscreen (also recommended), and change the screen resolution (1280x800 is the native resolution of the Oculus devkit).
If anything goes wrong (mainly Oculus-related; I'm not trying to debug Ogre itself right now), please post (or send me) your ogre.log file.
- tbowren (Honored Guest): Thank you for posting this. I was very excited to see someone else working in Ogre. I was able to hook up head tracking this weekend but have not tried to set up the warping. I got it working, but only after realizing you can't put the GetOrientation() call in the frameRenderingQueued() function like the mouse and keyboard capture calls. Instead I had to put it in a main loop and then call renderOneFrame() after getting the latest sensor data.
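The loop ordering described here can be sketched with stubs standing in for the real Oculus SDK and Ogre calls. The function names below are illustrative stand-ins (the real calls would be something like SensorFusion::GetOrientation() and Ogre::Root::renderOneFrame()); the point is simply that the sensor is sampled immediately before each frame is rendered, rather than inside a frame listener where the sample can be a frame stale:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Records the order of calls so the pattern can be checked; in a real app
// these stubs would be the Oculus SDK and Ogre calls.
std::vector<std::string> callLog;

void pollHeadOrientation()      { callLog.push_back("poll"); }    // read freshest sensor sample
void applyOrientationToCamera() { callLog.push_back("apply"); }   // push it onto the camera
void renderOneFrame()           { callLog.push_back("render"); }  // kick off rendering

// Own the main loop: sample, apply, render -- every frame, in that order.
void runFrames(int n) {
    for (int i = 0; i < n; ++i) {
        pollHeadOrientation();
        applyOrientationToCamera();
        renderOneFrame();
    }
}
```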
Unfortunately, when I run your program I see two Sinbads standing side by side. I tried it in fullscreen and windowed at 1280 x 800. Are you doing the added frustum transform to accommodate the fact that the camera's centre of interest is not in the centre of the viewports on the 7-inch display?
- Did the head tracking work in mine? Yep, I'm probably not doing the centre of the viewports correctly. I'll do some playing around.
- tbowren (Honored Guest): Yes, the tracking worked very well. Interestingly, when I walk all the way up to Sinbad until the camera collides, he comes into perfect "registration". It actually looks very cool. His beard sticks right in your face. Very 3D! :)
Not sure if that helps; it could just be coincidence that the camera/viewports happen to display correctly at that distance.
- Cool. :)
I took a closer look at the output of the Tuscany demo, and I can see now how each eye is offset a bit. I've got the lens centre variable in my shader, but I just gave it 0.5, 0.5 for each eye. I'll see if I can calculate it (the SDK was a bit confusing around that bit, plus Ogre's viewports use different coordinates to the SDK), and add a runtime control of it for testing.
- New release, version 0.2:
http://www.mediafire.com/?avunf95hf78yk37
I've added in the lens centre calculations. It looks like it might be right, but I'll need somebody to test for me again. :)
If the value isn't right, you can now press 3 and 4 to move the value around. It writes the current value to the ogre.log as you change it (so if it's wrong, let me know what's in the log).
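As a rough reference for what the lens centre value feeds into, here is a sketch of the barrel-distortion function used by the SDK's example pixel shader, where the radius from the lens centre is scaled by k0 + k1·r² + k2·r⁴ + k3·r⁶. The coefficients and centre below are assumed DK1-style numbers, not values taken from this demo:

```cpp
#include <cassert>
#include <cmath>

// Barrel distortion around a lens centre, in one eye viewport's UV space.
// The k coefficients are assumed DK1-style values, not read from a device.
struct Distortion {
    float k[4];
    float lensCentreX, lensCentreY;
};

// Warp a UV coordinate away from the lens centre by the polynomial scale.
void distortUV(const Distortion& d, float u, float v, float& ou, float& ov) {
    float dx = u - d.lensCentreX;
    float dy = v - d.lensCentreY;
    float r2 = dx*dx + dy*dy;  // squared radius from the lens centre
    float scale = d.k[0] + r2*(d.k[1] + r2*(d.k[2] + r2*d.k[3]));
    ou = d.lensCentreX + dx*scale;
    ov = d.lensCentreY + dy*scale;
}
```

Shifting the lens centre (what keys 3 and 4 adjust here) moves the fixed point of this warp, which is why a wrong centre lines some objects up while pushing others out of registration.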
I also disabled camera pitch control with the mouse (vertical movement) when a headset is detected. You can still yaw with the mouse at any time (with or without a headset).
- tbowren (Honored Guest): OK, so I got a chance to test this. The first thing I noticed was that my tracking stopped working. That was because your old app didn't shut down cleanly and left the process in memory with hooks to the sensor, so you might want to look at that. Once I killed the task it was fine.
I was able to use the "3" and "4" keys to get Sinbad lined up, but at the expense of messing up the registration of the objects at the sides of the frame. It was interesting that once he was lined up in the centre, he stayed lined up no matter how far back I went.
The SDK talks about keeping the eyes looking parallel into the screen and adjusting the projection frustum for the display. It says the in-scene cameras just need to be offset in translation for each eye but remain parallel. This is probably the only way to make sure objects near and far line up correctly. The fact that he stayed lined up probably means your cameras are looking in the right direction. Perhaps some other value needs to adjust as the IPD changes?
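The parallel-camera setup the SDK describes can be sketched like this: each eye's camera is translated sideways by half the IPD, and the projection matrix is shifted horizontally instead of toeing the cameras in. The IPD and panel dimensions below are assumed DK1-style values, and the sign conventions are a guess, not what this demo actually uses:

```cpp
#include <cassert>
#include <cmath>

// Per-eye stereo setup: parallel cameras offset by half the IPD, plus a
// horizontal shift baked into the projection matrix. All numbers here are
// assumed DK1-style values, not read back from a real device.
struct EyeSetup {
    float cameraOffsetX;      // world-space translation from the body position
    float projectionOffsetX;  // horizontal projection-centre shift, in NDC units
};

EyeSetup makeEye(bool leftEye, float ipd, float hScreenSize, float lensSeparation) {
    float sign = leftEye ? -1.0f : 1.0f;
    EyeSetup e;
    e.cameraOffsetX = sign * ipd * 0.5f;
    // Distance from the centre of one half of the panel to the lens centre,
    // converted to projection units (mirrored for the other eye).
    float eyeShift = hScreenSize * 0.25f - lensSeparation * 0.5f;
    e.projectionOffsetX = -sign * 4.0f * eyeShift / hScreenSize;
    return e;
}
```

In Ogre, a horizontal projection shift like this can be applied with Frustum::setFrustumOffset rather than by editing the projection matrix by hand.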
The ogre.log says: 09:11:19: Oculus: Lens Centre Offset = -0.11525
- Thanks for trying it out.
"The first thing I noticed was that my tracking stopped working. That was because your old app didn't shut down cleanly and left the process in memory with hooks to the sensor, so you might want to look at that. Once I killed the task it was fine."
Hmm, so it was still running in the background but had no window? Weird. I'll see if I can find what would cause that.
I think I've got the rendering fixed now.
I wasn't offsetting the projection matrices.
Here's a comparison shot: the top is version 0.2 and the bottom is version 0.3.
And here's the latest release. Hopefully third time lucky (especially since I'm going to be busy for the rest of the week with work).
Version 0.3: http://www.mediafire.com/?8f2cjp9r2bk22yj