Forum Discussion

OpenSceneGraph
Constellation · Adventurer · 12 years ago
I've started working on an integration with OpenSceneGraph (http://www.openscenegraph.org/), and although I don't have a Rift to test with yet, what I've done looks pretty close to the Oculus World demo, so I'm hoping I've done it right. If there are any OSG developers out there who could test my code, I'd really appreciate it.

I found GLSL versions of the shaders in the OpenHMD project (http://openhmd.net/); they are referenced below as vertexShaderSource and fragmentShaderSource. I based my work on the night-vision effect example from the OpenSceneGraph Cookbook (http://www.packtpub.com/openscenegrap-3-for-advanced-3d-programming-using-api-cookbook/book), and my code uses the functions from that example to create the RTT and HUD cameras as well as the quads. The source is available at https://github.com/xarray/osgRecipes.

I have pseudocode where the scene is added to the cameras (step 5 in the cookbook) and where the cameras are added to the root node (step 7 in the cookbook), because I'm also integrating with a derived type of renderer, but you should be able to replace them with what's in the example.
// Because I don't have a Rift yet, I just created an OVR::HMDInfo and an
// OVR::Util::Render::StereoConfig and guessed at some values.
myHMDInfo.HResolution = 1280;
myHMDInfo.VResolution = 800;
myHMDInfo.HScreenSize = 0.14976;
myHMDInfo.VScreenSize = 0.0935;
myHMDInfo.VScreenCenter = myHMDInfo.VScreenSize / 2;
myHMDInfo.EyeToScreenDistance = 0.05; //????
myHMDInfo.LensSeparationDistance = 0.03; //????
myHMDInfo.InterpupillaryDistance = 0.025; //?????????
myStereoConfig.SetHMDInfo(myHMDInfo);
const int textureWidth = myHMDInfo.HResolution / 2;
const int textureHeight = myHMDInfo.VResolution;
// Setup textures for the RTT cameras
osg::ref_ptr<osg::Texture2D> leftEyeTex2D = new osg::Texture2D;
leftEyeTex2D->setTextureSize( textureWidth, textureHeight );
leftEyeTex2D->setInternalFormat( GL_RGBA );
osg::ref_ptr<osg::Texture2D> rightEyeTex2D = new osg::Texture2D;
rightEyeTex2D->setTextureSize( textureWidth, textureHeight );
rightEyeTex2D->setInternalFormat( GL_RGBA );
// Initialize RTT cameras for each eye
osg::ref_ptr<osg::Camera> leftEyeRTTCamera = osgCookBook::createRTTCamera(
osg::Camera::COLOR_BUFFER, leftEyeTex2D.get());
OVR::Util::Render::StereoEyeParams leftEyeParams =
myStereoConfig.GetEyeRenderParams(OVR::Util::Render::StereoEye_Left);
osg::Matrixf leftEyeTranslation = osg::Matrixf::translate(
    leftEyeParams.ViewAdjust.M[0][3], leftEyeParams.ViewAdjust.M[1][3],
    leftEyeParams.ViewAdjust.M[2][3]);
leftEyeRTTCamera->setViewMatrix(leftEyeTranslation);
leftEyeRTTCamera->setProjectionMatrixAsPerspective(
myStereoConfig.GetYFOVDegrees(), myStereoConfig.GetAspect(), 0.3, 5000);
osg::ref_ptr<osg::Camera> rightEyeRTTCamera = osgCookBook::createRTTCamera(
osg::Camera::COLOR_BUFFER, rightEyeTex2D.get());
OVR::Util::Render::StereoEyeParams rightEyeParams =
myStereoConfig.GetEyeRenderParams(OVR::Util::Render::StereoEye_Right);
osg::Matrixf rightEyeTranslation = osg::Matrixf::translate(
    rightEyeParams.ViewAdjust.M[0][3], rightEyeParams.ViewAdjust.M[1][3],
    rightEyeParams.ViewAdjust.M[2][3]);
rightEyeRTTCamera->setViewMatrix(rightEyeTranslation);
rightEyeRTTCamera->setProjectionMatrixAsPerspective(
myStereoConfig.GetYFOVDegrees(), myStereoConfig.GetAspect(), 0.3, 5000);
// add your scene to the camera, should be something like:
// leftEyeRTTCamera->addChild( scene.get() );
// rightEyeRTTCamera->addChild( scene.get() );
// Create HUD cameras for each eye
osg::ref_ptr<osg::Camera> leftEyeHUDCamera =
osgCookBook::createHUDCamera(0.0, 1.0, 0.0, 1.0);
osg::ref_ptr<osg::Camera> rightEyeHUDCamera =
osgCookBook::createHUDCamera(0.0, 1.0, 0.0, 1.0);
// Create quads on each camera
leftEyeHUDCamera->addChild( osgCookBook::createScreenQuad(
1.0f, 1.0f) );
leftEyeHUDCamera->setViewport(
new osg::Viewport(0, 0, myHMDInfo.HResolution / 2, myHMDInfo.VResolution));
rightEyeHUDCamera->addChild( osgCookBook::createScreenQuad(
1.0f, 1.0f) );
rightEyeHUDCamera->setViewport(
new osg::Viewport(myHMDInfo.HResolution / 2, 0, myHMDInfo.HResolution / 2, myHMDInfo.VResolution));
// Set up shaders (from the OpenHMD project)
osg::ref_ptr<osg::Program> program = new osg::Program;
program->addShader( new osg::Shader(osg::Shader::VERTEX,
vertexShaderSource) );
program->addShader( new osg::Shader(osg::Shader::FRAGMENT,
fragmentShaderSource) );
// Configure state sets for both eyes
osg::StateSet* leftEyeStateSet = leftEyeHUDCamera->getOrCreateStateSet();
leftEyeStateSet->setTextureAttributeAndModes( 0, leftEyeTex2D.get() );
leftEyeStateSet->setAttributeAndModes( program.get() );
leftEyeStateSet->addUniform( new osg::Uniform("warpTexture", 0) );
osg::StateSet* rightEyeStateSet = rightEyeHUDCamera->getOrCreateStateSet();
rightEyeStateSet->setTextureAttributeAndModes( 0, rightEyeTex2D.get() );
rightEyeStateSet->setAttributeAndModes( program.get() );
rightEyeStateSet->addUniform( new osg::Uniform("warpTexture", 0) );
// Add everything to the root, should be something like:
//osg::ref_ptr<osg::Group> root = new osg::Group;
//root->addChild( leftEyeRTTCamera.get() );
//root->addChild( leftEyeHUDCamera.get() );
//root->addChild( rightEyeRTTCamera.get() );
//root->addChild( rightEyeHUDCamera.get() );
// (The scene itself is already attached to the RTT cameras above,
// so it should not be added to the root a second time.)
54 Replies
- roalt (Honored Guest): Hi Jeff, thank you for your start on using the Oculus Rift with OpenSceneGraph. On what platform do you expect it to work: Windows, Mac, or Linux? I would think it would only work on Windows or Mac, as the SDK is only out for those platforms? I've no experience building OSG, but I'm happy to try.
- Constellation (Adventurer): I was working on Windows, but I'd imagine this code would work on Linux as well (once the SDK is available). Once I get a Rift and have the Windows version working, I'd like to get it working on Linux, but that's probably a long way out at this point. I haven't tried OSG on a Mac, but looking at the OSG web site, it seems like people are doing it. Let me know how it goes!
- Vrally (Protege): Hi JeffBail et al,
I have made an example called OsgOculusViewer that I successfully tested with my Rift unit. I had to modify the shader in the SDK docs a bit, and I also made a class responsible for talking to the Oculus device. This class also works if no Oculus device is connected; it then just returns the same values as if the development kit were connected. I hope I haven't made any obvious mistakes, but as always: no guarantees... ;)
Best regards
PixelMiner
Edit: Code now available at GitHub
- Vrally (Protege): My code is available at the following GitHub repo:
https://github.com/bjornblissing/osgoculusviewer
The latest edition is tested against the SDK 0.2.3 version, which was released just hours ago.
Support has been added for basic orientation tracking and for chromatic aberration correction. I have only tested the code on Windows 7 with Visual Studio 2010, but the latest SDK release has Linux support and I will try to test it when I get some spare time.
- Anonymous: Hi Björn,
Great job, thanks a lot!
I just compiled it on Linux (Kubuntu 13.04, 64-bit) and, with a few changes, it worked. I had to extend src/CMakeLists.txt with a few dependencies, after line 40:
IF(UNIX)
TARGET_LINK_LIBRARIES(${TARGET_TARGETNAME} OpenThreads pthread udev X11 Xinerama)
ENDIF(UNIX)
max
- Vrally (Protege): Hi Max,
I added your changes to the GitHub repo, although I was able to compile on Linux without including OpenThreads, so I removed that from your proposed change.
Best regards
Björn
- roalt (Honored Guest): Hi all,
I also have it running under Linux with Oculus SDK 0.2.3.
- Vrally (Protege): Hi,
I have cleaned up the code a bit today, mostly stylistic changes, but I also added an option to toggle chromatic aberration correction on and off. The latest progress is pushed to GitHub.
Best regards
Björn - VrallyProtegeLast week I fixed a couple of bugs connected to calculating the aspect ratio. I also made a simplified version of the distortion scale calculation. (If you do not want to use the default value you can supply a custom scale value instead, for example if you want to reduce the render target size.) I have also enabled sensor prediction with adjustable prediction delta.
Today I merged the changes to the composite viewer based branch as well.
The code is also successfully tested against the latest version of the Oculus SDK (0.2.4).
Best regards
Björn
- Afuerg (Honored Guest): Great project! But I'm asking myself why you implemented all the calculations yourself, when the Oculus SDK offers the StereoConfig class, which is supposed to handle this kind of calculation.
Is there a specific reason not to use it?