Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Anonymous
11 years ago

Example of SDK Distortion for Generic Plugin

A lot of people are probably facing the same problem: how to integrate rift distortion into an existing game engine, for example via plug-in.

Getting the HMD position and orientation is easy, but doing the actual SDK distortion is pretty challenging.
It sounds simple from the steps described in the SDK documentation, but in practice it is not...
(Tuscany has pretty deep dependencies on Common Platform and Render, which are quite tough to dissect; Tiny, on the other hand, is not cross-platform.)

Ideally this would be an SDK feature: accept a texture handle and work with it directly, letting developers avoid the convoluted setup completely.

I wonder if anybody would be so kind as to share a minimal example in which you pass an existing texture, rendered by the game engine, to ovrHmd_EndFrame as the EyeTextures?

6 Replies

  • Anonymous
    Thanks!

    To elaborate a little more: I have been digging away at creating a plugin for the lesser-known ShiVa engine, which gives me a side-by-side texture. I have made progress to the point where I'm trying to call ovrHmd_EndFrame, but for the moment I am befuddled by how to pass the SBS texture handle to the EyeTextures.
  • Have you gone through the docs yet (the Developer Guide should be the one)?

    If that doesn't explain how to do this, it really should.
  • Anonymous
    It's pretty clear, but all the examples create everything from scratch. I only have the handle of the texture the engine passes to me, and I am unsure how to bind it to the ovrTexture structures.
  • This is a quite common problem when retrofitting an engine with the Oculus Rift SDK. Eskil from Quel Solaar wrote a rather upset forum post about his integration issues:
    viewtopic.php?f=20&t=19937

    And Eskil is certainly not some newbie programmer. He is a veteran OpenGL programmer who has contributed to the development of both the OpenGL standard and the OpenGL Shading Language. If he has problems, you can bet that other programmers will face similar challenges, or worse.

    I know this first-hand from my own integration with OpenSceneGraph, where I have spent countless hours trying to get SDK distortion rendering to work. I still have not succeeded (although I did get client distortion rendering to work).

    The problem is that the Oculus SDK makes some rather broad assumptions about initialization order, which causes SDK rendering to fail miserably in my case. My attempts to get SDK rendering to work have ended in three equally unsuccessful results:
    • I hit the ASSERT inside CAPI_GL_Util.cpp in the CreateShared() function; the call to wglShareLists() fails every time.

    • The application crashes deep inside the nvidia driver.

    • The application only displays a black screen.


    And all because OpenSceneGraph lazily initializes its contexts and textures, which plays very badly with the way the Oculus SDK initializes things.