Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
trippedout
Honored Guest
12 years ago

Cinder Integration w/ OpenGL

So all the samples that ship for Windows with the 0.4.1 SDK use D3D for their textures and rendering - but the official docs include both D3D sample code (the same as in the sample projects) and OpenGL sample code, e.g.:

// Configure OpenGL.
ovrGLConfig cfg;
cfg.OGL.Header.API = ovrRenderAPI_OpenGL;
cfg.OGL.Header.RTSize = Sizei(hmd->Resolution.w, hmd->Resolution.h);

etc etc
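For anyone following along, a fuller version of that configuration step might look like the sketch below. This is based on the 0.4.x-era headers and is not verified against a specific release; `window`, `dc`, and the choice of `distortionCaps` are placeholders, not values from the original post.

```cpp
// Sketch: OpenGL rendering configuration for the 0.4.x SDK.
// window/dc are placeholder platform handles (HWND/HDC on Windows).
ovrGLConfig cfg;
cfg.OGL.Header.API = ovrRenderAPI_OpenGL;
cfg.OGL.Header.RTSize = Sizei(hmd->Resolution.w, hmd->Resolution.h);
cfg.OGL.Header.Multisample = 1;
cfg.OGL.Window = window;
cfg.OGL.DC = dc;

unsigned distortionCaps = ovrDistortionCap_Chromatic
                        | ovrDistortionCap_TimeWarp
                        | ovrDistortionCap_Vignette;

// eyeRenderDesc is filled in by the SDK and needed later for eye poses.
ovrEyeRenderDesc eyeRenderDesc[2];
ovrHmd_ConfigureRendering(hmd, &cfg.Config, distortionCaps,
                          hmd->DefaultEyeFov, eyeRenderDesc);
```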

So this seemed pretty straightforward to follow along with, especially since Cinder does everything behind the scenes in OpenGL, and I was able to set up my config properly. But when I finally finish and pass my objects to ovrHmd_EndFrame, I start getting a lot of errors:

ovrGLTexture EyeTextures[2];
// setup etc.
ovrHmd_EndFrame(mHmd, headPose, &EyeTextures[0].Texture);
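For reference, a sketch of how those textures are typically described to the SDK before the call, as far as I can tell from the 0.4.x headers. `renderTargetSize`, `eyeViewports`, and `fboTextureId` are placeholder names from a hypothetical setup, not SDK identifiers:

```cpp
// Sketch: describe the shared FBO colour texture to the SDK for each eye.
// renderTargetSize, eyeViewports and fboTextureId are placeholders here.
ovrGLTexture eyeTextures[2];
for (int eye = 0; eye < 2; ++eye) {
    eyeTextures[eye].OGL.Header.API            = ovrRenderAPI_OpenGL;
    eyeTextures[eye].OGL.Header.TextureSize    = renderTargetSize;
    eyeTextures[eye].OGL.Header.RenderViewport = eyeViewports[eye];
    eyeTextures[eye].OGL.TexId                 = fboTextureId;
}

// Pass the generic view of the first element; the array is contiguous,
// so the SDK can index both eyes through the ovrTexture pointer.
ovrHmd_EndFrame(hmd, eyePoses, &eyeTextures[0].Texture);
```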

Is this to be expected? Are there any full samples using OpenGL anywhere? Most of my errors come from what seems like the first thing it tries in CAPI_GL_HSWDisplay.cpp -> HSWDisplay::RenderInternal(ovrEyeType eye, const ovrTexture* eyeTexture):

ovrGLTexture* eyeTextureGL = const_cast<ovrGLTexture*>(reinterpret_cast<const ovrGLTexture*>(eyeTexture));

But when I look at ovrGLTexture and ovrTexture, it looks like this cast is bound to fail... or am I missing something?

I'll give D3D another shot tomorrow, but since I know absolutely nothing about it, it'll probably be another long day. Hope you can help!

Thanks
Anthony

6 Replies

  • The casting of ovrGLTexture into ovrTexture works because ovrGLTexture is a union, not a structure, of ovrTexture and ovrGLTextureData. The ovrTextureHeader struct in both of these is the first member and so occupies the same space, so the parts of the SDK that are abstracted away from the render API can still use the header correctly.

    I don't have access to my code at work but my setup code is pretty much in my first post here:

    viewtopic.php?f=17&t=13256

    If that doesn't help, post some more of your setup code and I'll see if I can spot anything wrong!
  • "Rajveer" wrote:
    The casting of ovrGLTexture into ovrTexture works because ovrGLTexture is a union, not a structure, of ovrTexture and ovrGLTextureData. The ovrTextureHeader struct in both of these is the first member and so occupies the same space, so the parts of the SDK that are abstracted away from the render API can still use the header correctly.

    I don't have access to my code at work but my setup code is pretty much in my first post here:

    viewtopic.php?f=17&t=13256

    If that doesn't help, post some more of your setup code and I'll see if I can spot anything wrong!


    Thanks dude - I realized that the errors I was getting were not because of the HSW (health and safety warning) but because my initialization was off. Now I can render to the device using OpenGL and some basic code that I found here: http://pastebin.com/mbGRQuXY - stripping out the SDL and just using the OpenGL calls they have. My issue now is that I had to dig deep into the Cinder source and make sure all its frame-timing code was thrown out and draw wasn't called, since the while loop handles drawing on its own (before disabling the double calls it was a flickering mess - astonishingly nauseating).

    Either way, the only reason I was trying to do this was that my video was tearing a lot when playing back in VRPlayer, so I figured if I handled this myself and sent it to the Rift directly I should be able to overcome that. But now that I have a triangle rendering it's the same problem - vsync or something can't keep up - whereas the D3D11 textures in the two demos don't tear at all, so I'm thinking I need to go figure out how to create a texture that way?

    I'd LOVE to know if anyone at Oculus has a working OpenGL demo that doesn't have tearing/vsync issues, so I could check out the code and see where I'm wrong. Compared to the D3D samples, the fidelity of my basic OpenGL version is pretty poor.

    any thoughts?
  • svenito
    Honored Guest
    I hope you don't mind me posting in this thread with a problem I am having with Cinder and the DK2.

    I've managed to get the stereo rendering to a framebuffer working fine, and I can render to the Oculus in extended mode, even if the view is inverted (another problem for another time, but a quick fix for this would be welcome).

    What I want to get working is direct rendering. I've followed some OpenGL code that managed to get this working, but I'm having no luck. As I said, without the call to ovrHmd_ConfigureRendering, and omitting ovrHmd_BeginFrame and ovrHmd_EndFrame, the FBO renders out fine in the window.

    Once I call the mentioned functions, I get an Oculus-distorted view with the HSW and only the fill colour - no geometry or anything else from the render function. The Oculus itself is also black, but position and motion tracking work (I can see the HSW move slightly).

    I find the docs quite good, but they kind of skim over the OpenGL code and there are no examples provided, like there are for D3D.

    My code is here, using DK2 SDK 0.4.2 and the current Cinder on Windows:

    http://pastebin.com/kwtDbhSa

    Any pointers appreciated. Thanks
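    For anyone landing here from search, the overall per-frame ordering that SDK distortion rendering expects in the 0.4.x API looks roughly like the sketch below. It's a sketch under assumptions, not a verified implementation: `eyeRenderDesc`, `fbo`, and `eyeTextures` are names from a hypothetical earlier setup, not SDK identifiers.

    ```cpp
    // Per-frame sketch for 0.4.x-era SDK distortion rendering.
    // fbo, eyeRenderDesc and eyeTextures come from earlier setup code.
    ovrHmd_BeginFrame(hmd, 0);

    ovrVector3f hmdToEyeOffset[2] = {
        eyeRenderDesc[0].HmdToEyeViewOffset,
        eyeRenderDesc[1].HmdToEyeViewOffset
    };
    ovrPosef eyePoses[2];
    ovrHmd_GetEyePoses(hmd, 0, hmdToEyeOffset, eyePoses, NULL);

    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    for (int i = 0; i < 2; ++i) {
        ovrEyeType eye = hmd->EyeRenderOrder[i];
        ovrRecti vp = eyeTextures[eye].OGL.Header.RenderViewport;
        glViewport(vp.Pos.x, vp.Pos.y, vp.Size.w, vp.Size.h);
        // ... draw the scene for this eye using eyePoses[eye] ...
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // EndFrame performs the distortion pass and presents the frame itself;
    // don't also swap buffers or let the app framework's own draw loop
    // present, or you'll see flicker and tearing from the double present.
    ovrHmd_EndFrame(hmd, eyePoses, &eyeTextures[0].Texture);
    ```

    The last point is likely relevant to the symptoms above: if Cinder's run loop is still presenting frames on its own, that competes with ovrHmd_EndFrame's swap.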