Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
whitehexagon
Explorer
11 years ago

JOVR with JOGL and SDK rendering

I'm trying to get something really basic running on OSX with JOVR, OpenGL (via JOGL), and SDK 0.4.1 rendering, but I'm getting this fatal error at step 5.5 in the code below.

C  [libGL.dylib+0x18b5]  glGetString+0xf
C [jna8916088414761583343.tmp+0x14282] OVR::CAPI::GL::DistortionRenderer::GraphicsState::GraphicsState()+0x52
C [jna8916088414761583343.tmp+0x12705] OVR::CAPI::GL::DistortionRenderer::GraphicsState::GraphicsState()+0x15
C [jna8916088414761583343.tmp+0x12367] OVR::CAPI::GL::DistortionRenderer::Initialize(ovrRenderAPIConfig_ const*, unsigned int)+0x57
C [jna8916088414761583343.tmp+0x86a2] OVR::CAPI::HMDState::ConfigureRendering(ovrEyeRenderDesc_*, ovrFovPort_ const*, ovrRenderAPIConfig_ const*, unsigned int)+0x7c2
C [jna8916088414761583343.tmp+0xeea9] ovrHmd_ConfigureRendering+0x59


If I comment out step 5, then I get a different fatal error:

C  [jna3689317543873940567.tmp+0x3ba4]  _ZN3OVR4CAPI18DistortionRenderer19SetLatencyTestColorEPh+0x54
C [jna3689317543873940567.tmp+0xf301] ovrHmd_EndFrame+0xe1


Could the second one be related to using two different texture objects? I know that for the renderPose I had to do a manual copy into a contiguous array to avoid a JNA error; I'm wondering if the eyeTexture parameter expects something similar, although I have no idea how I would generate that.
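For anyone unfamiliar with the JNA issue I mean: JNA expects an array of structs passed to native code to live in one contiguous native allocation (which Structure.toArray provides), not as separately allocated Java objects. A stdlib-only sketch of the same idea, with an illustrative 7-float pose layout that is NOT the real ovrPosef struct:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class ContiguousPoses {
    // A pose here is 7 floats: quaternion (x,y,z,w) + position (x,y,z).
    // Illustrative layout only -- not the actual ovrPosef structure.
    static final int FLOATS_PER_POSE = 7;

    // Copy per-eye pose data back-to-back into one contiguous buffer,
    // the way JNA's Structure.toArray lays struct elements out natively.
    static FloatBuffer packPoses(float[][] poses) {
        FloatBuffer buf = ByteBuffer
                .allocateDirect(poses.length * FLOATS_PER_POSE * Float.BYTES)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        for (float[] pose : poses) {
            buf.put(pose); // element i starts at offset i * FLOATS_PER_POSE
        }
        buf.rewind();
        return buf;
    }

    public static void main(String[] args) {
        float[] left  = {0, 0, 0, 1, -0.03f, 0, 0};
        float[] right = {0, 0, 0, 1,  0.03f, 0, 0};
        FloatBuffer packed = packPoses(new float[][]{left, right});
        // The second pose starts exactly one stride after the first:
        System.out.println(packed.get(FLOATS_PER_POSE + 4)); // right eye position.x
    }
}
```

The point is only the memory layout: each element sits at a fixed stride from the previous one, which is what the native side assumes when it walks the array.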

I would be grateful if someone could test this on Windows to make sure it's not something OSX-specific. Unless someone has an idea where I'm going wrong? I haven't put any of the tracking/FOV/viewport OpenGL stuff in yet; I want to get this base code working first.

The code depends on jogl-all.jar, gluegen-rt.jar, jna-4.1.0.jar, and jovr-0.4.1.2.jar, plus the relevant JOGL natives that load at runtime: gluegen-rt-natives-macosx-universal.jar and jogl-all-natives-macosx-universal.jar.

import static com.oculusvr.capi.OvrLibrary.ovrEyeType.ovrEye_Count;
import static com.oculusvr.capi.OvrLibrary.ovrTrackingCaps.ovrTrackingCap_Orientation;

import javax.media.opengl.GL;
import javax.media.opengl.GL2;
import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLCapabilities;
import javax.media.opengl.GLDrawableFactory;
import javax.media.opengl.GLEventListener;
import javax.media.opengl.GLFBODrawable;
import javax.media.opengl.GLOffscreenAutoDrawable;
import javax.media.opengl.GLProfile;
import javax.media.opengl.fixedfunc.GLLightingFunc;
import javax.media.opengl.fixedfunc.GLMatrixFunc;
import javax.media.opengl.glu.GLU;

import com.jogamp.newt.Display;
import com.jogamp.newt.NewtFactory;
import com.jogamp.newt.Screen;
import com.jogamp.newt.Window;
import com.jogamp.newt.event.KeyEvent;
import com.jogamp.newt.event.KeyListener;
import com.jogamp.newt.opengl.GLWindow;
import com.jogamp.opengl.FBObject.TextureAttachment;
import com.jogamp.opengl.util.Animator;
import com.oculusvr.capi.EyeRenderDesc;
import com.oculusvr.capi.FovPort;
import com.oculusvr.capi.Hmd;
import com.oculusvr.capi.OvrLibrary;
import com.oculusvr.capi.OvrRecti;
import com.oculusvr.capi.OvrSizei;
import com.oculusvr.capi.OvrVector2i;
import com.oculusvr.capi.Posef;
import com.oculusvr.capi.RenderAPIConfig;
import com.oculusvr.capi.RenderAPIConfigHeader;
import com.oculusvr.capi.Texture;
import com.oculusvr.capi.TextureHeader;
import com.sun.jna.Pointer;

public class TinyBox implements KeyListener {
    //JOGL
    //private WHAnimator animator;
    private Animator animator;
    private GLWindow glWindow;

    //Rift Specific
    private Hmd hmd;
    Posef eyeRenderPose[];
    Texture eyeTexture[];
    private int frameCount;
    private OvrRecti[] eyeRenderViewport;


    private final class OffScreenEventListener implements GLEventListener {
        private final int eye;

        public OffScreenEventListener(final int eye) {
            this.eye = eye;
        }

        @Override
        public void init(GLAutoDrawable drawable) {
            final GL2 gl = drawable.getGL().getGL2();
            gl.glClearColor(0.5f, 0.0f, 0.0f, 0.0f);

            final float lightPos[] = { 5.0f, 5.0f, 10.0f, 0.0f };
            gl.glLightfv(GLLightingFunc.GL_LIGHT0, GLLightingFunc.GL_POSITION, lightPos, 0);
            gl.glEnable(GL.GL_CULL_FACE);
            gl.glEnable(GLLightingFunc.GL_LIGHTING);
            gl.glEnable(GLLightingFunc.GL_LIGHT0);
            gl.glEnable(GL.GL_DEPTH_TEST);
            gl.glEnable(GLLightingFunc.GL_NORMALIZE);

            final GLFBODrawable fboDrawable = (GLFBODrawable) drawable.getDelegatedDrawable();
            final TextureAttachment texAttach = fboDrawable.getColorbuffer(GL.GL_FRONT).getTextureAttachment();
            int eyeTextureId = texAttach.getName();
            System.out.println("eyeTextureId=" + eyeTextureId);
            eyeTexture[eye].TextureId = eyeTextureId;
        }

        @Override
        public void dispose(GLAutoDrawable drawable) {
        }

        @Override
        public void display(GLAutoDrawable drawable) {
            GL2 gl2 = drawable.getGL().getGL2();
            hmd.beginFrameTiming(++frameCount);

            // Clear screen
            gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
            gl2.glDrawBuffer(GL2.GL_BACK);
            gl2.glViewport(0, 0, glWindow.getWidth(), glWindow.getHeight());
            gl2.glDisable(GL2.GL_DEPTH_TEST);
            gl2.glClear(GL2.GL_COLOR_BUFFER_BIT);

            gl2.glEnable(GL2.GL_DEPTH_TEST);
            gl2.glEnable(GL2.GL_CULL_FACE);
            gl2.glFrontFace(GL2.GL_CW);

            gl2.glLineWidth(3.0f);
            gl2.glEnable(GL2.GL_LINE_SMOOTH);
            gl2.glEnable(GL2.GL_BLEND);
            gl2.glBlendFunc(GL2.GL_SRC_ALPHA, GL2.GL_ONE_MINUS_SRC_ALPHA);

            eyeRenderPose = (Posef[]) new Posef().toArray(2);
            for (int eyeIndex = 0; eyeIndex < ovrEye_Count; eyeIndex++) {
                int eye = hmd.EyeRenderOrder[eyeIndex];
                eyeRenderPose[eye].Position = hmd.getEyePose(eye).Position;
                eyeRenderPose[eye].Orientation = hmd.getEyePose(eye).Orientation;

                gl2.glViewport(eyeRenderViewport[eye].Pos.x, eyeRenderViewport[eye].Pos.y, eyeRenderViewport[eye].Size.w, eyeRenderViewport[eye].Size.h);

                renderScene(gl2);
            }
            gl2.glDisable(GL2.GL_DEPTH_TEST);
            gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
            gl2.glDrawBuffer(GL2.GL_BACK);
            gl2.glViewport(0, 0, glWindow.getWidth(), glWindow.getHeight());
            gl2.glDisable(GL2.GL_DEPTH_TEST);
            gl2.glClear(GL2.GL_COLOR_BUFFER_BIT);

            hmd.endFrame(eyeRenderPose, eyeTexture);
        }

        @Override
        public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) {
            GL2 gl2 = drawable.getGL().getGL2();
            gl2.glMatrixMode(GLMatrixFunc.GL_PROJECTION);
            gl2.glLoadIdentity();
            GLU glu = new GLU();
            glu.gluPerspective(45.0f, ((float) glWindow.getWidth() / (float) glWindow.getHeight()), 0.1f, 10000.0f);

            // gl2.glFrustum(-4f, 4f, -2f, 2f, 5.0f, 100.0f);

            gl2.glMatrixMode(GLMatrixFunc.GL_MODELVIEW);
            gl2.glLoadIdentity();
            gl2.glTranslatef(0.0f, 0.0f, -40.0f);

            gl2.glShadeModel(GL2.GL_SMOOTH);
            gl2.glEnable(GL2.GL_DEPTH_TEST);
            gl2.glDepthFunc(GL2.GL_LEQUAL);
            gl2.glHint(GL2.GL_PERSPECTIVE_CORRECTION_HINT, GL2.GL_NICEST);
        }
    } //end inner class

    public void run() {
        frameCount = -1;

        //step 1
        Hmd.initialize();
        try {
            Thread.sleep(400);
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }

        //step 2
        hmd = Hmd.create(0); //assume 1 device at index 0
        if (hmd == null) {
            System.out.println("null hmd");
            hmd = Hmd.createDebug(OvrLibrary.ovrHmdType.ovrHmd_DK2);
        }

        //step 3
        OvrSizei resolution = hmd.Resolution;
        System.out.println("resolution= " + resolution.w + "x" + resolution.h);

        OvrSizei recommendedTex0Size = hmd.getFovTextureSize(OvrLibrary.ovrEyeType.ovrEye_Left, hmd.DefaultEyeFov[0], 1.0f);
        OvrSizei recommendedTex1Size = hmd.getFovTextureSize(OvrLibrary.ovrEyeType.ovrEye_Right, hmd.DefaultEyeFov[1], 1.0f);
        System.out.println("left= " + recommendedTex0Size.w + "x" + recommendedTex0Size.h);
        System.out.println("right= " + recommendedTex1Size.w + "x" + recommendedTex1Size.h);
        int displayW = recommendedTex0Size.w + recommendedTex1Size.w;
        int displayH = Math.max(recommendedTex0Size.h, recommendedTex1Size.h);
        OvrSizei renderTargetEyeSize = new OvrSizei(displayW / 2, displayH); //size of single eye

        eyeRenderViewport = new OvrRecti[]{ new OvrRecti(), new OvrRecti() };
        eyeRenderViewport[0].Pos = new OvrVector2i(0, 0);
        eyeRenderViewport[0].Size = renderTargetEyeSize;
        eyeRenderViewport[1].Pos = eyeRenderViewport[0].Pos;
        eyeRenderViewport[1].Size = renderTargetEyeSize;

        //step 4
        if (hmd.configureTracking(ovrTrackingCap_Orientation, 0) == 0) {
            throw new IllegalStateException("Unable to start the sensor");
        }

        //step 5 - configure rendering
        System.out.println("step 5");
        RenderAPIConfigHeader configHeader = new RenderAPIConfigHeader(resolution, 1);
        RenderAPIConfig apiConfig = new RenderAPIConfig(configHeader, new int[15]);
        int distortionCaps = OvrLibrary.ovrHmdCaps.ovrHmdCap_LowPersistence | OvrLibrary.ovrHmdCaps.ovrHmdCap_DynamicPrediction;
        //FovPort[] eyeFov = (FovPort[]) new FovPort().toArray(2);
        FovPort[] eyeFov = new FovPort[]{ hmd.DefaultEyeFov[0], hmd.DefaultEyeFov[1] };
        EyeRenderDesc eyeRenderDescs[] = (EyeRenderDesc[]) new EyeRenderDesc().toArray(2);
        System.out.println("step 5.5");
        if (0 == OvrLibrary.INSTANCE.ovrHmd_ConfigureRendering(hmd, apiConfig, distortionCaps, eyeFov, eyeRenderDescs)) {
            throw new IllegalStateException("step 5 - cant configure");
        }
        System.out.println("step 5.6");
        hmd.setEnabledCaps(distortionCaps); //need this or does the above do this, need above?

        //step 6 - opengl window
        System.out.println("step 6");
        //Display.dumpDisplayList(""); //only gives: DisplayList[] entries: 0 - main
        final Display display = NewtFactory.createDisplay(null);
        final Screen screen = NewtFactory.createScreen(display, 0);
        GLProfile glProfile = GLProfile.get(GLProfile.GL2);
        System.out.println("got: " + glProfile.getImplName());
        final Window window = NewtFactory.createWindow(screen, new GLCapabilities(glProfile));
        window.setSize(displayW, displayH);
        glWindow = GLWindow.create(window);
        //glWindow.setAutoSwapBufferMode(false);
        //glWindow.setUndecorated(true);
        //glWindow.setFullscreen(true);
        glWindow.addKeyListener(this);
        glWindow.setVisible(true);

        int extended = hmd.getEnabledCaps() & OvrLibrary.ovrHmdCaps.ovrHmdCap_ExtendDesktop;
        System.out.println("extended=" + extended);
        Pointer windowHandle = new Pointer(window.getNativeSurface().getSurfaceHandle());
        if (0 == OvrLibrary.INSTANCE.ovrHmd_AttachToWindow(hmd, windowHandle, eyeRenderViewport[0], eyeRenderViewport[0])) {
            throw new IllegalStateException("step 6 - cant attach");
        }

        //step 7 - textures
        System.out.println("step 7 - textures");
        GLDrawableFactory glDrawableFactory = GLDrawableFactory.getFactory(glProfile);
        if (glDrawableFactory.canCreateGLPbuffer(null, glProfile) == false) {
            throw new IllegalStateException("cant create pbuffer");
        }
        GLCapabilities caps = new GLCapabilities(glProfile);
        caps.setFBO(true);
        caps.setHardwareAccelerated(true);
        caps.setDoubleBuffered(false);
        caps.setOnscreen(false);
        caps.setAlphaBits(8);
        caps.setRedBits(8);
        caps.setBlueBits(8);
        caps.setGreenBits(8);

        GLOffscreenAutoDrawable leftEye = glDrawableFactory.createOffscreenAutoDrawable(null, caps, null, renderTargetEyeSize.w, renderTargetEyeSize.h);
        GLOffscreenAutoDrawable rightEye = glDrawableFactory.createOffscreenAutoDrawable(null, caps, null, renderTargetEyeSize.w, renderTargetEyeSize.h);
        System.out.println("left=" + leftEye);
        leftEye.addGLEventListener(new OffScreenEventListener(0));
        rightEye.addGLEventListener(new OffScreenEventListener(1));

        eyeTexture = (Texture[]) new Texture().toArray(2);
        eyeTexture[0].Header = new TextureHeader(renderTargetEyeSize, eyeRenderViewport[0]);
        eyeTexture[1].Header = new TextureHeader(renderTargetEyeSize, eyeRenderViewport[1]);

        //step 8 - loop
        System.out.println("step 8 - loop");
        //animator = new WHAnimator(75);
        animator = new Animator();
        animator.add(leftEye);
        animator.add(rightEye);
        animator.start();
    }

    public void renderScene(GL2 gl2) {
        gl2.glLoadIdentity();
        gl2.glTranslatef(-1.5f, 0.0f, -6.0f);
        gl2.glBegin(GL2.GL_TRIANGLES);
        gl2.glColor3f(1.0f, 0.0f, 0.0f);
        gl2.glVertex3f( 0.0f,  1.0f, 0.0f);
        gl2.glColor3f(0.0f, 1.0f, 0.0f);
        gl2.glVertex3f(-1.0f, -1.0f, 0.0f);
        gl2.glColor3f(0.0f, 0.0f, 1.0f);
        gl2.glVertex3f( 1.0f, -1.0f, 0.0f);
        gl2.glEnd();
        gl2.glTranslatef(3.0f, 0.0f, 0.0f);
        gl2.glBegin(GL2.GL_QUADS);
        gl2.glColor3f(0.5f, 0.5f, 1.0f);
        gl2.glVertex3f(-1.0f,  1.0f, 0.0f);
        gl2.glVertex3f( 1.0f,  1.0f, 0.0f);
        gl2.glVertex3f( 1.0f, -1.0f, 0.0f);
        gl2.glVertex3f(-1.0f, -1.0f, 0.0f);
        gl2.glEnd();
    }

    @Override
    public void keyPressed(KeyEvent e) {
    }

    @Override
    public void keyReleased(KeyEvent e) {
        if (e.getKeyCode() == KeyEvent.VK_ESCAPE) {
            hmd.destroy();
            Hmd.shutdown();

            animator.stop();
            glWindow.destroy();
        }
        if (e.getKeyCode() == KeyEvent.VK_F5) {
            new Thread() {
                public void run() {
                    glWindow.setFullscreen(!glWindow.isFullscreen());
                }
            }.start();
        }
    }

    public static void main(String[] args) {
        new TinyBox().run();
    }
}

13 Replies

  • "whitehexagon" wrote:
    But I don't get any tracking now, not even a slight wobble. The light is on, and it works with RiftDemo, but I can't see the difference in the code. The matrix stuff is above my head, but I'm wondering if it is because of the reshape() code.


    You're applying the head tracking to the MatrixStack in your display function, but you're never loading that matrix into OpenGL anywhere. The MatrixStack class is just a stack of matrices. Until you actually pass one of the matrices to OpenGL, it won't affect anything.

    You can load the contents into OpenGL with the following code:


    FloatBuffer mvBuffer = BufferUtils.createFloatBuffer(16);
    MatrixStack.MODELVIEW.top().fillFloatBuffer(mvBuffer, true);
    mvBuffer.rewind();
    glLoadMatrix(mvBuffer);


    The last function there is how I do it in LWJGL, but there must be an equivalent JOGL function. You can do this in your display() method or in your renderScene() method. Either way, you need to get rid of the gl2.glLoadIdentity() call in renderScene(), as it will overwrite the matrix you've just loaded.
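    In JOGL the matching call appears to be gl2.glLoadMatrixf(FloatBuffer) on GL2 (that wiring is an assumption here, since it needs a live GL context to verify). The detail worth checking either way is that OpenGL expects the 16 floats in column-major order, with the translation in elements 12 to 14. A pure-Java sketch of building such a buffer, with the GL call shown only as a comment:

    import java.nio.FloatBuffer;

    public class ColumnMajor {
        // Build a 4x4 translation matrix in the column-major layout
        // that glLoadMatrixf-style calls expect.
        static FloatBuffer translation(float tx, float ty, float tz) {
            float[] m = {
                1,  0,  0,  0,  // column 0
                0,  1,  0,  0,  // column 1
                0,  0,  1,  0,  // column 2
                tx, ty, tz, 1   // column 3: translation lives in elements 12..14
            };
            // In display() you would then do (hypothetical JOGL wiring):
            //   gl2.glMatrixMode(GLMatrixFunc.GL_MODELVIEW);
            //   gl2.glLoadMatrixf(buf);
            // and drop the glLoadIdentity() in renderScene().
            return FloatBuffer.wrap(m);
        }

        public static void main(String[] args) {
            FloatBuffer mv = translation(0, 0, -40);
            System.out.println(mv.get(14)); // prints -40.0
        }
    }

    If the head pose seems to have no effect even with this in place, printing elements 12 to 14 of the buffer is a quick way to confirm the matrix being loaded actually changes frame to frame.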
  • Wow, my first Rift experience at 75fps! :D Thanks, all, for helping me reach this point! OK, so it's only a cube, but I'm very happy to discover this old laptop can at least manage that. I was getting worried about the cost of the supercomputer self-build I've been planning; at least I can delay that a while until the next gen is out and hopefully pick up some cheaper older parts.

    What's interesting is that even at 75fps I'm still seeing micro-stutter when moving my head around, but I'm going to assume this is an SDK issue for now, since I've seen a few others mention it.

    Oh, the trick for the 75fps was a bit strange. A test cycle involves connecting the Rift and, in the OSX display-arrangement settings dialog, dragging the title bar over to the Rift, making it the main screen. That leaves the dev environment on my laptop screen, but the demo starts up on the Rift, which is perfect. To disconnect the Rift I just have to remember to drag the title bar back to the laptop first, making it the primary display again; otherwise the windows end up offscreen and hard to access.