Forum Discussion
whitehexagon
12 years ago · Explorer
JOVR with JOGL and SDK rendering
I'm trying to get something really basic running with OSX, JOVR, OpenGL, JOGL, and SDK 0.4.1 rendering, but I'm getting this fatal error at step 5.5 in the code below:
C [libGL.dylib+0x18b5] glGetString+0xf
C [jna8916088414761583343.tmp+0x14282] OVR::CAPI::GL::DistortionRenderer::GraphicsState::GraphicsState()+0x52
C [jna8916088414761583343.tmp+0x12705] OVR::CAPI::GL::DistortionRenderer::GraphicsState::GraphicsState()+0x15
C [jna8916088414761583343.tmp+0x12367] OVR::CAPI::GL::DistortionRenderer::Initialize(ovrRenderAPIConfig_ const*, unsigned int)+0x57
C [jna8916088414761583343.tmp+0x86a2] OVR::CAPI::HMDState::ConfigureRendering(ovrEyeRenderDesc_*, ovrFovPort_ const*, ovrRenderAPIConfig_ const*, unsigned int)+0x7c2
C [jna8916088414761583343.tmp+0xeea9] ovrHmd_ConfigureRendering+0x59
If I comment out step 5, then I get a different fatal error:
C [jna3689317543873940567.tmp+0x3ba4] _ZN3OVR4CAPI18DistortionRenderer19SetLatencyTestColorEPh+0x54
C [jna3689317543873940567.tmp+0xf301] ovrHmd_EndFrame+0xe1
Could the second one be related to using two different texture objects? I know that for the renderPose I had to do a manual copy into a contiguous array to avoid a JNA error, so I'm wondering if the eyeTexture parameter expects something similar, although I have no idea how I would generate that.
I would be grateful if someone could test this on Windows to make sure it's not something OSX-specific. Or does anyone have an idea where I'm going wrong? I haven't put any of the tracking/FOV/viewport OpenGL code in yet; I want this base part working first.
Code has dependencies on jogl-all.jar, gluegen-rt.jar, jna-4.1.0.jar, jovr-0.4.1.2.jar and the relevant natives for JOGL that load at runtime: gluegen-rt-natives-macosx-universal.jar, jogl-all-natives-macosx-universal.jar
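For reference, the render-target sizing that step 3 of the code performs (one shared target covering both eyes side by side) amounts to the following, sketched here with hypothetical sizes rather than real DK2 values:

```java
// Hedged sketch of the step-3 sizing math: the shared render target is as
// wide as both recommended per-eye textures combined and as tall as the
// taller of the two; each eye then renders into half the width.
public class EyeSizing {
    // returns {displayW, displayH} given two per-eye recommended sizes
    static int[] combined(int leftW, int leftH, int rightW, int rightH) {
        return new int[] { leftW + rightW, Math.max(leftH, rightH) };
    }

    public static void main(String[] args) {
        int[] d = combined(1182, 1464, 1182, 1464); // illustrative sizes only
        int eyeW = d[0] / 2, eyeH = d[1];           // size of a single eye target
        System.out.println(d[0] + "x" + d[1] + ", per eye " + eyeW + "x" + eyeH);
    }
}
```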
import static com.oculusvr.capi.OvrLibrary.ovrEyeType.ovrEye_Count;
import static com.oculusvr.capi.OvrLibrary.ovrTrackingCaps.ovrTrackingCap_Orientation;
import javax.media.opengl.GL;
import javax.media.opengl.GL2;
import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLCapabilities;
import javax.media.opengl.GLDrawableFactory;
import javax.media.opengl.GLEventListener;
import javax.media.opengl.GLFBODrawable;
import javax.media.opengl.GLOffscreenAutoDrawable;
import javax.media.opengl.GLProfile;
import javax.media.opengl.fixedfunc.GLLightingFunc;
import javax.media.opengl.fixedfunc.GLMatrixFunc;
import javax.media.opengl.glu.GLU;
import com.jogamp.newt.Display;
import com.jogamp.newt.NewtFactory;
import com.jogamp.newt.Screen;
import com.jogamp.newt.Window;
import com.jogamp.newt.event.KeyEvent;
import com.jogamp.newt.event.KeyListener;
import com.jogamp.newt.opengl.GLWindow;
import com.jogamp.opengl.FBObject.TextureAttachment;
import com.jogamp.opengl.util.Animator;
import com.oculusvr.capi.EyeRenderDesc;
import com.oculusvr.capi.FovPort;
import com.oculusvr.capi.Hmd;
import com.oculusvr.capi.OvrLibrary;
import com.oculusvr.capi.OvrRecti;
import com.oculusvr.capi.OvrSizei;
import com.oculusvr.capi.OvrVector2i;
import com.oculusvr.capi.Posef;
import com.oculusvr.capi.RenderAPIConfig;
import com.oculusvr.capi.RenderAPIConfigHeader;
import com.oculusvr.capi.Texture;
import com.oculusvr.capi.TextureHeader;
import com.sun.jna.Pointer;
public class TinyBox implements KeyListener {
//JOGL
//private WHAnimator animator;
private Animator animator;
private GLWindow glWindow;
//Rift Specific
private Hmd hmd;
Posef eyeRenderPose[];
Texture eyeTexture[];
private int frameCount;
private OvrRecti[] eyeRenderViewport;
private final class OffScreenEventListener implements GLEventListener {
private final int eye;
public OffScreenEventListener(final int eye) {
this.eye = eye;
}
@Override
public void init(GLAutoDrawable drawable) {
final GL2 gl = drawable.getGL().getGL2();
gl.glClearColor(0.5f, 0.0f, 0.0f, 0.0f);
final float lightPos[] = { 5.0f, 5.0f, 10.0f, 0.0f };
gl.glLightfv(GLLightingFunc.GL_LIGHT0, GLLightingFunc.GL_POSITION, lightPos, 0);
gl.glEnable(GL.GL_CULL_FACE);
gl.glEnable(GLLightingFunc.GL_LIGHTING);
gl.glEnable(GLLightingFunc.GL_LIGHT0);
gl.glEnable(GL.GL_DEPTH_TEST);
gl.glEnable(GLLightingFunc.GL_NORMALIZE);
final GLFBODrawable fboDrawable = (GLFBODrawable)drawable.getDelegatedDrawable();
final TextureAttachment texAttach = fboDrawable.getColorbuffer(GL.GL_FRONT).getTextureAttachment();
int eyeTextureId = texAttach.getName();
System.out.println("eyeTextureId="+eyeTextureId);
eyeTexture[eye].TextureId = eyeTextureId;
}
@Override
public void dispose(GLAutoDrawable drawable) {
// TODO Auto-generated method stub
}
@Override
public void display(GLAutoDrawable drawable) {
GL2 gl2 = drawable.getGL().getGL2();
hmd.beginFrameTiming(++frameCount);
// Clear screen
gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
gl2.glDrawBuffer(GL2.GL_BACK);
gl2.glViewport(0, 0, glWindow.getWidth(), glWindow.getHeight());
gl2.glDisable(GL2.GL_DEPTH_TEST);
gl2.glClear(GL2.GL_COLOR_BUFFER_BIT);
gl2.glEnable(GL2.GL_DEPTH_TEST);
gl2.glEnable(GL2.GL_CULL_FACE);
gl2.glFrontFace(GL2.GL_CW);
gl2.glLineWidth(3.0f);
gl2.glEnable(GL2.GL_LINE_SMOOTH);
gl2.glEnable(GL2.GL_BLEND);
gl2.glBlendFunc(GL2.GL_SRC_ALPHA, GL2.GL_ONE_MINUS_SRC_ALPHA);
eyeRenderPose = (Posef[])new Posef().toArray(2);
for (int eyeIndex = 0; eyeIndex < ovrEye_Count; eyeIndex++) {
int eye = hmd.EyeRenderOrder[eyeIndex];
eyeRenderPose[eye].Position = hmd.getEyePose(eye).Position;
eyeRenderPose[eye].Orientation = hmd.getEyePose(eye).Orientation;
gl2.glViewport(eyeRenderViewport[eye].Pos.x, eyeRenderViewport[eye].Pos.y, eyeRenderViewport[eye].Size.w, eyeRenderViewport[eye].Size.h);
renderScene(gl2);
}
gl2.glDisable(GL2.GL_DEPTH_TEST);
gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
gl2.glDrawBuffer(GL2.GL_BACK);
gl2.glViewport(0, 0, glWindow.getWidth(), glWindow.getHeight());
gl2.glDisable(GL2.GL_DEPTH_TEST);
gl2.glClear(GL2.GL_COLOR_BUFFER_BIT);
hmd.endFrame(eyeRenderPose, eyeTexture);
}
@Override
public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) {
GL2 gl2 = drawable.getGL().getGL2();
gl2.glMatrixMode(GLMatrixFunc.GL_PROJECTION);
gl2.glLoadIdentity();
GLU glu = new GLU();
glu.gluPerspective(45.0f, ((float) glWindow.getWidth() / (float) glWindow.getHeight()), 0.1f, 10000.0f);
// gl2.glFrustum(-4f, 4f, -2f, 2f, 5.0f, 100.0f);
gl2.glMatrixMode(GLMatrixFunc.GL_MODELVIEW);
gl2.glLoadIdentity();
gl2.glTranslatef(0.0f, 0.0f, -40.0f);
gl2.glShadeModel(GL2.GL_SMOOTH);
gl2.glEnable(GL2.GL_DEPTH_TEST);
gl2.glDepthFunc(GL2.GL_LEQUAL);
gl2.glHint(GL2.GL_PERSPECTIVE_CORRECTION_HINT, GL2.GL_NICEST);
}
} //end inner class
public void run() {
frameCount = -1;
//step 1
Hmd.initialize();
try {
Thread.sleep(400);
} catch (InterruptedException e) {
throw new IllegalStateException(e);
}
//step 2
hmd = Hmd.create(0); //assume 1 device at index 0
if (hmd == null) {
System.out.println("null hmd");
hmd = Hmd.createDebug(OvrLibrary.ovrHmdType.ovrHmd_DK2);
}
//step 3
OvrSizei resolution = hmd.Resolution;
System.out.println("resolution= "+resolution.w+"x"+resolution.h);
OvrSizei recommendedTex0Size = hmd.getFovTextureSize(OvrLibrary.ovrEyeType.ovrEye_Left, hmd.DefaultEyeFov[0], 1.0f);
OvrSizei recommendedTex1Size = hmd.getFovTextureSize(OvrLibrary.ovrEyeType.ovrEye_Right, hmd.DefaultEyeFov[1], 1.0f);
System.out.println("left= "+recommendedTex0Size.w+"x"+recommendedTex0Size.h);
System.out.println("right= "+recommendedTex1Size.w+"x"+recommendedTex1Size.h);
int displayW = recommendedTex0Size.w + recommendedTex1Size.w;
int displayH = Math.max(recommendedTex0Size.h, recommendedTex1Size.h);
OvrSizei renderTargetEyeSize = new OvrSizei(displayW / 2, displayH); //size of single eye
eyeRenderViewport = new OvrRecti[]{new OvrRecti(), new OvrRecti()};
eyeRenderViewport[0].Pos = new OvrVector2i(0, 0);
eyeRenderViewport[0].Size = renderTargetEyeSize;
eyeRenderViewport[1].Pos = eyeRenderViewport[0].Pos;
eyeRenderViewport[1].Size = renderTargetEyeSize;
//step 4
if (hmd.configureTracking(ovrTrackingCap_Orientation, 0) == 0) {
throw new IllegalStateException("Unable to start the sensor");
}
//step 5 - configure rendering
System.out.println("step 5");
RenderAPIConfigHeader configHeader = new RenderAPIConfigHeader(resolution, 1);
RenderAPIConfig apiConfig = new RenderAPIConfig(configHeader, new int[15]);
int distortionCaps = OvrLibrary.ovrHmdCaps.ovrHmdCap_LowPersistence | OvrLibrary.ovrHmdCaps.ovrHmdCap_DynamicPrediction;
//FovPort[] eyeFov = (FovPort[])new FovPort().toArray(2);
FovPort[] eyeFov = new FovPort[]{hmd.DefaultEyeFov[0], hmd.DefaultEyeFov[1]};
EyeRenderDesc eyeRenderDescs[] = (EyeRenderDesc[])new EyeRenderDesc().toArray(2);
System.out.println("step 5.5");
if (0 == OvrLibrary.INSTANCE.ovrHmd_ConfigureRendering(hmd, apiConfig, distortionCaps, eyeFov, eyeRenderDescs)) {
throw new IllegalStateException("step 5 - cant configure");
}
System.out.println("step 5.6");
hmd.setEnabledCaps(distortionCaps); //need this or does the above do this, need above?
//step 6 - opengl window
System.out.println("step 6");
//Display.dumpDisplayList(""); //only gives: DisplayList[] entries: 0 - main
final Display display = NewtFactory.createDisplay(null);
final Screen screen = NewtFactory.createScreen(display, 0);
GLProfile glProfile = GLProfile.get(GLProfile.GL2);
System.out.println("got: " + glProfile.getImplName());
final Window window = NewtFactory.createWindow(screen, new GLCapabilities(glProfile));
window.setSize(displayW, displayH);
glWindow = GLWindow.create(window);
//glWindow.setAutoSwapBufferMode(false);
//glWindow.setUndecorated(true);
//glWindow.setFullscreen(true);
glWindow.addKeyListener(this);
glWindow.setVisible(true);
int extended = hmd.getEnabledCaps() & OvrLibrary.ovrHmdCaps.ovrHmdCap_ExtendDesktop;
System.out.println("extended="+extended);
Pointer windowHandle = new Pointer(window.getNativeSurface().getSurfaceHandle());
if (0 == OvrLibrary.INSTANCE.ovrHmd_AttachToWindow(hmd, windowHandle, eyeRenderViewport[0], eyeRenderViewport[0])) {
throw new IllegalStateException("step 6 - cant attach");
}
//step 7 - textures
System.out.println("step 7 - textures");
GLDrawableFactory glDrawableFactory = GLDrawableFactory.getFactory(glProfile);
if (glDrawableFactory.canCreateGLPbuffer(null, glProfile) == false) {
throw new IllegalStateException("cant create pbuffer");
}
GLCapabilities caps = new GLCapabilities(glProfile);
caps.setFBO(true);
caps.setHardwareAccelerated(true);
caps.setDoubleBuffered(false);
caps.setOnscreen(false);
caps.setAlphaBits(8);
caps.setRedBits(8);
caps.setBlueBits(8);
caps.setGreenBits(8);
GLOffscreenAutoDrawable leftEye = glDrawableFactory.createOffscreenAutoDrawable(null, caps, null, renderTargetEyeSize.w, renderTargetEyeSize.h);
GLOffscreenAutoDrawable rightEye = glDrawableFactory.createOffscreenAutoDrawable(null, caps, null, renderTargetEyeSize.w, renderTargetEyeSize.h);
System.out.println("left="+leftEye);
leftEye.addGLEventListener(new OffScreenEventListener(0));
rightEye.addGLEventListener(new OffScreenEventListener(1));
eyeTexture = (Texture[])new Texture().toArray(2);
eyeTexture[0].Header = new TextureHeader(renderTargetEyeSize, eyeRenderViewport[0]);
eyeTexture[1].Header = new TextureHeader(renderTargetEyeSize, eyeRenderViewport[1]);
//step 8 - loop
System.out.println("step 8 - loop");
//animator = new WHAnimator(75);
animator = new Animator();
animator.add(leftEye);
animator.add(rightEye);
animator.start();
}
public void renderScene(GL2 gl2) {
gl2.glLoadIdentity();
gl2.glTranslatef(-1.5f, 0.0f, -6.0f);
gl2.glBegin(GL2.GL_TRIANGLES);
gl2.glColor3f(1.0f, 0.0f, 0.0f);
gl2.glVertex3f( 0.0f, 1.0f, 0.0f);
gl2.glColor3f(0.0f, 1.0f, 0.0f);
gl2.glVertex3f(-1.0f,-1.0f, 0.0f);
gl2.glColor3f(0.0f, 0.0f, 1.0f);
gl2.glVertex3f( 1.0f,-1.0f, 0.0f);
gl2.glEnd();
gl2.glTranslatef(3.0f, 0.0f, 0.0f);
gl2.glBegin(GL2.GL_QUADS);
gl2.glColor3f(0.5f, 0.5f, 1.0f);
gl2.glVertex3f(-1.0f, 1.0f, 0.0f);
gl2.glVertex3f( 1.0f, 1.0f, 0.0f);
gl2.glVertex3f( 1.0f,-1.0f, 0.0f);
gl2.glVertex3f(-1.0f,-1.0f, 0.0f);
gl2.glEnd();
}
@Override
public void keyPressed(KeyEvent e) {
}
@Override
public void keyReleased(KeyEvent e) {
if (e.getKeyCode() == KeyEvent.VK_ESCAPE) {
hmd.destroy();
Hmd.shutdown();
animator.stop();
glWindow.destroy();
}
if(e.getKeyCode() == KeyEvent.VK_F5) {
new Thread() {
public void run() {
glWindow.setFullscreen(!glWindow.isFullscreen());
} }.start();
}
}
public static void main(String[] args) {
new TinyBox().run();
}
}
13 Replies
- neph (Honored Guest): Hi.
I'm not a JOGL user so sorry if it doesn't help, but:
In step 5, have you tried calling hmd.configureRendering() instead (following the JOVR example: https://github.com/jherico/jocular-examples/blob/master/src/main/java/org/saintandreas/vr/RiftApp.java#L76)?
And also, in the rendering loop, you recreate the eyeRenderPose array every time. Maybe this is related to the second error (considering that it seems to be picky about the memory usage).
- whitehexagon (Explorer):
"neph" wrote:
Hi.
I'm not a JOGL user so sorry if it doesn't help, but:
In step 5, have you tried calling hmd.configureRendering() instead (according to the JOVR example:https://github.com/jherico/jocular-examples/blob/master/src/main/java/org/saintandreas/vr/RiftApp.java#L76).
And also, in the rendering loop, you recreate the eyeRenderPose array every time. Maybe this is related to the second error (considering that it seems to be picky about the memory usage).
Thanks for the two suggestions. I have just tried both changes, but get the same errors for both.
I also realized that display(...) gets called once per eye, so I have now adjusted the hmd calls:
if (eye == 0) {
hmd.beginFrameTiming(++frameCount);
}
...rendering code
if (eye == 1) {
hmd.endFrame(eyeRenderPose, eyeTexture);
}
Still the same second error, though.
- jherico (Adventurer):
"whitehexagon" wrote:
I'm trying to get something really basic running with OSX, JOVR, OpenGL, JOGL, and SDK 0-4-1 rendering. But I'm getting this fatal error at step 5.5 in the code below.
RenderAPIConfigHeader configHeader = new RenderAPIConfigHeader(resolution, 1);
RenderAPIConfig apiConfig = new RenderAPIConfig(configHeader, new int[15]);
I don't know that this will work. I simply use the following:

"whitehexagon" wrote:
int distortionCaps = OvrLibrary.ovrHmdCaps.ovrHmdCap_LowPersistence | OvrLibrary.ovrHmdCaps.ovrHmdCap_DynamicPrediction;
These are not distortion caps. Distortion caps are of the type OvrLibrary.ovrDistortionCaps. I use the following:

"whitehexagon" wrote:
//FovPort[] eyeFov = (FovPort[])new FovPort().toArray(2);
FovPort[] eyeFov = new FovPort[]{hmd.DefaultEyeFov[0], hmd.DefaultEyeFov[1]};
The commented-out version is the correct one. Alternatively, you can simply pass in hmd.DefaultEyeFov.

"whitehexagon" wrote:
if (0 == OvrLibrary.INSTANCE.ovrHmd_ConfigureRendering(hmd, apiConfig, distortionCaps, eyeFov, eyeRenderDescs)) {
throw new IllegalStateException("step 5 - cant configure");
}
I'd suggest using the configureRendering() helper method on Hmd which encapsulates a number of parameter checks and allocates the output EyeRenderDesc parameters for you.
RenderAPIConfig rc = new RenderAPIConfig();
rc.Header.RTSize = hmd.Resolution;
rc.Header.Multisample = 1;
int distortionCaps =
ovrDistortionCap_Chromatic |
ovrDistortionCap_TimeWarp |
ovrDistortionCap_Vignette;
eyeRenderDescs = hmd.configureRendering(
rc, distortionCaps, hmd.DefaultEyeFov);

"whitehexagon" wrote:
hmd.setEnabledCaps(distortionCaps); //need this or does the above do this, need above?
distortionCaps != hmdCaps.
- whitehexagon (Explorer): Thanks! I've fixed a few issues based on your suggestions, and those two errors are gone! I suspect it was also the order of the configure call, i.e. that it was previously made outside of a GL context.
I'm now rendering fine into my two offscreen buffers (validated via pixel grabs to two files, one per eye).
But I don't get anything rendered to my native window. I attach the hmd to the window as per the code below. Since I pass the texture IDs down to the hmd at the end of a render loop, I was expecting the SDK to draw the textures into the window after distortion etc. had been applied. Is there something extra I need to make that happen, or am I misunderstanding the SDK? Am I supposed to render the two textures there myself?
Pointer windowHandle = new Pointer(window.getNativeSurface().getSurfaceHandle());
if (0 == OvrLibrary.INSTANCE.ovrHmd_AttachToWindow(hmd, windowHandle, eyeRenderViewport[0], eyeRenderViewport[0])) {
throw new IllegalStateException("step 7 - cant attach");
}
- bluenote (Explorer): If you follow the standard rendering loop (just compare with this) then everything should work fine. There is probably just a minor bug in your code. What may happen is that some OpenGL-related state is set in a way that causes trouble for the SDK. For instance, play around with unbinding all your framebuffers/textures/VAOs etc. You may also want to check the GL error state at various places in your rendering loop to see whether any of the SDK calls fail (for some reason, in my code hmd.endFrame always fails just for the very first frame I render, but in general you should not see a GL error after calling an SDK function).
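The suggested GL error checking is easier to read if the raw numeric codes are mapped to their standard OpenGL constant names. A hedged, SDK-independent sketch (the class and method names are hypothetical; in the real loop you would feed it the result of gl2.glGetError() after each SDK call):

```java
// Map raw glGetError() codes to standard OpenGL constant names so log lines
// like "error 1286" become readable. The numeric values are the standard
// GL error constants.
import java.util.HashMap;
import java.util.Map;

public class GlErrorNames {
    static final Map<Integer, String> NAMES = new HashMap<>();
    static {
        NAMES.put(0x0500, "GL_INVALID_ENUM");                  // 1280
        NAMES.put(0x0501, "GL_INVALID_VALUE");                 // 1281
        NAMES.put(0x0502, "GL_INVALID_OPERATION");             // 1282
        NAMES.put(0x0505, "GL_OUT_OF_MEMORY");                 // 1285
        NAMES.put(0x0506, "GL_INVALID_FRAMEBUFFER_OPERATION"); // 1286
    }

    static String name(int code) {
        return NAMES.getOrDefault(code, "unknown(" + code + ")");
    }

    public static void main(String[] args) {
        // e.g. after hmd.endFrame(...): System.out.println(name(gl2.glGetError()));
        System.out.println(name(1286)); // prints GL_INVALID_FRAMEBUFFER_OPERATION
    }
}
```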
- whitehexagon (Explorer): I've managed to get the RiftApp code running locally now, albeit after a bit of a struggle determining all the dependencies, especially the resources. It uses LWJGL, which I'm unfamiliar with; I think this is why I'm struggling to find the equivalent pattern for JOGL.
I create a NEWT (native) window, but don't do anything with it apart from the SDK 'attach' call. I'm starting to suspect that this attach call relates only to the new 'Direct to Rift' mode, which OSX does not have. If that is the case, then I'm still left wondering how to tie my offscreen rendering into my onscreen native window.
Anyway, I added the error checking, and the error is coming from the endFrame call: error 1286.
I'm clearly missing something fundamental here.
- bluenote (Explorer): The "tie offscreen rendering to the onscreen native window" step is exactly what happens in the endFrame() call -- you just have to ensure that everything involved is set up correctly.
Error 1286 is GL_INVALID_FRAMEBUFFER_OPERATION, so you should probably check how you initialize, activate, and deactivate your framebuffer textures (see jherico's reference implementation). I still think you are almost there :).
- whitehexagon (Explorer): Cracked it, kinda...
I got rid of my two offscreen buffers and created two FBOs inside a GL context attached to the NEWT window. That seems to have solved the 'tie buffer to screen' step I needed.
What I'm missing now is color on my geometry; it's just gray. Also, head tracking only seems to wobble the scene very slightly -- might be noise.
What I have noticed is that even with such a simple scene, this tired laptop only hits 30fps, which is suspicious because that's exactly what I get from the LWJGL RiftApp, and also from the far more complex SDK OculusWorldDemo.
Oh, and the HSW is already driving me crazy!
import static com.oculusvr.capi.OvrLibrary.ovrDistortionCaps.ovrDistortionCap_Chromatic;
import static com.oculusvr.capi.OvrLibrary.ovrDistortionCaps.ovrDistortionCap_TimeWarp;
import static com.oculusvr.capi.OvrLibrary.ovrDistortionCaps.ovrDistortionCap_Vignette;
import static com.oculusvr.capi.OvrLibrary.ovrTrackingCaps.ovrTrackingCap_Orientation;
import static com.oculusvr.capi.OvrLibrary.ovrTrackingCaps.ovrTrackingCap_Position;
import javax.media.opengl.GL;
import javax.media.opengl.GL2;
import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLCapabilities;
import javax.media.opengl.GLEventListener;
import javax.media.opengl.GLProfile;
import javax.media.opengl.fixedfunc.GLLightingFunc;
import javax.media.opengl.fixedfunc.GLMatrixFunc;
import javax.media.opengl.glu.GLU;
import org.saintandreas.gl.MatrixStack;
import org.saintandreas.math.Matrix4f;
import org.saintandreas.vr.RiftUtils;
import com.jogamp.newt.Display;
import com.jogamp.newt.NewtFactory;
import com.jogamp.newt.Screen;
import com.jogamp.newt.Window;
import com.jogamp.newt.event.KeyEvent;
import com.jogamp.newt.event.KeyListener;
import com.jogamp.newt.opengl.GLWindow;
import com.jogamp.opengl.FBObject;
import com.jogamp.opengl.FBObject.TextureAttachment;
import com.jogamp.opengl.util.Animator;
import com.oculusvr.capi.EyeRenderDesc;
import com.oculusvr.capi.FovPort;
import com.oculusvr.capi.Hmd;
import com.oculusvr.capi.OvrLibrary;
import com.oculusvr.capi.OvrLibrary.ovrEyeType;
import com.oculusvr.capi.OvrRecti;
import com.oculusvr.capi.OvrSizei;
import com.oculusvr.capi.OvrVector2i;
import com.oculusvr.capi.Posef;
import com.oculusvr.capi.RenderAPIConfig;
import com.oculusvr.capi.Texture;
import com.oculusvr.capi.TextureHeader;
public class TinyBox3 implements KeyListener {
//JOGL
//private WHAnimator animator;
private Animator animator;
private GLWindow glWindow;
//Rift Specific
private Hmd hmd;
private int frameCount;
private EyeRenderDesc eyeRenderDescs[];
private final OvrRecti[] eyeRenderViewport = (OvrRecti[]) new OvrRecti().toArray(2);
private final Posef eyeRenderPose[] = (Posef[]) new Posef().toArray(2);
private final Texture eyeTextures[] = (Texture[]) new Texture().toArray(2);
private final FovPort fovPorts[] = (FovPort[]) new FovPort().toArray(2);
private final Matrix4f projections[] = new Matrix4f[2];
private final class DK2EventListener implements GLEventListener {
private final FBObject leftEye;
private final FBObject rightEye;
public DK2EventListener() {
leftEye = new FBObject();
rightEye = new FBObject();
}
@Override
public void init(GLAutoDrawable drawable) {
final GL2 gl = drawable.getGL().getGL2();
gl.glClearColor(0.5f, 0.5f, 0.0f, 0.0f);
final float lightPos[] = { 5.0f, 5.0f, 10.0f, 0.0f };
gl.glLightfv(GLLightingFunc.GL_LIGHT0, GLLightingFunc.GL_POSITION, lightPos, 0);
gl.glEnable(GLLightingFunc.GL_LIGHTING);
gl.glEnable(GLLightingFunc.GL_LIGHT0);
gl.glEnable(GL.GL_DEPTH_TEST);
gl.glEnable(GLLightingFunc.GL_NORMALIZE);
RenderAPIConfig rc = new RenderAPIConfig();
rc.Header.RTSize = hmd.Resolution;
rc.Header.Multisample = 1;
int distortionCaps = ovrDistortionCap_Chromatic | ovrDistortionCap_TimeWarp | ovrDistortionCap_Vignette;
eyeRenderDescs = hmd.configureRendering(rc, distortionCaps, fovPorts);
leftEye.reset(gl, eyeRenderViewport[ovrEyeType.ovrEye_Left].Size.w, eyeRenderViewport[ovrEyeType.ovrEye_Left].Size.h, 0, false);
rightEye.reset(gl, eyeRenderViewport[ovrEyeType.ovrEye_Right].Size.w, eyeRenderViewport[ovrEyeType.ovrEye_Right].Size.h, 0, false);
leftEye.detachAllColorbuffer(gl);
rightEye.detachAllColorbuffer(gl);
leftEye.detachAll(gl); //clears right eye...
TextureAttachment leftTA = leftEye.attachTexture2D(gl, 0, true);
TextureAttachment rightTA = rightEye.attachTexture2D(gl, 0, true);
eyeTextures[ovrEyeType.ovrEye_Left].TextureId = leftTA.getName();
eyeTextures[ovrEyeType.ovrEye_Right].TextureId = rightTA.getName();
}
@Override
public void dispose(GLAutoDrawable drawable) {
// TODO Auto-generated method stub
}
@Override
public void display(GLAutoDrawable drawable) {
GL2 gl2 = drawable.getGL().getGL2();
hmd.beginFrameTiming(++frameCount);
for (int eyeIndex = 0; eyeIndex < ovrEyeType.ovrEye_Count; eyeIndex++){
int eye = hmd.EyeRenderOrder[eyeIndex];
MatrixStack.PROJECTION.set(projections[eye]);
MatrixStack mv = MatrixStack.MODELVIEW;
mv.push();
{
Posef pose = hmd.getEyePose(eye);
eyeRenderPose[eye].Orientation = pose.Orientation;
eyeRenderPose[eye].Position = pose.Position;
gl2.glViewport(eyeRenderViewport[eye].Pos.x, eyeRenderViewport[eye].Pos.y, eyeRenderViewport[eye].Size.w, eyeRenderViewport[eye].Size.h);
mv.preTranslate(RiftUtils.toVector3f(eyeRenderPose[eye].Position).mult(-1));
mv.preRotate(RiftUtils.toQuaternion(eyeRenderPose[eye].Orientation).inverse());
mv.preTranslate(RiftUtils.toVector3f(eyeRenderDescs[eye].ViewAdjust));
gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, eyeTextures[eye].TextureId);
gl2.glClear(GL2.GL_COLOR_BUFFER_BIT);
renderScene(gl2);
gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
// gl2.glDisable(GL2.GL_TEXTURE_2D);
}
mv.pop();
}
hmd.endFrame(eyeRenderPose, eyeTextures);
}
@Override
public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) {
GL2 gl2 = drawable.getGL().getGL2();
gl2.glMatrixMode(GLMatrixFunc.GL_PROJECTION);
gl2.glLoadIdentity();
GLU glu = new GLU();
glu.gluPerspective(45.0f, ((float) width / (float) height), 0.1f, 10000.0f);
gl2.glMatrixMode(GLMatrixFunc.GL_MODELVIEW);
gl2.glLoadIdentity();
}
} //end inner class
public void run() {
frameCount = -1;
//step 1 - hmd init
Hmd.initialize();
try {
Thread.sleep(400);
} catch (InterruptedException e) {
throw new IllegalStateException(e);
}
//step 2 - hmd create
hmd = Hmd.create(0); //assume 1 device at index 0
if (hmd == null) {
System.out.println("null hmd");
hmd = Hmd.createDebug(OvrLibrary.ovrHmdType.ovrHmd_DK2);
}
//step 3 - hmd size queries
OvrSizei resolution = hmd.Resolution;
System.out.println("resolution= "+resolution.w+"x"+resolution.h);
OvrSizei recommendedTex0Size = hmd.getFovTextureSize(OvrLibrary.ovrEyeType.ovrEye_Left, hmd.DefaultEyeFov[0], 1.0f);
OvrSizei recommendedTex1Size = hmd.getFovTextureSize(OvrLibrary.ovrEyeType.ovrEye_Right, hmd.DefaultEyeFov[1], 1.0f);
System.out.println("left= "+recommendedTex0Size.w+"x"+recommendedTex0Size.h);
System.out.println("right= "+recommendedTex1Size.w+"x"+recommendedTex1Size.h);
int displayW = recommendedTex0Size.w + recommendedTex1Size.w;
int displayH = Math.max(recommendedTex0Size.h, recommendedTex1Size.h);
OvrSizei renderTargetEyeSize = new OvrSizei(displayW / 2, displayH); //size of single eye
System.out.println("using eye size "+renderTargetEyeSize.w+"x"+renderTargetEyeSize.h);
eyeRenderViewport[0].Pos = new OvrVector2i(0, 0);
eyeRenderViewport[0].Size = renderTargetEyeSize;
eyeRenderViewport[1].Pos = eyeRenderViewport[0].Pos;
eyeRenderViewport[1].Size = renderTargetEyeSize;
eyeTextures[0].Header = new TextureHeader(renderTargetEyeSize, eyeRenderViewport[0]);
eyeTextures[1].Header = new TextureHeader(renderTargetEyeSize, eyeRenderViewport[1]);
//step 4 - tracking
System.out.println("step 4 - tracking");
if (hmd.configureTracking(ovrTrackingCap_Orientation | ovrTrackingCap_Position, 0) == 0) {
throw new IllegalStateException("Unable to start the sensor");
}
//step 5 - FOV
System.out.println("step 5 - FOV");
for (int eye = 0; eye < 2; ++eye) {
fovPorts[eye] = hmd.DefaultEyeFov[eye];
projections[eye] = RiftUtils.toMatrix4f(
Hmd.getPerspectiveProjection(
fovPorts[eye], 0.1f, 1000000f, true));
}
//step 6 - opengl window
System.out.println("step 6 - window");
//Display.dumpDisplayList(""); //only gives: DisplayList[] entries: 0 - main
final Display display = NewtFactory.createDisplay("tiny");
final Screen screen = NewtFactory.createScreen(display, 0);
GLProfile glProfile = GLProfile.get(GLProfile.GL2);
System.out.println("got: " + glProfile.getImplName());
final Window window = NewtFactory.createWindow(screen, new GLCapabilities(glProfile));
window.setSize(displayW, displayH);
glWindow = GLWindow.create(window);
glWindow.setAutoSwapBufferMode(false);
glWindow.setUndecorated(true);
glWindow.setFullscreen(true);
glWindow.addKeyListener(this);
glWindow.addGLEventListener(new DK2EventListener());
glWindow.setVisible(true);
//step 7 - loop
System.out.println("step 7 - loop");
//animator = new WHAnimator(75);
animator = new Animator();
animator.add(glWindow);
animator.start();
}
public void renderScene(GL2 gl2) {
gl2.glLoadIdentity();
gl2.glTranslatef(-1.5f, 0.0f, -6.0f);
gl2.glBegin(GL2.GL_TRIANGLES);
gl2.glColor3f(1.0f, 0.0f, 0.0f);
gl2.glVertex3f( 0.0f, 1.0f, 0.0f);
gl2.glColor3f(0.0f, 1.0f, 0.0f);
gl2.glVertex3f(-1.0f,-1.0f, 0.0f);
gl2.glColor3f(0.0f, 0.0f, 1.0f);
gl2.glVertex3f( 1.0f,-1.0f, 0.0f);
gl2.glEnd();
gl2.glTranslatef(3.0f, 0.0f, 0.0f);
gl2.glBegin(GL2.GL_QUADS);
gl2.glColor3f(0.5f, 0.5f, 1.0f);
gl2.glVertex3f(-1.0f, 1.0f, 0.0f);
gl2.glVertex3f( 1.0f, 1.0f, 0.0f);
gl2.glVertex3f( 1.0f,-1.0f, 0.0f);
gl2.glVertex3f(-1.0f,-1.0f, 0.0f);
gl2.glEnd();
}
@Override
public void keyPressed(KeyEvent e) {
}
@Override
public void keyReleased(KeyEvent e) {
if (e.getKeyCode() == KeyEvent.VK_ESCAPE) {
hmd.destroy();
Hmd.shutdown();
animator.stop();
glWindow.destroy();
System.exit(0);
}
if(e.getKeyCode() == KeyEvent.VK_F5) {
new Thread() {
public void run() {
glWindow.setFullscreen(!glWindow.isFullscreen());
} }.start();
}
hmd.dismissHSWDisplay();
}
public static void main(String[] args) {
new TinyBox3().run();
}
}
- jherico (Adventurer):
"whitehexagon" wrote:
Cracked it, kinda...
gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, eyeTextures[eye].TextureId);
gl2.glClear(GL2.GL_COLOR_BUFFER_BIT);
renderScene(gl2);
gl2.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
Framebuffers are not textures. The second parameter of glBindFramebuffer must be the framebuffer ID, not the color attachment. Additionally, your framebuffers will need both depth and color attachments for any reasonable rendering tasks.
- whitehexagon (Explorer): A big step closer! The FBObject from JOGL seemed to imply it had a texture somehow tied to the color buffer, but after a bit of messing around I swapped it out for something that elect put together for the tiny room demo. It works!
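The per-eye setup that finally worked can be modeled loosely as one object per eye holding three distinct GL names. This is a hypothetical sketch with no real GL calls; in JOGL the ids would come from glGenFramebuffers, glGenTextures, and glGenRenderbuffers:

```java
// Hypothetical model of a per-eye render target: the framebuffer's own id
// (what glBindFramebuffer takes), the color texture id (what gets attached
// and later handed to the SDK via Texture.TextureId), and a depth attachment
// id. A counter stands in for the glGen* calls, which always return distinct
// names within each kind of object.
import java.util.concurrent.atomic.AtomicInteger;

public class EyeTargetModel {
    private static final AtomicInteger nextName = new AtomicInteger(1);

    final int framebufferId = nextName.getAndIncrement();       // bind this one
    final int colorTextureId = nextName.getAndIncrement();      // attach + pass to SDK
    final int depthRenderbufferId = nextName.getAndIncrement(); // needed for depth test

    public static void main(String[] args) {
        EyeTargetModel left = new EyeTargetModel();
        EyeTargetModel right = new EyeTargetModel();
        // The id you bind is never the texture id you hand to endFrame:
        System.out.println("left fbo=" + left.framebufferId
                + " tex=" + left.colorTextureId
                + " | right fbo=" + right.framebufferId
                + " tex=" + right.colorTextureId);
    }
}
```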
The color issue was related to lighting.
But I don't get any tracking now, not even a slight wobble. The light is on, and tracking works with RiftDemo, but I can't see the difference in the code. The matrix stuff is above my head, but I'm wondering if it's because of the reshape() code.