Forum Discussion

Tosken1337
Honored Guest
9 years ago

Rendering issues using LWJGL and libovr

Hi,

I want to implement a very basic and simple example of how to get the Rift working with LWJGL and LibOVR.
I am using the latest version of LWJGL, which supports SDK version 1.5.

Everything works fine, except that I am not able to get a rendered picture on the left eye.
I am using one texture spanning both eyes.
Each eye has a viewport of half the texture's width, and the right eye's viewport starts at the middle of the texture.
I have checked everything, including the examples in the SDK, but I cannot find what I am doing wrong.
Maybe you can have a quick look at the basic parts of my init and render code snippets and give me a hint?

When I use the viewports as described (half of the texture for each eye), the right eye image is correct, but the left eye image shows only the background clear color (no scene content).

When I use the full-size viewport for each eye, the left image is wrong in terms of offset and movement when I move my head.

Initialization done on startup:
...
...
resolutionW = hmdDesc.Resolution().w();
resolutionH = hmdDesc.Resolution().h();

// FOV
for (int eye = 0; eye < 2; eye++) {
    fovPorts[eye] = hmdDesc.DefaultEyeFov(eye);
}

// projection matrices
for (int eye = 0; eye < 2; eye++) {
    projections[eye] = OVRMatrix4f.malloc();
    OVRUtil.ovrMatrix4f_Projection(fovPorts[eye], 0.1f, 500f, OVRUtil.ovrProjection_None, projections[eye]);
}

// render description
for (int eye = 0; eye < 2; eye++) {
    ovr_GetRenderDesc(session, eye, fovPorts[eye], eyeRenderDesc[eye]);
    hmdToEyeOffsets.put(eye, eyeRenderDesc[eye].HmdToEyeOffset());
}

ovr_GetFovTextureSize(session, ovrEye_Left, fovPorts[ovrEye_Left], pixelsPerDisplayPixel, leftTextureSize);
ovr_GetFovTextureSize(session, ovrEye_Right, fovPorts[ovrEye_Right], pixelsPerDisplayPixel, rightTextureSize);
textureW = leftTextureSize.w() + rightTextureSize.w();
textureH = Math.max(leftTextureSize.h(), rightTextureSize.h());
OVRTextureSwapChainDesc swapChainDesc = OVRTextureSwapChainDesc.calloc()
        .Type(ovrTexture_2D)
        .ArraySize(1)
        .Format(OVR_FORMAT_R8G8B8A8_UNORM_SRGB)
        .Width(textureW)
        .Height(textureH)
        .MipLevels(1)
        .SampleCount(1)
        .StaticImage(false);
if (OVRGL.ovr_CreateTextureSwapChainGL(session, swapChainDesc, textureSetPB) != ovrSuccess) {
    throw new RuntimeException("Failed to create Swap Texture Set");
}
swapChain = textureSetPB.get(0);

// create FrameBuffers for Oculus SDK generated textures
int textureCount = 0;
ovr_GetTextureSwapChainLength(session, textureSetPB.get(0), chainLengthB);
textureCount = chainLengthB.get();

swapChainFbo = new FrameBufferObject[textureCount];
for (int i = 0; i < textureCount; i++) {
    IntBuffer textureIdB = BufferUtils.createIntBuffer(1);
    // Get the OpenGL texture id of the i-th swap chain buffer
    OVRGL.ovr_GetTextureSwapChainBufferGL(session, swapChain, i, textureIdB);
    int textureId = textureIdB.get();
    Texture texture = Texture.wrap(textureId, textureW, textureH);
    texture.setParameters(GL11.GL_LINEAR, GL13.GL_CLAMP_TO_BORDER);

    swapChainFbo[i] = FrameBufferObject.create();
    swapChainFbo[i].addColorAttachment(texture, 0);
    swapChainFbo[i].addDefaultDepthStencil(textureW, textureH);
}
// eye viewports
OVRRecti[] viewport = new OVRRecti[2];
viewport[0] = OVRRecti.calloc();
viewport[0].Pos().x(0);
viewport[0].Pos().y(0);
viewport[0].Size().w(textureW / 2);
viewport[0].Size().h(textureH);

viewport[1] = OVRRecti.calloc();
viewport[1].Pos().x(textureW / 2);
viewport[1].Pos().y(0);
viewport[1].Size().w(textureW / 2);
viewport[1].Size().h(textureH);

// single layer to present a VR scene
vrEyesLayer = OVRLayerEyeFov.calloc();
vrEyesLayer.Header().Type(ovrLayerType_EyeFov);
vrEyesLayer.Header().Flags(ovrLayerFlag_TextureOriginAtBottomLeft);
for (int eye = 0; eye < 2; eye++) {
    vrEyesLayer.ColorTexture(eye, swapChain);
    vrEyesLayer.Viewport(eye, viewport[eye]);
    vrEyesLayer.Fov(eye, fovPorts[eye]);
}



//During rendering I use the following code each frame to get the latest pose for each eye

double displayMidpointSeconds = ovr_GetPredictedDisplayTime(session, 0);
ovr_GetTrackingState(session, displayMidpointSeconds, true, hmdState);

// get head pose and free hmdState
OVRPosef headPose = hmdState.HeadPose().ThePose();

//calculate eye poses
OVRUtil.ovr_CalcEyePoses(headPose, hmdToEyeOffsets, outEyePoses);

//Then I use these poses to calculate the current view matrix and render the scene for each eye with this matrix.
//I am also setting the viewport for each eye using glViewport.

// Uses ovrGetCurrentFrameIndex to return the correct framebuffer
final FrameBufferObject currentFrameBuffer = hmd.getCurrentFrameBuffer();
for (int eye = 0; eye < 2; eye++) {
    // Let the application perform frame rendering for each eye
    final OVRRecti viewport = hmd.getViewport(eye);
    final Matrix4f projM = hmd.getProjectionMatrix(eye);
    final Matrix4f viewM = hmd.getViewMatrix(eye);

    GL11.glViewport(viewport.Pos().x(), viewport.Pos().y(), viewport.Size().w(), viewport.Size().h());
    onRenderFrame(elapsedMillis, viewM, projM, currentFrameBuffer);
}

ovr_CommitTextureSwapChain(session, swapChain);
int result = ovr_SubmitFrame(session, 0, null, layers);
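
The "calculate the current view matrix" step mentioned in the comments above can be sketched in plain Java. This is a hypothetical helper, not part of the LWJGL/LibOVR API: it assumes the eye pose arrives as a position vector and a unit orientation quaternion (which is what `OVRPosef` exposes via `Position()` and `Orientation()`), and produces a column-major OpenGL-style 4x4 matrix by inverting the pose transform.

```java
// Sketch: build a column-major 4x4 view matrix from an eye pose
// (position px,py,pz and unit orientation quaternion qx,qy,qz,qw).
// Hypothetical helper for illustration only.
public final class EyeView {
    public static float[] viewMatrix(float px, float py, float pz,
                                     float qx, float qy, float qz, float qw) {
        // Rotation matrix R from the unit quaternion (row-major 3x3)
        float[] r = {
            1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw),
            2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw),
            2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)
        };
        // The view matrix is the inverse of the pose transform:
        // rotation part R^T, translation part -R^T * p.
        float tx = -(r[0]*px + r[3]*py + r[6]*pz);
        float ty = -(r[1]*px + r[4]*py + r[7]*pz);
        float tz = -(r[2]*px + r[5]*py + r[8]*pz);
        // Column-major (OpenGL) layout: m[col*4 + row]
        return new float[] {
            r[0], r[1], r[2], 0,
            r[3], r[4], r[5], 0,
            r[6], r[7], r[8], 0,
            tx,   ty,   tz,   1
        };
    }
}
```

With an identity orientation the result is just a translation by the negated eye position, which is a quick sanity check that the convention matches the projection matrices produced by `ovrMatrix4f_Projection`.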

1 Reply

  • Fixed! ;)
    I found the issue that caused these weird problems.
    I had called glClear for each eye. Even with the correct viewport set, glClear clears the whole swap chain texture.
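
For anyone hitting the same symptom: glClear is not restricted by glViewport; it clears the entire bound framebuffer unless a scissor box limits it. So either clear the shared texture once before the per-eye loop, or enable the scissor test with a box matching each eye's viewport before clearing. A minimal sketch of the rectangle math (the `scissorForEye` helper is hypothetical; the LWJGL GL calls are shown as comments for context):

```java
// Sketch: per-eye scissor box for a side-by-side swap chain texture,
// so glClear only touches one eye's half. Hypothetical helper.
public final class EyeClear {
    /** Returns {x, y, w, h} for eye 0 (left) or eye 1 (right)
        on a texture of size texW x texH. */
    public static int[] scissorForEye(int eye, int texW, int texH) {
        int halfW = texW / 2;
        return new int[] { eye == 0 ? 0 : halfW, 0, halfW, texH };
    }
    // In the render loop, before clearing (LWJGL):
    //   GL11.glEnable(GL11.GL_SCISSOR_TEST);
    //   int[] s = scissorForEye(eye, textureW, textureH);
    //   GL11.glScissor(s[0], s[1], s[2], s[3]);
    //   GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
}
```

Clearing once before the loop is the simpler fix; the scissor variant is useful if the two eyes need different clear colors or if other full-buffer state operations must be confined to one half.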