Forum Discussion

🚨 This forum is archived and read-only. 🚨
elect
Honored Guest
12 years ago

Porting RoomTiny Client distortion to Jogl

I am trying to port the RoomTiny example to JOGL (an OpenGL binding for Java).

I don't know DirectX, unfortunately, although it looks quite similar to OpenGL.

My question is where the texture is bound to the context before the draw command.

In RenderTiny_D3D11_Device.cpp I see:

void RenderDevice::Render(const Matrix4f& view, Model* model)
{
    // Store data in buffers if not already
    if (!model->VertexBuffer)
    {
        Ptr<Buffer> vb = *CreateBuffer();
        vb->Data(Buffer_Vertex, &model->Vertices[0], model->Vertices.GetSize() * sizeof(Vertex));
        model->VertexBuffer = vb;
    }
    if (!model->IndexBuffer)
    {
        Ptr<Buffer> ib = *CreateBuffer();
        ib->Data(Buffer_Index, &model->Indices[0], model->Indices.GetSize() * 2);
        model->IndexBuffer = ib;
    }

    Render(model->Fill ? model->Fill : DefaultFill,
           model->VertexBuffer, model->IndexBuffer, sizeof(Vertex),
           view, 0, (unsigned)model->Indices.GetSize(), model->GetPrimType());
}


// Cut down one for ORT for simplicity
void RenderDevice::Render(const ShaderFill* fill, Buffer* vertices, Buffer* indices, int stride)
{
    Render(fill, vertices, indices, stride, Matrix4f(), 0, (int)vertices->GetSize(), Prim_Triangles, false);
}


void RenderDevice::Render(const ShaderFill* fill, Buffer* vertices, Buffer* indices, int stride,
                          const Matrix4f& matrix, int offset, int count, PrimitiveType rprim, bool updateUniformData)
{
    if (((ShaderFill*)fill)->GetInputLayout() != NULL)
        Context->IASetInputLayout((ID3D11InputLayout*)((ShaderFill*)fill)->GetInputLayout());
    else
        Context->IASetInputLayout(ModelVertexIL);

    if (indices)
    {
        Context->IASetIndexBuffer(((Buffer*)indices)->GetBuffer(), DXGI_FORMAT_R16_UINT, 0);
    }

    ID3D11Buffer* vertexBuffer = ((Buffer*)vertices)->GetBuffer();
    UINT vertexStride = stride;
    UINT vertexOffset = offset;
    Context->IASetVertexBuffers(0, 1, &vertexBuffer, &vertexStride, &vertexOffset);

    ShaderSet* shaders = ((ShaderFill*)fill)->GetShaders();

    ShaderBase* vshader = ((ShaderBase*)shaders->GetShader(Shader_Vertex));
    unsigned char* vertexData = vshader->UniformData;
    if (vertexData)
    {
        // TODO: some VSes don't start with StandardUniformData!
        if (updateUniformData)
        {
            StandardUniformData* stdUniforms = (StandardUniformData*) vertexData;
            stdUniforms->View = matrix.Transposed();
            stdUniforms->Proj = StdUniforms.Proj;
        }
        UniformBuffers[Shader_Vertex]->Data(Buffer_Uniform, vertexData, vshader->UniformsSize);
        vshader->SetUniformBuffer(UniformBuffers[Shader_Vertex]);
    }

    for (int i = Shader_Vertex + 1; i < Shader_Count; i++)
        if (shaders->GetShader(i))
        {
            ((ShaderBase*)shaders->GetShader(i))->UpdateBuffer(UniformBuffers[i]);
            ((ShaderBase*)shaders->GetShader(i))->SetUniformBuffer(UniformBuffers[i]);
        }

    D3D11_PRIMITIVE_TOPOLOGY prim;
    switch (rprim)
    {
    case Prim_Triangles:
        prim = D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST;
        break;
    case Prim_Lines:
        prim = D3D11_PRIMITIVE_TOPOLOGY_LINELIST;
        break;
    case Prim_TriangleStrip:
        prim = D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP;
        break;
    default:
        OVR_ASSERT(0);
        return;
    }
    Context->IASetPrimitiveTopology(prim);

    fill->Set(rprim);

    if (indices)
    {
        Context->DrawIndexed(count, 0, 0);
    }
    else
    {
        Context->Draw(count, 0);
    }
}


In

void RenderDevice::Render(const Matrix4f& view, Model* model)

basically, if the VBO and IBO are not yet initialized, they get created.

While in

void RenderDevice::Render(const ShaderFill* fill, Buffer* vertices, Buffer* indices, int stride,
                          const Matrix4f& matrix, int offset, int count, PrimitiveType rprim, bool updateUniformData)

the shaders get bound together with several uniforms; then the primitive type is chosen, and then the draw command is issued.

Where does the texture binding take place?

10 Replies

  • "elect" wrote:
    Where does the texture binding take place?


    Texture binding occurs inside the SDK, not in the TinyRoom demo.
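For reference, in an OpenGL/JOGL port the per-draw texture binding that the SDK does internally would typically look something like this. This is a minimal sketch: `program`, `texName` and the sampler uniform name `Texture0` are assumptions for illustration, not identifiers from the demo, and the `com.jogamp.opengl` package path is the current JOGL one (older releases used `javax.media.opengl`).

```java
import com.jogamp.opengl.GL;
import com.jogamp.opengl.GL3;

// Hypothetical helper: bind a 2D texture to texture unit 0 and point the
// fragment shader's sampler at that unit, just before issuing the draw call.
class TextureBinder {
    static void bind(GL3 gl3, int program, int texName) {
        gl3.glActiveTexture(GL.GL_TEXTURE0);          // select texture unit 0
        gl3.glBindTexture(GL.GL_TEXTURE_2D, texName); // bind the texture object
        // "Texture0" is an assumed sampler name in the GLSL program.
        int loc = gl3.glGetUniformLocation(program, "Texture0");
        gl3.glUniform1i(loc, 0);                      // sampler reads from unit 0
    }
}
```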
    elect
    Honored Guest
    I have already ported most of the program, but I've been stuck on an annoying bug for a couple of days and I am going crazy.

    After a couple of days, I tried making a new, simplified copy in order to find the bug, but I experience exactly the same thing.

    I'm asking if someone out there would like to help me.

    I can render properly only the floor or the ceiling, not both.

    https://github.com/elect86/JavaOculusRoomTiny/tree/master/src/roomTinySimplified

    Only floor

    30.png

    Only ceiling

    31.png

    Both

    32.png

    The program is pretty simple; there is (at the moment) only one GLSL program, for lit textures.

    You add Models, and in the rendering loop you update the lights (in a UBO) and loop over the models, rendering them.

    In the same model-rendering pass, I initialize everything about it: VBO, IBO and texture.

    The VBO data are 100% exact; I checked each vertex attribute one by one.
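The lazy per-model buffer initialization described above might be sketched like this in JOGL, mirroring the "store data in buffers if not already" step of the D3D11 `RenderDevice::Render`. The `ModelBuffers` class, its fields, and the array layout are assumptions for illustration, not code from the linked repository.

```java
import com.jogamp.opengl.GL;
import com.jogamp.opengl.GL3;
import com.jogamp.common.nio.Buffers;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;

// Hypothetical lazy VBO/IBO creation for one model.
class ModelBuffers {
    int vbo = 0, ibo = 0;

    void initIfNeeded(GL3 gl3, float[] vertexData, int[] indexData) {
        if (vbo == 0) {
            int[] names = new int[2];
            gl3.glGenBuffers(2, names, 0);
            vbo = names[0];
            ibo = names[1];

            // Upload vertex attributes.
            FloatBuffer vb = Buffers.newDirectFloatBuffer(vertexData);
            gl3.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo);
            gl3.glBufferData(GL.GL_ARRAY_BUFFER,
                    vertexData.length * Float.BYTES, vb, GL.GL_STATIC_DRAW);

            // Upload indices.
            IntBuffer ib = Buffers.newDirectIntBuffer(indexData);
            gl3.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, ibo);
            gl3.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER,
                    indexData.length * Integer.BYTES, ib, GL.GL_STATIC_DRAW);
        }
    }
}
```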
    rtweed
    Honored Guest
    I cannot shed much light on exactly what's gone wrong or why, but from the looks of it, the surface normals have been flipped and/or the order of the vertices has been reversed so the polygons have the wrong "handedness" (clockwise vs anticlockwise). If you can turn off backface culling, you might see more stuff rendered which could give you a better idea what's going on.

    I've seen things like this happen when running a batch of draw commands, but not closing off one group before starting the next - so maybe you are in effect drawing your two objects together and it's joining the last vertex of the first object to the first vertex of the second, which is causing all the polygons to contain the wrong vertex list, so they render incorrectly or not at all.
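Following rtweed's suggestion, backface culling can be toggled for debugging in JOGL like this (a sketch; whether culling is enabled at all depends on the port's own GL state setup):

```java
import com.jogamp.opengl.GL;
import com.jogamp.opengl.GL3;

class CullingDebug {
    // Temporarily disable backface culling to see whether the "missing"
    // polygons are being culled because their winding order is reversed.
    static void showAllFaces(GL3 gl3) {
        gl3.glDisable(GL.GL_CULL_FACE); // draw both front and back faces
    }

    // Restore the usual state: counter-clockwise fronts, cull back faces.
    static void restore(GL3 gl3) {
        gl3.glEnable(GL.GL_CULL_FACE);
        gl3.glCullFace(GL.GL_BACK);
        gl3.glFrontFace(GL.GL_CCW);
    }
}
```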
    elect
    Honored Guest
    "rtweed" wrote:
    I cannot shed much light on exactly what's gone wrong or why, but from the looks of it, the surface normals have been flipped and/or the order of the vertices has been reversed so the polygons have the wrong "handedness" (clockwise vs anticlockwise). If you can turn off backface culling, you might see more stuff rendered which could give you a better idea what's going on.

    I've seen things like this happen when running a batch of draw commands, but not closing off one group before starting the next - so maybe you are in effect drawing your two objects together and it's joining the last vertex of the first object to the first vertex of the second, which is causing all the polygons to contain the wrong vertex list, so they render incorrectly or not at all.


    OMG I found it!!

    Oh man, you were right, the fu''ng error was

    gl3.glDrawElements(GL3.GL_TRIANGLES, indices.size(), GL3.GL_UNSIGNED_INT, ibo[0]);

    instead of

    gl3.glDrawElements(GL3.GL_TRIANGLES, indices.size(), GL3.GL_UNSIGNED_INT, 0);

    I had written the value of the IBO in the last field, the offset... shame on me.

    Now the "only" problem is the lighting on the solid geometry..
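To spell out the glDrawElements fix above: with an index buffer bound to GL_ELEMENT_ARRAY_BUFFER, the last argument of glDrawElements is a byte offset into that bound buffer, not a buffer name. A sketch, assuming int indices:

```java
import com.jogamp.opengl.GL;
import com.jogamp.opengl.GL3;

class IndexedDraw {
    // ibo is a buffer object name; indexCount is the number of indices.
    static void draw(GL3 gl3, int ibo, int indexCount) {
        gl3.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, ibo);
        // Last parameter: byte offset into the bound index buffer.
        // Passing ibo here (a buffer *name*) makes GL read indices from
        // some arbitrary offset, which is exactly the bug above.
        gl3.glDrawElements(GL.GL_TRIANGLES, indexCount, GL.GL_UNSIGNED_INT, 0);
    }
}
```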
    elect
    Honored Guest
    So, I think I am in a pretty good position to say "I made it"

    38.png

    What is left is fixing the lighting on non-textured objects (but I don't care about that at the moment) and flipping it...

    It is strange: dumping the content of the off-screen framebuffer texture, I get it right, but when I distort it onto the monitor I get it flipped.

    If I rotate the Oculus up and down in world space it rotates properly; left/right instead is flipped.

    So where am I supposed to modify the code?
    elect
    Honored Guest
    I tried inverting ScreenPosNDC.y, but I get this; note the lines at the corners, they get corrupted.

    39.png
  • mrkaktus
    I'm working on OpenGL Oculus support too.
    The fix for your upside-down problem is here:
    viewtopic.php?f=34&t=12373
    In short, do this:

    float2 scale = float2(settings[i].UVScaleOffset[0].x, settings[i].UVScaleOffset[0].y);
    float2 offset = float2(settings[i].UVScaleOffset[1].x, settings[i].UVScaleOffset[1].y);

    // (!) OpenGL
    scale.y = -scale.y;
    offset.y = 1.0f - offset.y;

    eyeToSourceUVScale.set(scale);
    eyeToSourceUVOffset.set(offset);
    elect
    Honored Guest
    "mrkaktus" wrote:
    I'm working on OpenGL Oculus support too.
    The fix for your upside-down problem is here:
    viewtopic.php?f=34&t=12373
    In short do this:

    float2 scale = float2(settings[i].UVScaleOffset[0].x, settings[i].UVScaleOffset[0].y);
    float2 offset = float2(settings[i].UVScaleOffset[1].x, settings[i].UVScaleOffset[1].y);

    // (!) OpenGL
    scale.y = -scale.y;
    offset.y = 1.0f - offset.y;

    eyeToSourceUVScale.set(scale);
    eyeToSourceUVOffset.set(offset);


    viewtopic.php?f=20&t=11149&p=166554#p166554

    It looks the same as when I inverted ScreenPosNDC.y; they both look fine. Anyway, I am going to stay with the fix you suggested :)

    PS: I wrote my values in your thread.
    elect
    Honored Guest
    One question: I am trying to implement picking by rendering each object with a specific id value into an offscreen framebuffer.

    When I pick, I restore the right glViewport and then use the view matrix I calculated the frame before, just prior to offsetting it, but now I am stuck on the projection...

    I calculate it like this:

    private Mat4 createProjection(FovPort tanHalfFov, float zNear, float zFar) {

        ScaleAndOffset scaleAndOffset = createNDCscaleAndOffset(tanHalfFov);

        float handednessScale = -1f;

        Mat4 projection = new Mat4();

        projection.c0.x = scaleAndOffset.scale.x;
        projection.c1.x = 0f;
        projection.c2.x = handednessScale * scaleAndOffset.offset.x;
        projection.c3.x = 0f;

        projection.c0.y = 0f;
        projection.c1.y = scaleAndOffset.scale.y;
        projection.c2.y = handednessScale * -scaleAndOffset.offset.y;
        projection.c3.y = 0f;

        projection.c0.z = 0f;
        projection.c1.z = 0f;
        projection.c2.z = -handednessScale * zFar / (zNear - zFar);
        projection.c3.z = (zFar * zNear) / (zNear - zFar);

        projection.c0.w = 0f;
        projection.c1.w = 0f;
        projection.c2.w = handednessScale;
        projection.c3.w = 0f;

        return projection;
    }

    private ScaleAndOffset createNDCscaleAndOffset(FovPort tanHalfFov) {

        float projXscale = 2f / (tanHalfFov.LeftTan + tanHalfFov.RightTan);
        float projXoffset = (tanHalfFov.LeftTan - tanHalfFov.RightTan) * projXscale * .5f;
        float projYscale = 2f / (tanHalfFov.UpTan + tanHalfFov.DownTan);
        float projYoffset = (tanHalfFov.UpTan - tanHalfFov.DownTan) * projYscale * .5f;

        return new ScaleAndOffset(new Vec2(projXscale, projYscale), new Vec2(projXoffset, projYoffset));
    }


    And I am calling it from the eye loop

    proj = createProjection(eyeRenderDescs[eye].Fov, .01f, 10000f);


    With the following values

    uvScaleOffset[leftEye] (0.15114091, 0.17442882) (0.35465688, 0.5)
    uvScaleOffset[rightEye] (0.15114091, 0.17442882) (0.6453431, 0.5)


    I can't figure out how to translate these values into a standard yFov and aspect

    private static Mat4 perspectiveRH(float yFov, float aspect, float zNear, float zFar) {

        float frustumScale = calculateFrustumScale(yFov);

        Mat4 perspectiveRH = new Mat4(0);

        perspectiveRH.c0.x = frustumScale / aspect;
        perspectiveRH.c1.y = frustumScale;
        perspectiveRH.c2.z = zFar / (zNear - zFar);
        perspectiveRH.c2.w = -1;
        perspectiveRH.c3.z = zFar * zNear / (zNear - zFar);

        return perspectiveRH;
    }
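On the yFov/aspect question: a symmetric vertical FOV and aspect ratio can be recovered from the FOV tangents as yFov = atan(UpTan) + atan(DownTan) and aspect = (LeftTan + RightTan) / (UpTan + DownTan). Note that the Rift's per-eye FOV is asymmetric horizontally, so a symmetric perspectiveRH built from these values will only approximate the SDK projection. A sketch, using the left-eye tangents posted later in this thread:

```java
// Recover a symmetric yFov/aspect from per-eye FOV tangents.
// Exact only when the FOV is symmetric about each axis.
class FovConversion {
    static double yFovRad(double upTan, double downTan) {
        return Math.atan(upTan) + Math.atan(downTan);
    }

    static double aspect(double leftTan, double rightTan,
                         double upTan, double downTan) {
        return (leftTan + rightTan) / (upTan + downTan);
    }

    public static void main(String[] args) {
        // Left-eye tangents posted later in the thread.
        double l = 2.3465312, r = 0.9616399, u = 2.8664987, d = 2.8664987;
        System.out.println(Math.toDegrees(yFovRad(u, d))); // vertical FOV in degrees
        System.out.println(aspect(l, r, u, d));            // width/height ratio
    }
}
```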
    elect
    Honored Guest
    So if I check the values of

    eyeRenderDescs[eye].Fov


    By

    System.out.println(eyeRenderDescs[eye].Fov.LeftTan + " " + eyeRenderDescs[eye].Fov.RightTan
    + " " + eyeRenderDescs[eye].Fov.UpTan + " " + eyeRenderDescs[eye].Fov.DownTan);


    I get

    2.3465312 0.9616399 2.8664987 2.8664987
    0.9616399 2.3465312 2.8664987 2.8664987


    Let's take the UpTan:

    angleRad = atan(2.8664987)

    angleRad = 1.2351395544273056

    angleDeg = 70.76828357835365

    70° for just the upper part seems a little too much to me; considering that we still have to double it, that would be a total of over 140° for the vertical angle...

    Have I done something wrong? Is it correct to calculate it that way?
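The arithmetic in the post above does check out: atan(2.8664987) is about 70.77°, and since UpTan and DownTan are equal here, the total vertical FOV is atan(UpTan) + atan(DownTan), i.e. double that. A quick check in plain Java, using only the tangent value printed by the port:

```java
// Verify the tangent-to-degrees computation from the post above.
class FovCheck {
    public static void main(String[] args) {
        double upTan = 2.8664987;           // UpTan printed by the port
        double angleRad = Math.atan(upTan);
        double angleDeg = Math.toDegrees(angleRad);
        System.out.println(angleRad);       // ≈ 1.2351395544273056
        System.out.println(angleDeg);       // ≈ 70.76828357835365
        System.out.println(2 * angleDeg);   // total vertical FOV ≈ 141.54°
    }
}
```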