Forum Discussion

tcarothers · Honored Guest · 9 years ago

GL: Why does querying OpenGL internal tex format for OVR texture swap chain's color tex return 0?

I added the following code to the OculusRoomTiny(GL) sample application from SDK 1.10.1:

void LogTextureFormat(TextureBuffer *const rt, const std::string& title)
{
    // Bind the swap chain's currently active color texture so its level-0
    // parameters can be queried.
    GLint colorTex = rt->GetCurrentTexId();
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, colorTex);

    // Query the texture's dimensions and internal format.
    GLint assignedWidth, assignedHeight, assignedFormat;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &assignedWidth);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &assignedHeight);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &assignedFormat);

    // Query the per-channel bit depths.
    GLint redSize, greenSize, blueSize, alphaSize;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &redSize);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &greenSize);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE, &blueSize);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alphaSize);

    // Bail out if any of the queries raised a GL error.
    if (glGetError() != GL_NO_ERROR)
    {
        exit(1);
    }

    fprintf(stderr, "%s: Assigned texture with [id, width / height / format] : [%d / %d / %d / %d] and rgba sizes : [%d, %d, %d, %d]\n",
        title.c_str(), colorTex, assignedWidth, assignedHeight, assignedFormat, redSize, greenSize, blueSize, alphaSize);
}
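
For context, the TextureBuffer being queried wraps an OVR texture swap chain. Below is roughly how the sample creates it, paraphrased from Win32_GLAppUtil.h (Session, TextureChain, and textureSize are names from the sample, so treat the details as approximate). The chain is requested as OVR_FORMAT_R8G8B8A8_UNORM_SRGB, so I would expect the GL query to report GL_SRGB8_ALPHA8 rather than 0:

// Paraphrased from the sample's TextureBuffer constructor -- not verbatim.
ovrTextureSwapChainDesc desc = {};
desc.Type        = ovrTexture_2D;
desc.ArraySize   = 1;
desc.Width       = textureSize.w;                     // eye-buffer size from ovr_GetFovTextureSize
desc.Height      = textureSize.h;
desc.MipLevels   = 1;
desc.Format      = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;    // GL equivalent would be GL_SRGB8_ALPHA8
desc.SampleCount = 1;
desc.StaticImage = ovrFalse;

ovrResult result = ovr_CreateTextureSwapChainGL(Session, &desc, &TextureChain);

// The sample then sets filtering on every texture in the chain.
int length = 0;
ovr_GetTextureSwapChainLength(Session, TextureChain, &length);
for (int i = 0; OVR_SUCCESS(result) && i < length; ++i)
{
    GLuint chainTexId;
    ovr_GetTextureSwapChainBufferGL(Session, TextureChain, i, &chainTexId);
    glBindTexture(GL_TEXTURE_2D, chainTexId);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}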

I called this function within the render loop and the output I got was:
PreRender: Assigned texture with [id, width / height / format] : [1 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [5 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [2 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [6 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [3 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [7 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [1 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [5 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [2 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [6 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [3 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
PreRender: Assigned texture with [id, width / height / format] : [7 / 1332 / 1586 / 0] and rgba sizes : [8, 8, 8, 8]
...

Note that glGetError() was NOT triggered. The queries correctly return all the other texture attributes; only GL_TEXTURE_INTERNAL_FORMAT comes back as 0. Why?
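
In case it helps narrow things down, here is a sketch of reading back the description the runtime holds for the chain via ovr_GetTextureSwapChainDesc (assuming access to the ovrSession and ovrTextureSwapChain handles inside TextureBuffer; I have not verified it in this exact spot):

// Sketch: read back the swap-chain description the runtime holds for this chain.
ovrTextureSwapChainDesc chainDesc = {};
if (OVR_SUCCESS(ovr_GetTextureSwapChainDesc(Session, TextureChain, &chainDesc)))
{
    // chainDesc.Format is an ovrTextureFormat enum value (e.g. OVR_FORMAT_R8G8B8A8_UNORM_SRGB),
    // independent of whatever GL reports for GL_TEXTURE_INTERNAL_FORMAT.
    fprintf(stderr, "Chain desc: %dx%d, format enum %d, mip levels %d\n",
            chainDesc.Width, chainDesc.Height, (int)chainDesc.Format, chainDesc.MipLevels);
}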

2 Replies

  • Hey, it's been a couple of weeks; any thoughts on this would be much appreciated.
  • "If you do the query later after the texture is used then it reports a valid value. Can you try this?"
    Didn't help. I added a query after the Render() AND after the Commit(), and it still shows the same behavior (rough placement sketched below). Let me know if you want me to try anything else.
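
Roughly where those extra queries sit, paraphrasing the sample's per-frame loop (helper names as in Win32_GLAppUtil.h, so treat the exact structure as approximate):

// Approximate placement of the extra queries in the sample's render loop (not verbatim).
for (int eye = 0; eye < 2; ++eye)
{
    eyeRenderTexture[eye]->SetAndClearRenderSurface(eyeDepthBuffer[eye]);

    roomScene->Render(view, proj);                          // draw the scene into the chain texture
    LogTextureFormat(eyeRenderTexture[eye], "PostRender");  // query after Render()

    eyeRenderTexture[eye]->UnsetRenderSurface();
    eyeRenderTexture[eye]->Commit();                        // ovr_CommitTextureSwapChain under the hood
    LogTextureFormat(eyeRenderTexture[eye], "PostCommit");  // query after Commit()
}
// ... followed by layer setup and ovr_SubmitFrame().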