OpenGL: truncated render textures? (32bit->24bit) (SDK 4.1)

DiCon
Honored Guest
Edit: Changed the title to include further observations (second post).

I am quite happy to have almost successfully integrated the SDK into my own OpenGL engine, except for one problem, which took me at least six hours to narrow down to the following:

As my engine uses deferred shading, I first render to several geometry buffers, packing the RGBA channels of full-resolution textures as efficiently as possible. In two relevant cases I use the alpha channel to store geometry data: once for the reflectivity and once for the "roughness" (used for environment maps, specular lighting, and the specular exponent).
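For reference, this is roughly how such a G-buffer attachment is allocated (a simplified sketch, not my actual engine code; names are placeholders and I am using GLEW here for brevity):

    #include <GL/glew.h>
    #include <cassert>

    // Simplified G-buffer attachment with an explicitly sized RGBA format,
    // so the alpha channel is guaranteed to be allocated.
    GLuint gbufTex = 0, gbufFBO = 0;

    void createGBufferAttachment(int width, int height)
    {
        glGenTextures(1, &gbufTex);
        glBindTexture(GL_TEXTURE_2D, gbufTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,  // 8 bits per channel, including alpha
                     width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

        glGenFramebuffers(1, &gbufFBO);
        glBindFramebuffer(GL_FRAMEBUFFER, gbufFBO);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, gbufTex, 0);
        assert(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE);
    }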

However, in a later render step (the actual shading), I always read zero from the alpha channel, while RGB works fine. This means I can see the whole scene, but the shading has no specular lighting or reflections. When I explicitly replace the alpha value with 1.0 in the shader, my lighting works perfectly (except that now every object is excessively shiny).
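In GLSL terms, the shading pass boils down to this (simplified; the sampler and variable names are placeholders):

    #version 330 core
    uniform sampler2D uGBuffer;    // geometry buffer written in the first pass
    in vec2 vTexCoord;
    out vec4 fragColor;

    void main()
    {
        vec4 g = texture(uGBuffer, vTexCoord);
        float reflectivity = g.a;       // always reads as 0.0 when the SDK is active
        // float reflectivity = 1.0;    // workaround: lighting works, but everything is shiny
        fragColor = vec4(g.rgb * reflectivity, 1.0);
    }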

Also note that I structured my code so that I can easily compile it without libOVR, and in that case everything works fine (of course, a few other code paths change when I do so).

tl;dr
Right now I suspect that the SDK changes some OpenGL render state which prevents either writing or reading the alpha channel of my framebuffer textures. After checking a lot of GLIntercept data, I still cannot find the problem (it's not glColorMask).
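In case anybody wants to check their own state, this is roughly what I probe right before the shading pass (a sketch; it only covers the suspects I could think of):

    #include <GL/glew.h>
    #include <cstdio>

    // Dump the states that could plausibly kill the alpha channel.
    // Call with the G-buffer FBO bound and its color texture bound to GL_TEXTURE_2D.
    void dumpAlphaSuspects()
    {
        GLboolean mask[4];
        glGetBooleanv(GL_COLOR_WRITEMASK, mask);  // are alpha writes enabled at all?
        printf("color mask: %d %d %d %d\n", mask[0], mask[1], mask[2], mask[3]);

        GLint alphaBits = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alphaBits);
        printf("texture alpha bits: %d\n", alphaBits);  // 0 would mean no alpha was allocated

        printf("blend enabled: %d\n", glIsEnabled(GL_BLEND));  // blending could overwrite dst alpha
    }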

Any ideas what could cause this? Any obscure OpenGL states?

DiCon
Honored Guest
Looks like I have to give up and hope that the next SDK with its many OpenGL fixes will magically solve the problem...

Some additional observations in case anybody is interested in this problem:

  • I get some depth fighting when using the SDK which I do not get without it. My scene has a huge depth range, and I use 32bit depth buffers in the first rendering step. Could it be that all render textures are truncated to 24bit, so that RGBA becomes RGB and 32bit depth is limited to 24bit? That sounds absurd to me, but the symptoms are quite suggestive (see the sketch after this list for a way to verify it).

  • Everything works fine if I do the buffer swap myself without calling endFrame. (Of course, I then render to the back buffer instead of a texture target, but this happens in the last render step, which does HDR tone mapping in screen space and is therefore unrelated to the problems I have with the geometry buffer.)
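To test the 24bit theory from the first point, one could explicitly request a sized depth format and then ask the driver how many bits were actually allocated (again a sketch with placeholder names):

    #include <GL/glew.h>
    #include <cstdio>

    // Request an explicitly 32bit float depth texture, then query
    // how many depth bits the driver actually allocated for it.
    void checkDepthBits(GLuint depthTex, int width, int height)
    {
        glBindTexture(GL_TEXTURE_2D, depthTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32F,
                     width, height, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);

        GLint depthBits = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_DEPTH_SIZE, &depthBits);
        printf("allocated depth bits: %d\n", depthBits);  // 24 here would confirm the theory
    }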