DonCon
Explorer · 5 years ago
16 Bit Depth Buffer
Hi there !
The documentation states that it would be beneficial to use a 16-bit depth buffer and no stencil, but it doesn't tell us how to achieve this.
Does anybody know how ?
I am using Unity 2019.4 with URP 7.4 and an Oculus Quest 1.
I tried brute-forcing the depth buffer to 16 bit by editing the following function in UniversalRenderPipelineCore.cs:
static RenderTextureDescriptor CreateRenderTextureDescriptor(Camera camera, float renderScale,
bool isStereoEnabled, bool isHdrEnabled, int msaaSamples, bool needsAlpha)
Just add this line before the return statement: desc.depthBufferBits = 16; // Override to 16-bit depth
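For reference, this is roughly how the end of the modified function looks with the override in place. The surrounding descriptor setup is abbreviated here; the only line I added is the override:

static RenderTextureDescriptor CreateRenderTextureDescriptor(Camera camera, float renderScale,
    bool isStereoEnabled, bool isHdrEnabled, int msaaSamples, bool needsAlpha)
{
    RenderTextureDescriptor desc;
    // ... existing URP code that fills in desc (size, color format, MSAA, etc.) ...

    desc.depthBufferBits = 16; // Override to 16-bit depth (my added line)
    return desc;
}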
This clearly works in the Editor, since I get a lot more z-fighting with large camera clipping planes, but when I use ovrgpuprofiler or RenderDoc on the device it still shows a depth buffer with 24-bit depth and 8-bit stencil.
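In case it helps anyone reproduce this, one way to see what Unity itself requests for the XR eye textures at runtime is to log the eye texture descriptor. This is just a quick diagnostic sketch (the component name and where you attach it are my own choice), not a fix:

using UnityEngine;
using UnityEngine.XR;

// Diagnostic component: logs the size and depth precision Unity requests for
// the XR eye textures, so it can be compared against what ovrgpuprofiler or
// RenderDoc reports on the device.
public class EyeDepthLogger : MonoBehaviour
{
    void Start()
    {
        RenderTextureDescriptor eyeDesc = XRSettings.eyeTextureDesc;
        Debug.Log($"XR eye texture: {eyeDesc.width}x{eyeDesc.height}, " +
                  $"depthBufferBits = {eyeDesc.depthBufferBits}");
    }
}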
I really hope someone from Oculus can chime in and help with this!
Best regards,
Tim