Forum Discussion
Anonymous
10 years ago
How can I get the correct camera matrix in a native render plugin?
I want to mix in objects rendered by a native render plugin with Unity's own rendering, as shown below:
The white box is rendered by Unity, and the colored terrain is rendered by the native plugin. As the screenshot shows, both are at position (0, 0, 0).
When I turn on VR rendering (Oculus), the terrain is rendered in both eyes' views, but the transform of the terrain mesh is wrong: it moves with the Oculus headset's rotation. I guess the problem is caused by the view matrix.
Here is the code I used in Unity. The component is attached to a camera, and the native render plugin is called (or queued) in the OnPostRender() method.
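A minimal sketch of such a component, assuming hypothetical plugin exports SetMatrices and GetRenderEventFunc (substitute whatever entry points your plugin actually exposes):

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Attached to the camera. Hands the current view/projection matrices to the
// native plugin, then queues the plugin's render callback once Unity has
// finished rendering the frame. "RenderingPlugin", SetMatrices and
// GetRenderEventFunc are illustrative names.
public class NativeTerrainRenderer : MonoBehaviour
{
    [DllImport("RenderingPlugin")]
    private static extern void SetMatrices(float[] view, float[] projection);

    [DllImport("RenderingPlugin")]
    private static extern IntPtr GetRenderEventFunc();

    void OnPostRender()
    {
        var cam = Camera.current;
        var view = cam.worldToCameraMatrix;
        // 'false' assumes the render target is the backbuffer, not a
        // RenderTexture; this flag turns out to matter under VR.
        var proj = GL.GetGPUProjectionMatrix(cam.projectionMatrix, false);
        SetMatrices(ToArray(view), ToArray(proj));
        GL.IssuePluginEvent(GetRenderEventFunc(), 1);
    }

    private static float[] ToArray(Matrix4x4 m)
    {
        var a = new float[16];
        for (int i = 0; i < 16; i++) a[i] = m[i]; // column-major order
        return a;
    }
}
```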
So the question is: how can I render the terrain at the correct position in Oculus?
9 Replies
- vrdaveb (Oculus Staff)
Thanks, this seems to be a bug in Camera.worldToCameraMatrix. Your code looks correct and I've reported this to Unity. For now, you can construct your own view matrix using (untested):

var m = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1));
var worldToCameraMatrix = m * transform.worldToLocalMatrix;
If that doesn't work, we have some code in our utility script OVROverlay.cs that uses UnityEngine.VR.InputTracking to build up similar pose information. See https://developer.oculus.com/downloads/game-engines/1.3.0/Oculus_Utilities_for_Unity_5 .
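A rough sketch of that InputTracking-based approach, assuming the camera's parent transform is the tracking space (names and structure are illustrative, not the actual OVROverlay.cs code):

```csharp
using UnityEngine;
using UnityEngine.VR;

public static class EyeView
{
    // Builds a per-eye view matrix from the tracked eye pose rather than
    // from Camera.worldToCameraMatrix (Unity 5.x VR API).
    public static Matrix4x4 WorldToEyeMatrix(Camera cam, VRNode eye)
    {
        // Eye pose in tracking space.
        var eyePose = Matrix4x4.TRS(
            InputTracking.GetLocalPosition(eye),
            InputTracking.GetLocalRotation(eye),
            Vector3.one);

        // Tracking space -> world: assumed here to be the camera's parent.
        var trackingToWorld = cam.transform.parent != null
            ? cam.transform.parent.localToWorldMatrix
            : Matrix4x4.identity;

        // Invert to get world -> eye, then flip Z to match Unity's
        // right-handed view-matrix convention.
        var flipZ = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1));
        return flipZ * (trackingToWorld * eyePose).inverse;
    }
}
```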
- Anonymous
vrdaveb said:
Thanks, this seems to be a bug in Camera.worldToCameraMatrix. Your code looks correct and I've reported this to Unity. For now, you can construct your own view matrix using (untested):

var m = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1));
var worldToCameraMatrix = m * transform.worldToLocalMatrix;
If that doesn't work, we have some code in our utility script OVROverlay.cs that uses UnityEngine.VR.InputTracking to build up similar pose information. See https://developer.oculus.com/downloads/game-engines/1.3.0/Oculus_Utilities_for_Unity_5 .
Sorry, it looks the same as with Camera.worldToCameraMatrix.
- Anonymous
If I change GL.GetGPUProjectionMatrix(camera.projectionMatrix, false) to GL.GetGPUProjectionMatrix(camera.projectionMatrix, true), the position matches the VR rendering result. So the problem is caused by the projection matrix, not the view matrix.
- vrdaveb (Oculus Staff)
Unity must be interpreting the backbuffer in OpenGL conventions, with (0, 0) at the bottom left of the texture, but treating RenderTextures in D3D conventions, with (0, 0) at the top left. When you enable VR, Unity renders to RenderTextures for both eyes instead of rendering directly to the backbuffer. It still seems like code that works for the backbuffer should also work for Unity's automatically-managed eye buffers, so we'll continue to look at this. Is your current code working for you then?
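In code terms, the second argument to GL.GetGPUProjectionMatrix has to match the actual render target; a small illustrative helper:

```csharp
using UnityEngine;

static class GpuProjection
{
    // renderIntoTexture must be true when the camera's target is a
    // RenderTexture (as the VR eye buffers are), so that Unity bakes the
    // D3D-style flipped-Y convention into the returned matrix.
    public static Matrix4x4 Get(Camera cam, bool renderIntoTexture)
    {
        return GL.GetGPUProjectionMatrix(cam.projectionMatrix, renderIntoTexture);
    }
}
```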
- Anonymous
vrdaveb said:
Unity must be interpreting the backbuffer in OpenGL conventions, with (0, 0) at the bottom left of the texture, but treating RenderTextures in D3D conventions, with (0, 0) at the top left. When you enable VR, Unity renders to RenderTextures for both eyes instead of rendering directly to the backbuffer. It still seems like code that works for the backbuffer should also work for Unity's automatically-managed eye buffers, so we'll continue to look at this. Is your current code working for you then?
Yes, it works now; I just changed the projection matrix. Now the code is like this:
var projectionMatrix = GL.GetGPUProjectionMatrix(Camera.current.projectionMatrix, VRSettings.enabled);
- Anonymous
vrdaveb said:
Unity must be interpreting the backbuffer in OpenGL conventions, with (0, 0) at the bottom left of the texture, but treating RenderTextures in D3D conventions, with (0, 0) at the top left. When you enable VR, Unity renders to RenderTextures for both eyes instead of rendering directly to the backbuffer. It still seems like code that works for the backbuffer should also work for Unity's automatically-managed eye buffers, so we'll continue to look at this. Is your current code working for you then?
After I upgraded to Oculus Runtime 1.3 and Unity 5.3.4p1, it does not work anymore...
- Anonymous
Although the native rendering objects can be seen in Oculus, the "projection" seems not right.
You can see in the below images: if I move from far to near, the "native" terrain object looks like it shifts a little at the same time.
- cybereality (Grand Champion)
We still need to do some tests in order to reproduce this issue here. In the meantime, you may want to look at the OVROverlay.cs code, as it does similar things. Hope that helps.
- Anonymous
cybereality said:
We still need to do some tests in order to reproduce this issue here. In the meantime, you may want to look at the OVROverlay.cs code, as it does similar things. Hope that helps.
OVROverlay is a screen-space rendering effect, but the problem above is a world-space rendering problem, so I cannot find any useful information in OVROverlay.cs.