Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
RyanSpicer
Honored Guest
11 years ago

DirectX11 Geom. Shaders, MVP matrix, OVRCameraController?

I'm working on a project that uses compute shaders and geometry shaders to generate procedural geometry at runtime. My material uses a geometry shader and is drawn like so:

void OnRenderObject()
{
    if (!enabled)
        return;

    // Build the MVP matrix for the camera currently rendering.
    Matrix4x4 m = transform.localToWorldMatrix;
    Matrix4x4 v = Camera.current.worldToCameraMatrix;
    Matrix4x4 p = Camera.current.projectionMatrix;

    Matrix4x4 MVP = p * v * m;

    Graphics.ClearRandomWriteTargets();

    // Set the matrix before SetPass so it is bound for this draw.
    proceduralMaterial.SetMatrix("ModelViewProjection", MVP);
    proceduralMaterial.SetPass(0);

    // Draw 640*480 points; the geometry shader expands each into geometry.
    Graphics.DrawProcedural(MeshTopology.Points, 640 * 480);
    Graphics.ClearRandomWriteTargets();
}


When viewing the scene with a regular Unity camera or in the editor during preview, this works great -- I see my procedurally-generated object in the world, and it is positioned correctly. When I move the camera around in the world, the generated object remains in the correct orientation relative to the world.

As soon as I produce a build with the OVRCameraController enabled, things fall apart. It looks like the model-view-projection matrix is rotated incorrectly -- pitching and rolling the HMD seems to cause rolling and pitching of the procedurally-generated object, and I think I'm seeing the back of it rather than the front, based on the little bits of geometry I can see.

Is this a known issue, or is there a known workaround? The only thing I can think of is that the OVRCameraController is generating extra calls to OnRenderObject, or issuing draw calls with Camera.current not set to what I expect.

[edit] Hold up here, I just tried rendering the scene from two views (two Unity cameras with modified viewport rectangles), and that got "interesting" in unpleasant ways. I think I've got a general problem, not an OVR-specific problem.[/edit]

[edit 2] It looks like, for some reason, the draw issued by DrawProcedural() is executed after the end of the current camera's rendering, so Camera A's MVP matrix ends up being used when the draw happens for Camera B, and vice versa. If I add a third camera, everything's off by one in the same way (A -> B, B -> C, C -> A). Understandable given GPU pipelining, but frustrating.[/edit]

4 Replies

Replies have been turned off for this discussion
  • Okay, posting a reply here in case anyone ever comes across this kind of problem again.

    My approach was to use an OnRenderObject() method that calculates the MVP matrix for the object as seen by the current camera. That MVP matrix is then set as a Matrix parameter on a material, and Graphics.DrawProcedural() is used to draw the object. The material has a geometry shader that generates some procedural geometry and projects it with that MVP matrix -- it appears that I can't use Unity's built-ins from UnityCG.cginc in this context.

    It appears that problem #1 was that my second camera would update the matrix on the GPU before my first camera had finished drawing. I'm not sure that's the exact mechanism, but it appears to be something along those lines.

    My solution was to instantiate copies of my procedural material, one per camera. These were stored in a Dictionary in the MonoBehaviour and looked up by Camera.current during OnRenderObject().

    NEXT CHALLENGE!

    This works great with two regular Unity cameras, but behaves very strangely when viewed through the DK2 cameras. Yaw appears "correct," but the procedural mesh pitches and rolls twice as fast as the rest of the Unity world when viewed through the DK2. Also, I seem to be looking at the mesh from behind, and the colors are off. I'm guessing some of the mesh errors may be caused by https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=12406 (Compute Shaders broken in DX11).

    Would it be helpful to put together a minimum functioning example, VRDave (or other OVR folks)? I can do it but would prefer not to unless it will be used for testing.
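    For reference, the per-camera bookkeeping described above can be sketched roughly like this. This is a minimal sketch, not my exact project code; the class name, field names, and the "ModelViewProjection" property name are just illustrative:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    public class ProceduralDrawer : MonoBehaviour
    {
        public Material proceduralMaterial;   // template material (geometry shader pass 0)

        // One material instance per camera, so one camera's matrix upload
        // can't stomp on another camera's still-pending draw.
        private readonly Dictionary<Camera, Material> perCameraMaterials =
            new Dictionary<Camera, Material>();

        void OnRenderObject()
        {
            Camera cam = Camera.current;
            Material mat;
            if (!perCameraMaterials.TryGetValue(cam, out mat))
            {
                mat = new Material(proceduralMaterial);   // instance for this camera
                perCameraMaterials[cam] = mat;
            }

            Matrix4x4 mvp = cam.projectionMatrix * cam.worldToCameraMatrix
                            * transform.localToWorldMatrix;
            mat.SetMatrix("ModelViewProjection", mvp);
            mat.SetPass(0);
            Graphics.DrawProcedural(MeshTopology.Points, 640 * 480);
        }
    }
    ```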
  • OK, one further reply. It turned out to be pretty easy to make a minimum functioning test case for the bug. Please find attached an example Unity 4.5 project demonstrating it.

    Steps to reproduce:

      1. Open Assets/Scenes/TestCase.unity
      2. Observe that on pressing play, a yellow rectangle is rendered in the center of the ground plane.
      3. Import the 0.4.2 OVR Unity Integration assets (not included here to avoid any potential distribution beyond license terms)
      4. Parent OVRCameraController to the "Parent OVRCameraController here" empty GameObject.
      5. Build -- observe that the yellow rectangle is not visible.


    Not quite sure what's going on here, but any assistance would be appreciated, whether that's "you're doing something fantastically silly in your shader" or "this is actually a driver/integration bug, we'll test against this case."
  • Just wanted to leave a followup here in case anyone stumbles upon this in Google. This is not actually a bug in the OVR SDK or Unity integration. After talking with some other developers who've done procedural geometry, I determined that my error stemmed from manually calculating the MVP matrix rather than leveraging the UNITY_MATRIX_* built-ins. After reworking my code, the bug is fixed. The working approach was to pass the object's model-to-world matrix to the shader as a uniform and mul() it with UNITY_MATRIX_VP in the shader. I also switched to calling the draw from OnPostRender() on the camera, not OnRenderObject() on the placeholder object. Not sure if that change was strictly required, but it fixed the erroneous behavior.
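    In case it helps anyone, here's roughly what the working version looks like. Again a sketch only, with illustrative names (the _ObjectToWorld uniform, the class name) rather than my exact code:

    ```csharp
    using UnityEngine;

    // Attach to the camera; draws after this camera has rendered the scene,
    // so UNITY_MATRIX_VP already reflects this camera/eye.
    public class ProceduralPostDraw : MonoBehaviour
    {
        public Material proceduralMaterial;
        public Transform proceduralObject;   // the placeholder object's transform

        void OnPostRender()
        {
            // Only the model-to-world matrix is passed in; the shader computes
            // clip-space positions as mul(UNITY_MATRIX_VP, mul(_ObjectToWorld, v)).
            proceduralMaterial.SetMatrix("_ObjectToWorld",
                                         proceduralObject.localToWorldMatrix);
            proceduralMaterial.SetPass(0);
            Graphics.DrawProcedural(MeshTopology.Points, 640 * 480);
        }
    }
    ```

    Letting the shader multiply by UNITY_MATRIX_VP means Unity supplies the correct per-eye view/projection, which is exactly what my manual MVP calculation was getting wrong.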
  • Hey Ryan,

    Did you ever figure out a way to get DrawProcedural() to happen during the normal render queue? From where I sit, it doesn't seem like it would be difficult to get that buffer to render at any point in the queue; I don't see why it NEEDS to be at the very end.