Forum Discussion
Jyakku
11 years ago · Honored Guest
Native Rendering Plugin with Oculus Rift
I'm working on a project that offloads some rendering to a native plugin I wrote for Unity, in order to make use of instancing and other advanced graphics features. I'm developing it for a cross-platform release, but I work with a Mac so testing is done primarily with OpenGL. The plugin works as expected in a blank Unity project, but as soon as I incorporate it into my Oculus project, it begins behaving erratically.
In the Rift, the plugin's geometry draws twice: once stretched across both eyes, and once confined to the bounds of the right eye. Additionally, any primitive colors I apply to the geometry are lost, and the geometry appears to pick up surrounding colors; on a black screen with red text, the geometry is mostly black with some red bleeding into the lines. As soon as my green terrain loads, the geometry drawn by the plugin turns green.
To avoid bogging down the forum with screenshots and code samples, I've posted them externally in a StackOverflow question that you can find here: http://stackoverflow.com/questions/26594208/native-rendering-plugin-with-oculus-rift. Any insight from experienced native plugin/Rift graphics coders would be appreciated!
4 Replies
- vrdaveb (Oculus Staff): It looks like you are rendering directly to the backbuffer. Instead, try rendering to Camera.targetTexture.GetNativeTexturePtr() for each of the left and right eye cameras. You should be able to issue plugin events to do that in the OnPostRender event for each camera.
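A minimal sketch of what vrdaveb describes, attached once to each eye camera of the OVRCameraRig. The plugin export `SetTargetTexture` and the event ID are hypothetical names for illustration; the actual interface is whatever the plugin defines:

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Attach one instance of this to each eye camera of the OVRCameraRig.
public class EyePluginRender : MonoBehaviour
{
    // Hypothetical plugin export that records which GL texture to draw into.
    [DllImport("MyRenderPlugin")]
    private static extern void SetTargetTexture(IntPtr texturePtr);

    private const int kRenderEventId = 1; // must match the native side

    void OnPostRender()
    {
        // Hand this eye's render target to the plugin, then queue the actual
        // draw on the render thread via a plugin event. Calling GL directly
        // from here would run on the scripting thread, not the render thread.
        RenderTexture target = GetComponent<Camera>().targetTexture;
        if (target == null)
            return;
        SetTargetTexture(target.GetNativeTexturePtr());
        GL.IssuePluginEvent(kRenderEventId);
    }
}
```

Because the texture pointer is set from the scripting thread but consumed on the render thread, the plugin should latch it in a way that tolerates the event firing one frame later.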
- Jyakku (Honored Guest): Thanks so much for the suggestion, Dave. Unfortunately, when I pass the native texture pointer of either eye camera to the plugin in OnPostRender, Unity stops responding and I have to force-quit the application. I've verified that writing to another texture this same way works, but writing to the OVRCameraRig's eye cameras freezes Unity every time. I am using Unity 4.5.5; is this a known issue, or do you have any ideas about what I may be doing wrong?
- Jyakku (Honored Guest): Bumping because I've tried several methods with the same result, and I can't be the only person out there using a native rendering plugin with the Rift. I tried upgrading to the new OVR SDK, but still no dice. Any ideas?
- vrdaveb (Oculus Staff): Do you have a call stack for the hang? Is your code getting called? OnPostRender should work, but you might want to try modifying the textures in the OnRenderImage(..) callback instead. Watch out, though: Unity uses different RenderTextures at that point, so be careful to read and write the correct ones in your plugin. We don't do anything special with the textures between OnWillRenderObject and OnRenderImage, but remember that Unity runs the Mono scripting runtime and the graphics device on separate threads. You will need to use GL.IssuePluginEvent to queue up operations for your plugin instead of making GL calls directly from C#.
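The native half of that pattern, under Unity 4.x's low-level rendering plugin interface (Unity calls an exported `UnityRenderEvent` on the render thread for each `GL.IssuePluginEvent` queued from C#). This is a sketch, not the poster's actual code; `SetTargetTexture` is the hypothetical export from the C# example above, and the state save/restore is a guess at one plausible cause of the color-bleed symptom (GL state leaking between the plugin and Unity's own rendering):

```cpp
// Native side of a Unity 4.x low-level rendering plugin (OpenGL path).
#include <OpenGL/gl.h> // on Mac; <GL/gl.h> elsewhere

#if defined(_WIN32)
#define EXPORT_API __declspec(dllexport)
#else
#define EXPORT_API
#endif

static unsigned int g_TargetTexture = 0; // latched from the scripting thread

extern "C" void EXPORT_API SetTargetTexture(void* texturePtr)
{
    // Unity's GetNativeTexturePtr() is opaque; on the OpenGL renderer it
    // carries the GLuint texture name.
    g_TargetTexture = (unsigned int)(size_t)texturePtr;
}

extern "C" void EXPORT_API UnityRenderEvent(int eventId)
{
    if (eventId != 1 || g_TargetTexture == 0)
        return;

    // Save fixed-function state before drawing. Note glPushAttrib does NOT
    // cover the current shader program or buffer bindings; those must be
    // queried (e.g. glGetIntegerv(GL_CURRENT_PROGRAM, ...)) and restored by
    // hand, or Unity's next draw inherits the plugin's state -- a plausible
    // cause of the "geometry picks up surrounding colors" symptom.
    glPushAttrib(GL_ALL_ATTRIB_BITS);

    // ... bind an FBO with g_TargetTexture as the color attachment,
    //     bind the plugin's own program and vertex buffers, and draw ...

    glPopAttrib(); // hand Unity's fixed-function state back intact
}
```

The draw itself is elided; the point is the threading contract (all GL work inside `UnityRenderEvent`) and the symmetry of save/restore around it.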