Forum Discussion
sprocket
11 years ago · Honored Guest
Oculus and Unity 4.6 UI?
Hi,
I am curious whether anyone has had success combining OVR rendering with the new UI system. Specifically, I want to replicate the HUD-like UI we have, where everything is drawn in front of other 3D geometry. This is also crucial for things like crosshairs, which you want rendered at a long distance even when there is occluding geometry.
I have been unsuccessful using custom shaders with the new UI system, and the proposed solution of using two extra cameras to render the UI seems impractical and cumbersome.
Any ideas? Thanks in advance!
7 Replies
Replies have been turned off for this discussion
- Anonymous: I'm just getting started on this myself, but I'm following this post & code: http://ralphbarbagallo.com/2014/08/20/oculus-rift-world-space-cursors-for-world-space-canvases-in-unity-4-6/
- sprocket (Honored Guest): That page does not really address the rendering issue, but I solved that quite easily.
Basically, the way to get a HUD-like UI to always render on top of "real"/solid geometry is to assign a custom material to the UI element(s) in question. The shader for that material can employ the same simple trick we used with NGUI, namely assigning the shader to the overlay queue. Voila: you will have a world-space canvas with elements that always render on top of everything, with no need for additional cameras. :)
- Anonymous: Ahh, right you are, sorry; I jumped straight to uGUI in VR in general because that's the Google-fu that brought me here. Indeed, the render queue is the solution for the on-top rendering you described.
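The overlay-queue trick described above can be sketched as a minimal Unity 4.6-era fixed-function ShaderLab shader. This is only an illustration, not sprocket's actual shader: the shader name is arbitrary, and in practice you may prefer to copy Unity's built-in UI/Default shader source and change only the tags and the ZTest state.

```shaderlab
// Sketch of a uGUI-compatible shader that always draws on top of scene
// geometry: "Queue"="Overlay" makes it render after everything else,
// and "ZTest Always" ignores the depth buffer entirely.
// The shader name "Custom/UIOverlay" is made up for this example.
Shader "Custom/UIOverlay"
{
    Properties
    {
        _MainTex ("Sprite Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue"="Overlay" "IgnoreProjector"="True" "RenderType"="Transparent" }
        Lighting Off
        ZWrite Off
        ZTest Always
        Blend SrcAlpha OneMinusSrcAlpha

        Pass
        {
            // Bind vertex colours so the Graphic's tint still applies.
            BindChannels
            {
                Bind "vertex", vertex
                Bind "color", color
                Bind "texcoord", texcoord
            }
            SetTexture [_MainTex] { combine texture * primary }
        }
    }
}
```

Assign this shader to a material, then assign that material to each Image/Text component that should render on top; the rest of the canvas keeps the default behaviour.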
- sprocket (Honored Guest): No worries, the link you posted is very good, as that is something one has to deal with if you want to use view-driven input.
The render queue thing is quite obvious, yes, but the fact that you can control the UI rendering through custom materials is a bit hidden in the current uGUI docs. Even direct questions on the Unity forums did not yield any info. So although all the uGUI docs deal with things like render layers and ordering by hierarchy, it is possible to easily override all that on a per-object basis with a custom material. Which is good. :)
- pietwelve (Honored Guest): Hi,
Didn't know about the "assigning the shader to the overlay queue" trick, thanks!
My solution was to duplicate the left and right cameras, using Depth and Culling Mask. It works, but your solution seems far better.
But I have an issue; I don't know if it's related:
My world-space canvas is a child of my LeftCam. My uGUI buttons do interact with the mouse cursor, but the alignment between the two is wrong (I have to click outside of the button, not on the button).
Any clue?
Please excuse my English.
Thanks
- sprocket (Honored Guest): We have not used mouse-driven input with uGUI in the Rift yet, but if it works like NGUI (and I assume it does):
- Make sure the UI camera is the same as the one you have the canvas parented to
- Realise that the distance between the UI and the cameras has a direct impact on the size of the error, i.e. if the UI is relatively close to your viewpoint, the error will be fairly large.
A handy technique if you are building a HUD-like UI is to make it big and place it quite far from the cameras. This is where the render queue and custom shader/material come into play, as you can place the UI far off, behind all other geometry, and still have it render in front. The UI experience will then mimic real HUDs, which are often calibrated to enable focus at or near infinity.
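The "big and far" advice amounts to keeping the HUD's angular size constant: apparent size is proportional to scale divided by distance, so scaling the canvas linearly with distance keeps it the same size on screen. A minimal sketch in C#; the class and field names (`HudPlacer`, `referenceDistance`) are made up for this example, and you would attach it to the world-space canvas object.

```csharp
// Sketch: push a world-space canvas out to 'distance' units in front of
// the camera and scale it so its apparent (angular) size stays constant.
using UnityEngine;

public class HudPlacer : MonoBehaviour
{
    public Camera hudCamera;
    public float referenceDistance = 1f;   // distance the canvas was authored at
    public float distance = 50f;           // distance we actually want it at

    void LateUpdate()
    {
        Transform cam = hudCamera.transform;
        transform.position = cam.position + cam.forward * distance;
        transform.rotation = cam.rotation;
        // Scaling linearly with distance keeps the on-screen size fixed.
        transform.localScale = Vector3.one * (distance / referenceDistance);
    }
}
```

Combined with the overlay-queue shader, the canvas sits far away (helping vergence/focus) but still renders in front of everything.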
Edit: Forgot to mention: if the above does not work in your situation, a final (and powerful) solution is to write a new input module for uGUI. It seems relatively straightforward and will let you dictate exactly where input events are sent, based on your own custom raycasts or cursor tracking. There are examples and resources available on the Unity forums (4.6 open beta section).
- pietwelve (Honored Guest): Thanks a lot for your answer :-)
After adding "ZTest Always" to the UI elements' shaders, the UI now displays over everything, so I could remove my UI cameras and keep only the two native Oculus left/right cameras.
My "Event camera" is the same as the one the canvas is parented to (the left camera).
But the error (the offset between the mouse cursor and the actual button position) seems roughly constant whatever distance the canvas is from the camera... that's strange...
So, as you said, I will have to find a new input module; I don't have the skill to write one myself :-(
For now the only one I've read about is for a "world space VR cursor"; I guess it won't fit.
I'll keep searching.
Thanks again
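For reference, the custom input module sprocket mentions could look roughly like the sketch below. It is a minimal, hover-only illustration: the type and member names (`BaseInputModule`, `PointerEventData`, `FindFirstRaycast`, `HandlePointerExitAndEnter`) come from `UnityEngine.EventSystems` as shipped with Unity 4.6, but the gaze-from-screen-centre raycast is an assumption you would replace with your own cursor tracking, and click/drag handling is omitted.

```csharp
// Hypothetical minimal gaze input module for uGUI: raycasts from the
// centre of the view each frame and sends enter/exit events to whatever
// UI element the gaze lands on. Click handling is intentionally omitted.
using UnityEngine;
using UnityEngine.EventSystems;

public class GazeInputModule : BaseInputModule
{
    public override void Process()
    {
        // Build pointer data at the screen centre (the "gaze" point).
        PointerEventData pointerData = new PointerEventData(eventSystem);
        pointerData.position = new Vector2(Screen.width / 2f, Screen.height / 2f);

        // Raycast against all registered raycasters and take the nearest hit.
        eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
        RaycastResult raycast = FindFirstRaycast(m_RaycastResultCache);
        pointerData.pointerCurrentRaycast = raycast;
        m_RaycastResultCache.Clear();

        // Dispatch enter/exit events to the object under the gaze.
        HandlePointerExitAndEnter(pointerData, raycast.gameObject);
    }
}
```

Because the module does its own raycast, the mouse-to-button offset problem disappears: input goes exactly where your custom pointer says, regardless of how the stereo cameras distort the screen-space mouse position.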