Using Oculus Quest controller for clicking on Buttons

GekaW
Honored Guest
Hi folks,

I'm currently building a prototype for a project (using Unity 2019.4.0f1). That means it doesn't have to be perfect and there are just a few functions involved.
So far I've created the environment and added a dialogue system, which pops up when a button is pressed. This works absolutely fine when I'm in Unity and try it in Play Mode. But when I try it out on my Oculus Quest, everything is shown correctly, yet I'm not able to click the button (as I did in Play Mode with the mouse).
I'm using the custom hands and am able to grab things - but it seems I'm too stupid to click buttons. My brain tells me that there is a simple solution, but right now I'm more than desperate. I hope there's someone out there who can help me. Btw, I'm a newbie and try my best - be kind.
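
For context, the button hookup itself is nothing fancy - roughly a sketch like this (DialogueTrigger and dialoguePanel are just placeholder names, not my exact script):

```csharp
using UnityEngine;

// Minimal sketch: shows a dialogue panel when a UI Button is clicked.
// ShowDialogue() is hooked up to the Button's OnClick() event in the Inspector.
public class DialogueTrigger : MonoBehaviour
{
    // Placeholder: the root GameObject of the dialogue UI (a child of the Canvas).
    [SerializeField] private GameObject dialoguePanel;

    public void ShowDialogue()
    {
        dialoguePanel.SetActive(true);
    }

    public void HideDialogue()
    {
        dialoguePanel.SetActive(false);
    }
}
```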

Thank you in advance!

I don't know if it helps, but here is my Hierarchy...
[screenshot: Hierarchy]

The button in the Inspector
[screenshot: Button in the Inspector]
...and the button in game
[screenshot: Button in game]

Fracont
Explorer
Take a look at the DebugUI scene in Oculus\SampleFramework\Usage to see how to trigger UI events with VR controllers.

rib
Protege
Unity UI input in VR is actually surprisingly difficult, so it's definitely not that "you're too stupid to click buttons"!

Unity's UI system is fundamentally designed for 2D input on a screen with something like a mouse or touch, so all the solutions for VR UI input essentially have to send Unity fake screen coordinates.

Unity uses something called an Input Module to deal with this, which is normally attached to the "EventSystem" GameObject in your scene. In addition to the Input Module, Unity's UI input also requires a raycaster on each Canvas you want to interact with (normally a "Graphics Raycaster").
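
To make that concrete, here's a purely illustrative sketch of the standard (non-VR) pairing built from code - these are the two pieces that VR solutions end up swapping out:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Illustrative only: the default (non-VR) UI input setup.
public class DefaultUISetupExample : MonoBehaviour
{
    void Start()
    {
        // The EventSystem plus an Input Module decide "where is the pointer and was it clicked".
        var eventSystemGO = new GameObject("EventSystem");
        eventSystemGO.AddComponent<EventSystem>();
        eventSystemGO.AddComponent<StandaloneInputModule>();

        // Each Canvas needs a raycaster so the Input Module can hit-test its UI elements.
        var canvasGO = new GameObject("Canvas");
        var canvas = canvasGO.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.ScreenSpaceOverlay;
        canvasGO.AddComponent<GraphicRaycaster>();
    }
}
```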

For VR UI input you generally have to replace both of these things.

The Oculus Integration asset includes one limited, older solution for this in the form of an "OVR Input Module" that you can add to your EventSystem, and an "OVR Raycaster" that you can add to each canvas (replacing the original Input Module and Graphics Raycaster).

There are then a few properties you need to make sure are set up, including setting the Event Camera of your world-space canvas to the CenterEyeAnchor from your OVRCameraRig. (The OVR Raycaster will make VR input look like pointer input from your headset screen.)
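
The Event Camera assignment can also be done from a small script if that's easier to keep track of - a sketch, assuming the CenterEyeAnchor carries the Camera component as it does in the standard OVRCameraRig:

```csharp
using UnityEngine;

// Sketch: point the world-space canvas's Event Camera at the CenterEyeAnchor camera.
// Drag the CenterEyeAnchor from the OVRCameraRig into centerEyeAnchor in the Inspector.
public class AssignEventCamera : MonoBehaviour
{
    [SerializeField] private Canvas worldSpaceCanvas;
    [SerializeField] private Transform centerEyeAnchor;

    void Awake()
    {
        // Canvas.worldCamera is what the Inspector labels "Event Camera" on a world-space canvas.
        worldSpaceCanvas.worldCamera = centerEyeAnchor.GetComponent<Camera>();
    }
}
```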

You will want to make something that points along the Z axis and attach it to one of your controllers so you can see where you are pointing; this should then be set as the "Ray Transform" on the OVR Input Module and also as the "Pointer" for the OVR Raycaster. For the Cursor property of the OVR Input Module, I just tested and found I had to write a script that implements the OVRCursor abstract class, but this can be a stub implementation that does nothing.
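
Roughly what my do-nothing cursor stub looks like - this assumes OVRCursor in your copy of the Oculus Integration declares the two abstract methods below (check OVRCursor.cs in your project, since versions differ):

```csharp
using UnityEngine;

// Do-nothing cursor so the OVR Input Module's Cursor slot can be filled.
// Assumption: OVRCursor declares exactly these two abstract methods in your
// version of the Oculus Integration - verify against OVRCursor.cs.
public class NullCursor : OVRCursor
{
    public override void SetCursorRay(Transform ray) { }

    public override void SetCursorStartDest(Vector3 start, Vector3 dest, Vector3 normal) { }
}
```

And the "something that points along the Z axis" can be as simple as a LineRenderer on a child of one of the hand anchors; that transform is then what you assign as the Ray Transform / Pointer:

```csharp
using UnityEngine;

// Very basic laser beam: draws a line along the local Z axis of whatever it is attached to
// (e.g. a child of RightHandAnchor). Assign this object's transform as the Ray Transform/Pointer.
[RequireComponent(typeof(LineRenderer))]
public class SimpleLaserPointer : MonoBehaviour
{
    [SerializeField] private float length = 5f;

    void Awake()
    {
        var line = GetComponent<LineRenderer>();
        line.useWorldSpace = false;   // positions are local, so the line follows the controller
        line.positionCount = 2;
        line.startWidth = 0.005f;
        line.endWidth = 0.005f;
        line.SetPosition(0, Vector3.zero);
        line.SetPosition(1, Vector3.forward * length);
    }
}
```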

With all that faff it should support clicking your button, but only with the 'A' button of your controller, not with the trigger or grip buttons.

There's probably an asset that handles this better than this old input module, but I'm not sure what to recommend. In my case I ended up writing a custom input module/raycaster after hitting a number of limitations in the alternatives I found. I'd be happy to share it at some point, but that's probably not going to be helpful here for now.

One asset I've tried in the past that worked fairly well is Curved UI, and maybe that's at least a better starting point than the solution from Oculus - although it's not free.

Hope some of that information helps point you in a useful direction 🙂

Fracont
Explorer
Another option could be the Microsoft Mixed Reality Toolkit (MRTK), downloadable from GitHub.
For Quest you also need MRTK-Quest.
I can't post links, sorry.

GekaW
Honored Guest
Thank you two for your answers and ideas, I really appreciate it! Thanks especially to @rib for taking the time to answer so thoroughly.
Actually, after 1.5 weeks of looking for an answer, someone came up with a super simple one, and I think it's a shame that you have to search so long for such a simple solution: besides the "OVRPlayerController" I added the "UIHelpers" prefab. I disabled my old EventSystem (as UIHelpers brings its own) and changed the Event Camera of my canvas to the CenterEyeAnchor. Additionally, I disabled the Graphic Raycaster on the canvas and added the OVR Raycaster script, with the LaserPointer from UIHelpers as its Pointer. And that's it. So simple and yet so hard to find.
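
In case it helps anyone else, here is the same wiring written out as a rough setup sketch. Normally you do all of this in the Inspector; this just documents which pieces point where. It assumes the OVR Raycaster exposes its Pointer slot as a public pointer field (it does in the Integration version I'm using, but check yours):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the UIHelpers-based wiring described above.
public class QuestUICanvasSetup : MonoBehaviour
{
    [SerializeField] private Canvas worldSpaceCanvas;    // the dialogue canvas
    [SerializeField] private Camera centerEyeCamera;     // CenterEyeAnchor from the OVRCameraRig
    [SerializeField] private GameObject laserPointer;    // LaserPointer child of the UIHelpers prefab
    [SerializeField] private GameObject oldEventSystem;  // the original EventSystem (UIHelpers brings its own)

    void Awake()
    {
        // 1. Only one EventSystem may be active, so disable the original one.
        if (oldEventSystem != null) oldEventSystem.SetActive(false);

        // 2. The world-space canvas needs the VR camera as its Event Camera.
        worldSpaceCanvas.worldCamera = centerEyeCamera;

        // 3. Swap the standard Graphic Raycaster for an OVR Raycaster.
        var graphicRaycaster = worldSpaceCanvas.GetComponent<GraphicRaycaster>();
        if (graphicRaycaster != null) graphicRaycaster.enabled = false;

        var ovrRaycaster = worldSpaceCanvas.gameObject.AddComponent<OVRRaycaster>();
        // Assumption: 'pointer' is the public field shown as "Pointer" in the Inspector.
        ovrRaycaster.pointer = laserPointer;
    }
}
```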

oskey.11
Honored Guest

Hi, could you share the code for the start button that triggers the dialogue?