Just started working with Oculus Touch and am having issues mapping a button to submit clicks for the UI. It doesn't play nice with the Input Manager. Anyone run into this? I'm sure I'm doing something dumb here.
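Roughly what I'm after, as a minimal sketch (the A button is just an example choice, and this assumes the Oculus Utilities OVRInput API is in the project):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Treat the Touch "A" button as the UI submit click.
public class TouchSubmit : MonoBehaviour
{
    void Update()
    {
        // OVRInput.Button.One is the A button on the right Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
        {
            var selected = EventSystem.current != null
                ? EventSystem.current.currentSelectedGameObject
                : null;
            if (selected != null)
            {
                // Fire the standard UGUI submit event on whatever is currently selected.
                ExecuteEvents.Execute(selected, new BaseEventData(EventSystem.current),
                                      ExecuteEvents.submitHandler);
            }
        }
    }
}
```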
The Utilities don't directly implement UGUI interactions. Check out https://developer3.oculus.com/blog/unitys-ui-system-in-vr/ for sample code that does. You can use OVRInputModule to enable gaze-based raycasting. Unfortunately, it hasn't been updated to support Touch yet. That will be added soon.
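For reference, the rough idea behind gaze-based UI raycasting is something like the sketch below. This is not the sample's actual code; it just shows the concept, and it assumes the camera referenced here is the VR camera assigned to your world-space Canvas.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Cast from the center of the HMD view and ask the EventSystem what UI it hits.
public class GazeUIProbe : MonoBehaviour
{
    public Camera eventCamera;   // the camera assigned to the world-space Canvas

    void Update()
    {
        var pointerData = new PointerEventData(EventSystem.current)
        {
            // Screen-center position == where the user is looking.
            position = new Vector2(eventCamera.pixelWidth * 0.5f,
                                   eventCamera.pixelHeight * 0.5f)
        };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);

        if (results.Count > 0)
        {
            // results[0].gameObject is the UI element under the gaze;
            // a full input module would drive enter/exit/click events from here.
            Debug.Log("Gazing at: " + results[0].gameObject.name);
        }
    }
}
```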
Thanks for the quick reply. I'm now trying to adapt code from my HTC Vive port of the game, which uses a laser pointer raycast to interact with the UI. Let's hope it works, heh.
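In case it helps anyone later, the laser pointer approach I'm porting looks roughly like this. It's only a sketch: the trigger button, the controller anchor it sits on (e.g. the right hand anchor under OVRCameraRig), and the assumption that UI elements have colliders are all mine, not from the Oculus sample.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Raycast from the controller, draw the beam, and send a UGUI click
// to the hit object when the index trigger is pulled.
[RequireComponent(typeof(LineRenderer))]
public class TouchLaserPointer : MonoBehaviour
{
    public float maxDistance = 10f;
    LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
    }

    void Update()
    {
        Vector3 origin = transform.position;
        Vector3 direction = transform.forward;
        Vector3 end = origin + direction * maxDistance;

        RaycastHit hit;
        if (Physics.Raycast(origin, direction, out hit, maxDistance))
        {
            end = hit.point;

            // Index trigger on the right Touch controller acts as the click.
            if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            {
                var eventData = new PointerEventData(EventSystem.current);
                ExecuteEvents.Execute(hit.collider.gameObject, eventData,
                                      ExecuteEvents.pointerClickHandler);
            }
        }

        // Draw the visible laser from the controller to the hit point (or max range).
        line.SetPosition(0, origin);
        line.SetPosition(1, end);
    }
}
```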