I've been developing with Oculus Go. I noticed that any swipes on the Go touchpad (without clicking it) are triggering Input.GetMouseButtonDown(0) events in Unity.
That certainly seems unexpected to me. The documentation doesn't suggest this should be the case. This page suggests best practices for handling swipes but doesn't list them as recognized gestures. Are they officially recognized? (https://developer.oculus.com/documentation/unity/latest/concepts/unity-ovrinput/#unity-ovrinput)
I wonder if this is responsible for a lot of the finicky behavior I've experienced and that I've seen others report on the forum as well. Swiping in the Oculus Go menu and in certain apps (like Netflix) is awful because it's unpredictable and often registers as a click, which the user experiences as opening a show or movie when they were just trying to swipe through a menu. Perhaps this is why?
@imperativity thank you for the replies. It's not intuitive but I was able to work around it.
I made sure that mouse events are ignored when I build for the Go. Otherwise, leaving my mouse-event input handlers in the VR/Go build caused really inconsistent behavior, since the trigger, touchpad clicks, and touchpad swipes all fire the mousedown event too.
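For anyone hitting the same problem, here's a minimal sketch of how I'd guard the mouse handlers (the class and method names are my own, not an official API): wrap the mouse path in Unity's platform-dependent compilation directives so it's compiled out of the Android/Go build entirely.

```csharp
using UnityEngine;

public class ClickHandler : MonoBehaviour
{
    void Update()
    {
#if UNITY_EDITOR || UNITY_STANDALONE
        // Mouse input only makes sense in the Editor / desktop builds.
        // On the Go, the trigger, touchpad clicks, and swipes all fire
        // this same event, so it must be compiled out of the Go build.
        if (Input.GetMouseButtonDown(0))
        {
            HandleClick();
        }
#endif
    }

    void HandleClick()
    {
        // App-specific click logic goes here.
        Debug.Log("Click");
    }
}
```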
Now I can develop on PC (using mouse input) and test builds on the Go periodically (using OVRInput) without the two conflicting.
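To make that workflow concrete, a small helper along these lines (hypothetical name, not part of the Oculus SDK) gives one "select pressed?" query that maps to the mouse in the Editor and to explicit OVRInput buttons on the Go, so touchpad swipes (touch without a click) never count as a click:

```csharp
using UnityEngine;

public static class GoInput
{
    public static bool SelectDown()
    {
#if UNITY_EDITOR || UNITY_STANDALONE
        // PC development path: plain mouse input.
        return Input.GetMouseButtonDown(0);
#else
        // Go build path: only a real touchpad *click* or a trigger pull
        // counts. OVRInput.Touch.PrimaryTouchpad (touch/swipe) is
        // deliberately not queried here.
        return OVRInput.GetDown(OVRInput.Button.PrimaryTouchpad)
            || OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);
#endif
    }
}
```

Game code then calls `GoInput.SelectDown()` everywhere instead of touching `Input` or `OVRInput` directly, which is what keeps the two input systems from conflicting.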