Getting both Touch controllers to interact with Unity UI
I have one Touch controller working with Unity UI, using the HandedInputSelector from Oculus/SampleFramework/Code/DebugUI/Scripts/HandedInputSelector.cs, but it only supports one controller at a time. I'd like to support two. What do I need to do to get both controllers to manipulate the UI?

Here is what I tried. I disabled the HandedInputSelector script, then added a second OVRInputModule to the EventSystem object that already had one. The first has its rayTransform set to the right hand, the second to the left. For the left hand I duplicated the LaserPointer and assigned the "cursor" property of the second OVRInputModule to that duplicate. The original LaserPointer has a LaserPointer script with a Cursor Visual reference to a sphere, so I duplicated the sphere as well and pointed the second LaserPointer at the second sphere.

And, nothing. No pointer comes out of the second controller and there is no interaction. What should I do to get both controllers to manipulate the UI?

I'm on Unity 2019.3.6 with version 14.0 of the Oculus asset from the Unity Asset Store (March 14th, 2020).
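One likely reason the second OVRInputModule does nothing: Unity's EventSystem only activates one input module at a time (the first enabled one it finds), so a second module on the same object is never processed. A workaround is to keep a single OVRInputModule and retarget its ray to whichever controller was used last, similar to what HandedInputSelector does but driven by actual input. This is only a sketch; the field and type names (`rayTransform`, `OVRCameraRig` anchors) are assumed to match Oculus Integration 14.0 and should be checked against your version.

```csharp
using UnityEngine;

// Sketch: one OVRInputModule, retargeted each frame to the hand whose
// index trigger was pressed most recently. Assumes the public
// OVRInputModule.rayTransform field and OVRCameraRig hand anchors
// present in the Oculus Integration samples.
public class DualHandInputSwitcher : MonoBehaviour
{
    public OVRCameraRig cameraRig;      // scene rig providing the hand anchors
    public OVRInputModule inputModule;  // the single module on the EventSystem

    void Update()
    {
        // Hand the UI ray to whichever controller the user just pulled.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch))
            inputModule.rayTransform = cameraRig.leftHandAnchor;
        else if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            inputModule.rayTransform = cameraRig.rightHandAnchor;
    }
}
```

You would also need to move the LaserPointer's cursor visual along with the ray (or keep one cursor per hand and toggle their visibility) so the visible pointer follows the active hand. Truly simultaneous pointers from both hands would require a custom input module that raycasts from two transforms per frame, which the stock OVRInputModule does not support.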
Developer Automation - How to bypass the "Controllers Required" dialog

I've spent a while setting up Quest automation as part of our build process as a sanity test of end-to-end functionality. The test works by pushing the APK to the device and then running a scene that teleports around and does a few things without needing the Touch controllers (although we had to figure out how to fake the proximity sensor, as I couldn't find adb commands to keep the screen on during the tests).

After a recent Oculus Quest update within the past few months (I was out on vacation), our automation is failing to launch the test scene in the APK because Oculus Home shows a "Controllers Required" dialog that waits for controllers to become active. I'm trying to figure out if there is a way to bypass this. The exact wording of the dialog is:

"Controllers Required"
"Please pick up your controllers and select Continue to use this app."
"Cancel" "Continue"

1. I was hoping there would be a way to disable this, similar to how we can disable the Guardian from the developer menu, but no luck. Can this be added, or is there another recommended way to get around it?
2. Is there something that can be set in the manifest (from Unity) to disable this requirement? I haven't looked into SDK changes in the Unity plugin yet and will do so. I'm hoping this isn't required, though, as that would mean I can never run older builds of our app for perf testing.

Thanks,
~Kevin
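On the keep-the-screen-on part: these are the adb broadcasts commonly circulated for Quest automation. The `com.oculus.vrpowermanager` actions are undocumented and an assumption here; they may change or stop working across OS updates, and the package/activity name below is a hypothetical placeholder.

```shell
# Assumed, undocumented Quest broadcasts -- verify on your OS version.

# Simulate the proximity sensor being covered so the headset stays awake:
adb shell am broadcast -a com.oculus.vrpowermanager.prox_close

# Launch the test build directly (replace with your own package/activity):
adb shell am start -n com.example.app/com.unity3d.player.UnityPlayerActivity

# Restore normal proximity-sensor behavior when the run finishes:
adb shell am broadcast -a com.oculus.vrpowermanager.automation_disable
```

These only address sleep/wake; whether a similar switch exists for the "Controllers Required" dialog is exactly the open question above.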
OVRPlayerController doesn't allow player to move with controllers after build

When the game is built and run on the Oculus Quest, it doesn't allow the player to move at all with the controllers. The OVRPlayerController has ForwardDirection and the OVRCameraRig in it, so I'm not sure what the issue is. I am using Unity 2018.3.5f1 and Oculus Integration 1.39 from the Unity Asset Store. Any help would be great!
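One way to narrow this down is to bypass OVRPlayerController entirely and drive the CharacterController straight from the thumbstick; if this moves the player in a build, input is fine and the problem is in the OVRPlayerController setup (e.g. its movement flags or the CharacterController), not the controllers. A minimal sketch, assuming the OVRInput API as shipped in Oculus Integration 1.39; `forwardReference` is a name introduced here for whatever transform defines "forward" (such as ForwardDirection or the CenterEyeAnchor).

```csharp
using UnityEngine;

// Diagnostic fallback: move the CharacterController directly from the
// left thumbstick, skipping OVRPlayerController's own locomotion code.
[RequireComponent(typeof(CharacterController))]
public class SimpleStickLocomotion : MonoBehaviour
{
    public Transform forwardReference; // e.g. ForwardDirection or CenterEyeAnchor
    public float speed = 2f;

    void Update()
    {
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector3 dir = forwardReference.TransformDirection(
            new Vector3(stick.x, 0f, stick.y));
        dir.y = 0f; // keep movement on the horizontal plane
        GetComponent<CharacterController>().SimpleMove(dir * speed);
    }
}
```

If even this does nothing in a build, the issue is likely input-level (e.g. the wrong target device settings or an input-blocking dialog) rather than the player controller.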
OVRInput.GetConnectedControllers() returns RTrackedRemote while only an RTouch controller is connected

I'm developing an application for the Oculus Quest with Unity 2018.4.2f1, and OVRInput was reading input from the Quest's Touch controllers just fine. Then all of a sudden, with no obvious cause, OVRInput.Get() and OVRInput.GetDown() stopped behaving correctly, messing up my control scheme.

For example, OVRInput.Get(OVRInput.Button.One), OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger), and OVRInput.Get(OVRInput.Button.PrimaryHandTrigger) all behave as expected, returning true while the A button, index trigger, and hand trigger are pressed, respectively. However, OVRInput.Get(OVRInput.Button.Two) no longer responds correctly: instead of being true while the button is held and false otherwise, pressing the button once sets the value to true permanently. OVRInput.Axis2D.PrimaryThumbstick no longer gives any data.

Digging into it a little more, it seems my Oculus Touch controller is being mistaken for an Oculus Go controller. I currently have one controller connected, the right Oculus Touch controller, which shows up fine in the headset menus. But when I check the values of OVRInput.GetActiveController() and OVRInput.GetConnectedControllers() within my Unity app, both return RTrackedRemote instead of RTouch, even though that is the only controller connected to the headset. Additionally, attaching an OVRControllerPrefab to the RightControllerAnchor of my OVRCameraRig makes the controller appear as an Oculus Go model within VR (it previously appeared correctly as an Oculus Touch model).

I've tried a few workarounds so far: using UnityEngine.Input rather than OVRInput, using the raw input mappings for the Touch controller, manually setting the active controller in OVRInput to RTouch, and overwriting the input mappings within the RTrackedRemote section of OVRInput to match those of RTouch, but none have worked. Has anyone encountered an issue like this before? Are there any workarounds/solutions I should try?
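For anyone debugging the same misdetection, a small logger makes it easy to see exactly when OVRInput's idea of the connected hardware flips. A sketch using only the OVRInput calls already named above:

```csharp
using UnityEngine;

// Diagnostic sketch: log OVRInput's reported controller types whenever
// the active controller changes, to catch the RTouch -> RTrackedRemote
// misdetection described above and correlate it with a trigger event.
public class ControllerTypeLogger : MonoBehaviour
{
    OVRInput.Controller last = OVRInput.Controller.None;

    void Update()
    {
        OVRInput.Controller active = OVRInput.GetActiveController();
        if (active != last)
        {
            last = active;
            Debug.LogFormat("Active: {0}, Connected: {1}",
                active, OVRInput.GetConnectedControllers());
        }
    }
}
```

Since GetActiveController() itself returns the wrong type here, this won't fix the mapping, but it shows whether the misdetection happens at app start or mid-session, which is useful information to attach to a bug report (a stale controller pairing is also worth ruling out by re-pairing in the companion app).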