Using Oculus Quest controller for clicking on Buttons
Hi folks, I'm currently building a prototype for a project (using Unity 2019.4.0f1). That means it doesn't have to be perfect, and there are only a few functions involved. So far I've created the environment and added a dialogue system, which pops up when the button is hit. This works absolutely fine when I try it in Unity's play mode. But when I try it on my Oculus Quest, everything is shown correctly, yet I'm not able to click the button (as I did in play mode with the mouse). I'm using the custom hands and am able to grab things, but I can't figure out how to click buttons. My brain tells me there's a simple solution, but right now I'm more than desperate. I hope there's someone out there who can help me. Btw, I'm a newbie trying my best; be kind. Thank you in advance! I don't know if it helps, but here are my Hierarchy, the button in the Inspector, and the button in game.

Quest 2 controller vibration and Meta Haptics Studio
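Re: the button-clicking question above. The usual route in the Oculus Integration is the UIHelpers prefab (OVRInputModule on the EventSystem plus an OVRRaycaster on the world-space Canvas), since a Screen Space canvas that works with the mouse in play mode won't receive controller rays on device. As a simpler fallback, here is a minimal, hedged sketch that physics-raycasts from a controller anchor and invokes a uGUI Button on trigger press; it assumes you add a BoxCollider to the Button, and `rayOrigin` is whatever anchor you assign (e.g. RightHandAnchor):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: ray from a controller anchor; clicking the index trigger
// invokes any uGUI Button whose collider the ray hits.
public class ControllerButtonClicker : MonoBehaviour
{
    public Transform rayOrigin;   // assign e.g. RightHandAnchor in the Inspector
    public float maxDistance = 10f;

    void Update()
    {
        if (!OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger)) return;

        if (Physics.Raycast(rayOrigin.position, rayOrigin.forward,
                            out RaycastHit hit, maxDistance))
        {
            var button = hit.collider.GetComponent<Button>();
            if (button != null) button.onClick.Invoke();
        }
    }
}
```

This bypasses the EventSystem entirely, so hover/press visual states won't fire; the OVRInputModule route is the proper fix.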
Hi guys, I am currently using a Quest 2 and Meta Haptics Studio to design haptic feedback for the controllers. In Meta Haptics Studio, vibration amplitude and frequency range from 0 to 1, without any information about how strong they are in specific units. Could anyone please tell me how strong the vibration and frequency actually are when they are set to 0.5 or 1 in Meta Haptics Studio? Thank you!

How can I make Laser pointers (UIHelpers) be seen on both hands simultaneously?
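Re: the haptics units question above. Meta does not publish absolute units for these sliders; both values are normalized 0..1, where amplitude scales motor strength relative to the hardware maximum. The same normalized convention appears in the lower-level Unity API, where you can feel the values directly on device. A minimal sketch (assumes an OVRManager in the scene):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: play a half-amplitude buzz on the right Touch controller for 0.2 s.
// SetControllerVibration uses normalized (0..1) frequency and amplitude and
// keeps running until overwritten, so it must be stopped with zeros.
public class HapticsTest : MonoBehaviour
{
    IEnumerator Start()
    {
        OVRInput.SetControllerVibration(0.5f, 0.5f, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(0.2f);
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

For the older Rift Touch controllers, the OVRInput documentation described frequency 0.5 and 1.0 as roughly 160 Hz and 320 Hz; whether Quest 2 maps identically is not documented, so treat that as an approximation.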
I created a keyboard in my VR app, and I want to use both of my hands to type the text. Therefore, I want to enable the laser pointer on both hands simultaneously. Since the UIHelpers prefab has an EventSystem on it, which is supposed to exist as a single instance, I can't just duplicate the UIHelpers prefab. Is there a way to do it?

Can't record? Only one hand works? Rebuild your Manifest
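Re: the dual-hand laser pointer question above. The stock OVRInputModule drives a single ray, so truly simultaneous pointers need a custom input module. A common compromise is to keep one EventSystem and retarget the module's ray to whichever hand last pulled its trigger, while showing a laser visual on each hand. A hedged sketch, assuming the Oculus Integration's OVRInputModule with its public `rayTransform` field:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: single EventSystem/OVRInputModule; the active UI ray follows
// whichever hand most recently pressed its index trigger.
public class DualHandPointerSwitcher : MonoBehaviour
{
    public OVRInputModule inputModule;  // the module on the one EventSystem
    public Transform leftHandAnchor;
    public Transform rightHandAnchor;

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch))
            inputModule.rayTransform = leftHandAnchor;
        else if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            inputModule.rayTransform = rightHandAnchor;
    }
}
```

For a keyboard this usually feels fine in practice, since only one hand presses a key at any instant.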
If you're having problems since the last update, it's probably the manifest. In Unity, go to the Oculus menu, then Tools, then Remove AndroidManifest.xml. Then, also in the Tools menu, choose Create Store-Compatible AndroidManifest.xml. That has fixed those issues for me, and apparently fixes a few other things as well. Good hunting, -Chilton

How Can I GetDown RIndexTrigger On Oculus Quest?
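Re: the manifest-rebuild tip above. The same two menu steps can be scripted, e.g. for a build pipeline, via `EditorApplication.ExecuteMenuItem`. The exact menu path strings below are assumptions; they change between Oculus Integration versions, so copy them verbatim from your own Oculus > Tools menu:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Sketch: run the Oculus manifest rebuild from a single editor menu item.
// NOTE: the "Oculus/Tools/..." path strings are version-dependent assumptions.
public static class ManifestRebuilder
{
    [MenuItem("Tools/Rebuild Oculus Manifest")]
    public static void Rebuild()
    {
        EditorApplication.ExecuteMenuItem("Oculus/Tools/Remove AndroidManifest.xml");
        EditorApplication.ExecuteMenuItem("Oculus/Tools/Create store-compatible AndroidManifest.xml");
    }
}
#endif
```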
Hello~ XD I want to use the trigger button on the Oculus Quest controller, but I don't know how to use the trigger action. I've tried various OVRInput APIs, e.g.:
- OVRInput.GetDown(OVRInput.RawButton.LIndexTrigger)
- OVRInput.GetDown(OVRInput.RawButton.RIndexTrigger)
- OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger)
- OVRInput.GetDown(OVRInput.Button.SecondaryIndexTrigger)
...etc. Can anyone share how the trigger action works on Quest? Thank you for everything.

Controllers position resetting
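Re: the RIndexTrigger question above. Those calls are essentially correct; the common gotchas are that OVRInput needs an OVRManager in the scene (it calls OVRInput.Update for you) and that GetDown must be polled from a MonoBehaviour's Update. With the default Touch controller pair, `SecondaryIndexTrigger` is the right hand. A minimal sketch:

```csharp
using UnityEngine;

// Sketch: poll the right index trigger each frame.
// Requires an OVRManager in the scene (e.g. via OVRCameraRig).
public class TriggerProbe : MonoBehaviour
{
    void Update()
    {
        // True only on the frame the right index trigger goes down.
        if (OVRInput.GetDown(OVRInput.Button.SecondaryIndexTrigger))
            Debug.Log("Right index trigger pressed");

        // Analog read: 0..1 squeeze amount of the right trigger.
        float squeeze = OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);
        if (squeeze > 0.9f)
            Debug.Log("Right trigger fully squeezed");
    }
}
```

Equivalently, `OVRInput.GetDown(OVRInput.RawButton.RIndexTrigger)` targets the physical right trigger regardless of handedness mapping.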
I am having a very strange error that started recently. To test it, I made a very simple Unity test scene with just an OVRPlayerController, a plane to stand on, and a cube attached to each of the LeftControllerAnchor and RightControllerAnchor. Before Friday (June 28th, 2019), this setup worked exactly as expected: the two cubes followed the positions of the controllers, and the joysticks moved the player around. After Friday, only one controller is visible at startup, and when you press any button on the other controller, that controller becomes visible and tracks correctly. I've also discovered that the other controller has teleported to the position of the player controller, so it hasn't disappeared; it's just in front of the near clipping plane. If you move your head back far enough you can see it, but it doesn't move. I'm at a loss for what is happening. Everything was working fine before Friday.

Grabbing not working
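Re: the resetting-controllers question above. A controller parked at the rig origin until its first button press suggests it is connected but not yet position-tracked. A small diagnostic sketch to log both states per hand and narrow down when tracking actually starts:

```csharp
using UnityEngine;

// Sketch: log connection and position-tracking state for each Touch
// controller, to distinguish "not connected" from "connected but untracked".
public class ControllerTrackingProbe : MonoBehaviour
{
    void Update()
    {
        Debug.Log(
            $"L connected={OVRInput.IsControllerConnected(OVRInput.Controller.LTouch)} " +
            $"tracked={OVRInput.GetControllerPositionTracked(OVRInput.Controller.LTouch)} | " +
            $"R connected={OVRInput.IsControllerConnected(OVRInput.Controller.RTouch)} " +
            $"tracked={OVRInput.GetControllerPositionTracked(OVRInput.Controller.RTouch)}");
    }
}
```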
I'm having a lot of trouble getting grabbing to work in the demos. I have a few issues:
1. I can only seem to have one "active" controller. When I press buttons on my right controller, it pops to my side and works as normal, but the left one disappears back to where it was "anchored" in the scene. I can never have them both at my sides functioning.
2. Finger movements and grabbing don't work. I have only gotten grabbing to work by using a DistanceGrabber and changing the Grabber script to L/R Touch Tracker (the hand script is just L/R Touch). Even then, the hand doesn't play any animations when grabbing.
I've followed a bunch of tutorials, even a very recent one (https://www.youtube.com/watch?v=weL4aRe1FRk). They all seem to have hands working pretty well out of the box, at least the intricate finger tracking. I'm using 2019.1.10f1 with all recommended build settings. Can anyone help me out?

How to manually set up Hands/Controllers?
I'm trying to create my own player prefab where the hands and controllers are referenced and can be individually toggled on and off. Right now, the hand animations don't line up with the controller, compared to how it works perfectly when enabling 'Start with Controllers' on OVRAvatar. I'm trying to figure it out, but it's not easy with Oculus' implementation. Oculus only sets up the hands and controllers in the headset, and since this is for Quest, I can't see this in the editor. I can't find any documentation or posts on how to do this.

[Unity] Reproducing controller system from main Oculus menu
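Re: the manual hands/controllers question above. One workable pattern is to skip OVRAvatar's automatic spawning and instead parent your own hand and controller visuals under the OVRCameraRig's LeftHandAnchor/RightHandAnchor (so positions line up by construction), then toggle them from a script. A hedged sketch with hypothetical field names:

```csharp
using UnityEngine;

// Sketch (field names are illustrative): explicit references to hand and
// controller visuals, each parented under its hand anchor, toggled
// independently instead of relying on OVRAvatar's runtime setup.
public class RigVisualToggler : MonoBehaviour
{
    public GameObject leftHand, rightHand;             // custom hand meshes
    public GameObject leftController, rightController; // controller models

    public void ShowHands(bool show)
    {
        leftHand.SetActive(show);
        rightHand.SetActive(show);
    }

    public void ShowControllers(bool show)
    {
        leftController.SetActive(show);
        rightController.SetActive(show);
    }
}
```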
Hi, in the Oculus Quest menu, any button press switches the main active controller (left or right). I would like to implement the same system in my Unity application. I've tried using OVRInput.GetConnectedControllers() and OVRInput.GetActiveController(), but both returned the "Touch" value. Nevertheless, I have set each controller prefab's OVRTrackedRemote script to LTouch and RTouch, so I would expect those methods to return at least one of those values. I've also tried OVRPlugin.GetDominantHand(), and the result was always the right hand. Has anyone here managed to reproduce in Unity how the controllers work in the Oculus menu? Thanks in advance.
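Re: the question above. `Touch` is the expected return value when both Touch controllers are connected (it is the combined pair), so neither call will ever report a single hand in that state. To mimic the system menu, track the last-used controller yourself by watching per-hand button presses. A minimal sketch:

```csharp
using UnityEngine;

// Sketch: GetActiveController() returns Touch while both controllers are
// connected, so we track "last used" manually via per-hand button presses.
public class DominantControllerTracker : MonoBehaviour
{
    public OVRInput.Controller Active { get; private set; } = OVRInput.Controller.RTouch;

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.Any, OVRInput.Controller.LTouch))
            Active = OVRInput.Controller.LTouch;
        else if (OVRInput.GetDown(OVRInput.Button.Any, OVRInput.Controller.RTouch))
            Active = OVRInput.Controller.RTouch;
    }
}
```

Other scripts (e.g. a laser pointer) can then read `Active` to decide which hand owns the UI ray.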