Switching between hand tracking and controllers and using both at the same time
Hi, I have two questions; if anyone can help I would really appreciate it.

1- Is there a way to switch between hand tracking and controller tracking from a C# script in Unity at runtime? Right now it seems to me that the only way to switch the tracking mode is to go to OVRCameraRig -> Hand Tracking Support -> set the tracking mode. I want to know how to do this from a script.

2- Is there a way to use hand tracking and controller tracking at the same time (for example, the left hand uses a controller while the right hand uses hand tracking)? Right now, even if I set the tracking mode to both hand tracking and controller tracking, they cannot work at the same time; only one tracking option works at a time.

Thank you.
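A partial answer sketch for the first question: the project-level "Hand Tracking Support" option is a build/manifest setting, but at runtime you can poll which input method is currently active via `OVRInput.GetActiveController()` and react to it from a script. The `controllerVisuals`/`handVisuals` references below are hypothetical objects in your scene, not part of the SDK.

```csharp
using UnityEngine;

// Sketch: watch which input method is active and swap visuals accordingly.
// controllerVisuals / handVisuals are assumed scene objects you assign yourself.
public class InputModeWatcher : MonoBehaviour
{
    public GameObject controllerVisuals;
    public GameObject handVisuals;

    void Update()
    {
        // OVRInput reports Controller.Hands while hand tracking is active.
        bool handsActive = OVRInput.GetActiveController() == OVRInput.Controller.Hands;
        handVisuals.SetActive(handsActive);
        controllerVisuals.SetActive(!handsActive);
    }
}
```

This only detects the active mode; as of the Integration versions discussed here there is no public script API to force one hand onto controllers and the other onto hand tracking simultaneously.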
Oculus Go - character movement using touchpad press, in unity3d?

My end goal is to be able to move the character forward, backward, and strafing left and right by detecting where on the touchpad you are touching, and whether you are pressing the touchpad in (not just touching it). I was thinking of using something like OVRInput.Get(OVRInput.Button.PrimaryTouchpad) to see if the touchpad is pressed in, and OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad) to see where you are touching, and then setting up values that will move you in the proper direction. I am pretty new to Unity and game development, but if anyone could help me wrap my head around how to set this up, to at least get me on the right track, it would be much appreciated.
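A minimal sketch of the approach described above, combining the click check with the 2D touch position. The `player` reference and `speed` value are assumptions you would wire up yourself:

```csharp
using UnityEngine;

// Sketch: Go touchpad locomotion. Moves only while the touchpad is clicked in,
// using the touch position to pick the direction. "player" is an assumed reference.
public class TouchpadLocomotion : MonoBehaviour
{
    public Transform player;   // the body to move (assign in the Inspector)
    public float speed = 2f;   // metres per second

    void Update()
    {
        // Button.PrimaryTouchpad is the physical click, not a mere touch.
        if (OVRInput.Get(OVRInput.Button.PrimaryTouchpad))
        {
            Vector2 touch = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
            // touch.y drives forward/back, touch.x drives strafing.
            Vector3 dir = player.forward * touch.y + player.right * touch.x;
            player.position += dir * speed * Time.deltaTime;
        }
    }
}
```

Clamping `touch` to the four quadrants (or using a dead zone near the centre) tends to make the movement feel less twitchy.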
OVR Custom Hands are offset and have the wrong orientation

I am working in a team of five; we are using Unity 2019.2.17 for our game and Perforce as source control. We implemented the custom hands into our project and they worked for a while, but out of nowhere my hands started being offset from their original position. None of my teammates have this issue, and I don't see it in other Unity projects or applications either. I have tried reinstalling Unity, deleting and re-syncing the revisions of the entire project, and re-importing the Oculus Integration. What should I do to fix this problem?
Cross-platform input for Valve Index

Hi, we are using the Oculus Integration 1.42 cross-platform input for Oculus Rift and SteamVR. It works on the HTC Vive but not on the Valve Index; for example, the left thumbstick and left hand grip are not working. Are there other button mappings, or should we use Valve's own SDK? Thanks.
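One way to narrow this down is a small diagnostic script that logs the raw cross-platform values every frame, so you can see whether the Index controllers report anything at all through OVRInput before blaming the mappings. This is a debugging sketch, not a fix:

```csharp
using UnityEngine;

// Diagnostic sketch: dump the left thumbstick and grip values each frame
// to see which inputs actually arrive through OVRInput on the Index.
public class IndexInputProbe : MonoBehaviour
{
    void Update()
    {
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.LTouch);
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);
        Debug.Log($"L thumbstick: {stick}  L grip: {grip:F2}");
    }
}
```

If the values stay at zero while SteamVR's own controller test shows input, the problem is in the OVRInput/SteamVR binding layer rather than in your code.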
How to set up Local Avatar

Hi there, I'm trying to get the hand and the controller to show up properly on Go/Gear VR. I tried attaching the LocalAvatar to the OVRCameraRig and to the TrackingSpace, and also tried some things described here: https://developer.oculus.com/documentation/avatarsdk/latest/concepts/avatars-gsg-unity/ and here: https://forums.oculusvr.com/developer/discussion/49444/attaching-local-avatar-to-player-controller-in-unity Somehow the hand/controller is too high or too low, or moves on the z-axis when rotating the head. How do I get the LocalAvatar lined up properly? Thank you. Anne
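A common cause of this symptom is the avatar sitting under the wrong parent or carrying a non-zero local offset. A small sketch, assuming you drag the rig's TrackingSpace and your LocalAvatar instance into the fields yourself (both field names are assumptions, not SDK API):

```csharp
using UnityEngine;

// Sketch: parent the LocalAvatar under the rig's TrackingSpace and zero its
// local pose, so the SDK-driven hand transforms line up with head/controllers.
public class AvatarAligner : MonoBehaviour
{
    public Transform trackingSpace; // drag OVRCameraRig/TrackingSpace here
    public Transform localAvatar;   // drag the LocalAvatar instance here

    void Start()
    {
        localAvatar.SetParent(trackingSpace, worldPositionStays: false);
        localAvatar.localPosition = Vector3.zero;
        localAvatar.localRotation = Quaternion.identity;
    }
}
```

With the avatar at zero local position and rotation under TrackingSpace, any remaining offset usually comes from a stray offset on the rig itself.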
Meta XR Simulator - Left Hand does not receive controller button input

Hi! I wanted to explore automated testing of our app. I was able to get the Meta XR Simulator up and running, including a connection with a real Quest 3 and its controllers. Here is my problem: no button/touch interaction on the left controller is represented on the left hand, using either mouse-and-keyboard input in the simulator or input from the connected physical Quest controller. At the same time, all of those inputs work correctly on the right hand, both from the right controller and from mouse and keyboard. (Button/touch interactions = triggers, buttons, button touch surfaces, etc.) Tracking on the left hand (using the physical Quest controller) works correctly. To check whether something is wrong with our Unity project, I:
- tried similar interactions with the WMR Mixed Reality Portal, which has a built-in simulator: button interactions work on both hands
- tried installing the Meta XR Simulator on the old XR Interaction Toolkit (2.5.2) sample scene: same issue as in our project, the left hand's button interactions do not work while the right hand interacts with the scene just fine
I do not see any errors in the console indicating problems with the input system. I am using Unity 2022.3.14, OpenXR, and Meta XR Simulator v65 installed via the Package Manager.
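One more isolation step worth trying: query the left-hand XR device directly through Unity's low-level `UnityEngine.XR.InputDevices` API, bypassing any interaction-toolkit bindings, to see whether button state reaches Unity at all. A diagnostic sketch:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Diagnostic sketch: read the left-hand device's primary button directly,
// bypassing XRI bindings, to check whether input reaches Unity at all.
public class LeftControllerProbe : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.LeftHand, devices);
        foreach (var device in devices)
        {
            if (device.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) && pressed)
                Debug.Log($"Left primary button pressed on {device.name}");
        }
    }
}
```

If this never logs while the simulator shows the button as pressed, the input is being dropped between the OpenXR runtime and Unity, which points at the simulator rather than your project.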
OVRPlugin.GetSystemHeadsetType() doesn't return Oculus_Quest_2

I added the OVRController prefab to my project, but on Quest 2 it always shows the controllers for the original Quest. This is because OVRPlugin.GetSystemHeadsetType() always returns Oculus_Quest even when I'm using an Oculus Quest 2. I'm working with the latest version of the Oculus Integration. Does anyone have the same problem?
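As a workaround while debugging, you can log what the plugin reports and compare it against Unity's device model string, which on-device typically identifies the hardware independently of the Oculus Integration version. A small sketch:

```csharp
using UnityEngine;

// Sketch: compare the plugin's headset enum with Unity's device model string,
// as a cross-check when GetSystemHeadsetType() reports the wrong device.
public class HeadsetTypeLogger : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"OVRPlugin headset: {OVRPlugin.GetSystemHeadsetType()}");
        Debug.Log($"SystemInfo.deviceModel: {SystemInfo.deviceModel}");
    }
}
```

If `SystemInfo.deviceModel` correctly identifies the Quest 2 while the plugin does not, the enum value is likely missing from the OVRPlugin version actually bundled in the project, and re-importing a clean copy of the Integration is worth a try.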
Oculus equivalent to HTC Vive Input

I downloaded an asset from the Unity Asset Store, and the input method it tracks is the HTC Vive's. I am developing a game in Unity using the Oculus Rift. I was wondering how I could translate the following HTC Vive input to the Rift controllers:

//Reference to Input events for controllers
private Valve.VR.EVRButtonId gripButton = Valve.VR.EVRButtonId.k_EButton_Grip; //(VRInput)
private Valve.VR.EVRButtonId triggerButton = Valve.VR.EVRButtonId.k_EButton_SteamVR_Trigger; //(VRInput)

I tried using OVRInput.Button for the two above and it seemed to compile, but I wasn't able to test it, since the following did not work:

//Use this to get a consistent reference to this joystick controller
private SteamVR_Controller.Device controller { get { return SteamVR_Controller.Input((int)trackedObj.index); } } //(VRInput)
private SteamVR_TrackedObject trackedObj; //(VRInput)

Correct me if I'm wrong, but from what I've read, SteamVR_Controller.Input((int)trackedObj.index) uses trackedObj to get the integer representation of the controller, for easy access to it. So how would I use OVRInput in such a way? Thanks!
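A sketch of the translation: with OVRInput there is no tracked-object index; you query buttons by a controller enum instead. The grip corresponds to the hand trigger and the SteamVR trigger to the index trigger:

```csharp
using UnityEngine;

// Sketch: the OVRInput equivalent of the SteamVR grip/trigger reads above.
// Instead of SteamVR_Controller.Input(index), you pass a Controller enum.
public class RiftInputAdapter : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.RTouch; // or LTouch

    void Update()
    {
        // Grip ~ EVRButtonId.k_EButton_Grip
        bool gripDown = OVRInput.GetDown(OVRInput.Button.PrimaryHandTrigger, controller);
        // Trigger ~ EVRButtonId.k_EButton_SteamVR_Trigger
        bool triggerDown = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, controller);

        if (gripDown) Debug.Log("Grip pressed");
        if (triggerDown) Debug.Log("Trigger pressed");
    }
}
```

`OVRInput.Get`/`GetDown`/`GetUp` mirror SteamVR's press/press-down/press-up distinction, so the asset's event logic usually maps across one-to-one.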
Determine whether hands or controllers are currently being tracked?

Is there some sort of bool variable that determines whether hands or controllers are currently being tracked? I want the user to be handed the controllers in my virtual environment, and for something to change when that happens. Any help would be greatly appreciated; I can't find anything about this at all in the documentation or in other forums.
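A sketch of two ways to ask this at runtime: the active-controller enum, and the per-hand `OVRHand.IsTracked` property. The `leftHand` reference is an assumption; drag in a hand from the rig if your scene has one:

```csharp
using UnityEngine;

// Sketch: detect hands vs controllers. The OVRHand field is optional and
// assumed to be assigned from the camera rig in the Inspector.
public class TrackingModeCheck : MonoBehaviour
{
    public OVRHand leftHand; // optional per-hand check

    void Update()
    {
        bool usingHands = OVRInput.GetActiveController() == OVRInput.Controller.Hands;
        bool leftHandTracked = leftHand != null && leftHand.IsTracked;
        if (usingHands)
            Debug.Log($"Hand tracking active (left hand tracked: {leftHandTracked})");
    }
}
```

Watching `OVRInput.GetActiveController()` change away from `Controller.Hands` is a reasonable trigger for "the user just picked up the controllers".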