Controller snaps to Hand Wrist (Simultaneous tracking of Controllers and Hands)
Hi everyone, I have an issue: I enabled simultaneous tracking for both controllers and hands, and everything works, except that the controller is stuck in the hand all the time. I'm using Unity 2022.3.52f1 and the Meta XR All-In-One SDK version 71.0.0. About the setup: both the Controllers and Hand Tracking have the Show State "Always", since I want them tracked and in the scene constantly. Under the Camera Rig, "Simultaneous Hands and Controllers" is ticked, as is "Launch simultaneous Hands and Controllers mode" further below. Body Tracking is, by the way, NOT enabled. Now, about the problem: when I hold my controller, both the hand and the controller are shown, which is fine of course. But when I let go of the controller, the virtual controller snaps to the wrist of my tracked hand and no longer tracks my real controller. When I pick it up again, the controller is tracked like nothing happened. This happens with both hands. I couldn't achieve the effect shown in this video, where both hands and controllers are constantly tracked and visible, with no snapping to the wrist, and the controller stays where I put it down: https://youtu.be/ShIeLRZbUEA?si=UWLEchRaYYu9JS7M&t=348 I really hope you know what to do here, thanks in advance for the answers! Perhaps someone from Meta?
GoodGameGomez
More than one OVRCameraRig

Hello all, I'm trying to develop a VR-based multiplayer app. I'm using the Oculus Quest and PUN2 for this. My question is: is it possible to implement more than one OVRCameraRig on the Oculus Quest? The app works perfectly in single player, but the hands are not getting synced in the multiplayer scene.
OpenXR hand-tracking + Depth API - is it possible?

Is it possible to have Unity's OpenXR Plugin, the Oculus OpenXR Plugin, and the Oculus XR Plugin in the same project somehow? I would like to use default OpenXR for hand tracking (so it's easier to port to other XR-compatible headsets), and then add Oculus XR Plugin features as a progressive enhancement. For example, using Unity XR Hands: https://docs.unity3d.com/Packages/com.unity.xr.hands@1.1/manual/index.html and the Meta Depth API as an enhancement for Quest headsets: https://github.com/oculus-samples/Unity-DepthAPI Currently they seem not to be compatible, or am I missing something?
Issue with VRC.Quest.Input.8

I am currently attempting to submit my app to the Meta Store AppLab and need some advice on the following issue. VRC.Quest.Input.1 suggests that in-app menus should be activated with the menu button. I implemented my menu so that the in-app menu appears when you do the pinch gesture with your left palm (hand tracking). However, the app was rejected, because VRC.Quest.Input.8 says that for apps supporting hand tracking, the system gestures are reserved and should not be used. Maybe I have misunderstood something? Could you please advise me on how to open my hand menu when I am not supposed to use the system gesture?
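One possible direction (a sketch, not official Meta guidance): poll a plain index-finger pinch yourself through OVRHand, and explicitly back off whenever the reserved system gesture is in progress, so your menu gesture can't be confused with it. The class name PinchMenuToggle and the menuRoot field are hypothetical; GetFingerIsPinching and IsSystemGestureInProgress are the OVRHand members the documentation quoted further down this page refers to.

```csharp
using UnityEngine;

// Sketch: toggle an in-app menu with a plain index pinch instead of the
// reserved system gesture. Field names are assumptions; wire them up in
// the Inspector.
public class PinchMenuToggle : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;   // the left OVRHandPrefab
    [SerializeField] private GameObject menuRoot; // root of the menu UI

    private bool wasPinching;

    void Update()
    {
        if (leftHand == null || !leftHand.IsTracked)
            return;

        // Suspend custom gesture processing while the reserved system
        // gesture is being performed, as the docs recommend.
        if (leftHand.IsSystemGestureInProgress)
            return;

        bool pinching = leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        if (pinching && !wasPinching) // react only on the rising edge
            menuRoot.SetActive(!menuRoot.activeSelf);
        wasPinching = pinching;
    }
}
```

Whether this passes review likely depends on the exact pose: the reserved gesture is specifically an open palm facing the headset plus a pinch, so a pinch performed with the palm facing away may be acceptable.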
Hand skeleton data startBoneId issue

Hi, I'm trying to get hand skeleton position data from a Quest 3. I'm following https://developer.oculus.com/documentation/unity/unity-handtracking/ and this video: https://www.youtube.com/watch?v=DR60_TCkmAY&t=305s But when I call handSkeleton.GetCurrentStartBoneId(), even though I assigned the hand skeleton (OVRHandPrefab) to the script explicitly, it tells me the start bone id is FullBody_start. The weird thing is that the end bone id is Hand_End, as expected. Has anyone else struggled with the same problem? Any suggestion or solution?
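A quick check worth trying (a hedged sketch; SkeletonTypeCheck is a made-up name): log which skeleton type the OVRSkeleton component actually reports at runtime. If it prints a body skeleton type rather than a hand type, the component or rig is configured for body tracking, which would explain a FullBody_start start bone id.

```csharp
using UnityEngine;

// Sketch: dump the OVRSkeleton configuration at startup to verify it is
// really set up as a hand skeleton (HandLeft/HandRight) and not a body one.
public class SkeletonTypeCheck : MonoBehaviour
{
    [SerializeField] private OVRSkeleton handSkeleton; // from OVRHandPrefab

    void Start()
    {
        Debug.Log($"Skeleton type: {handSkeleton.GetSkeletonType()}");
        Debug.Log($"Start bone: {handSkeleton.GetCurrentStartBoneId()}, " +
                  $"end bone: {handSkeleton.GetCurrentEndBoneId()}");
    }
}
```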
Skeleton physical colliders pose not updated during hand tracking

I run the HandsInteractionTrainScene on a Quest with Unity, and I find that the skeleton's physical colliders are not updated with the hand-tracking pose. I only turned on Render Physics Capsules and enabled Hand Tracking Support in OVRCameraRig. Does anyone know what's going on? You can see the footage in the attached zip file.
Hand anchors stuck

Hi. I've noticed since updating to the recent Oculus Integration that LeftHandAnchor and RightHandAnchor are stuck at 0,0,0, even though I can see the synthetic hands moving around in the editor. They move and snap to the correct position once I either grab an object or pinch my fingers together. It's as if they are stuck in an idle mode until some form of interaction is triggered. Is this a known bug? And is there any way to sort of "force update" the HandAnchor transform?
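Until the underlying cause is found, one workaround sketch (assumption-heavy: ForceHandAnchorUpdate and its serialized fields are hypothetical names you wire up in the Inspector) is to mirror the tracked hand's root pose onto the stuck anchor each frame, once OVRHand reports valid data.

```csharp
using UnityEngine;

// Sketch of a "force update": copy the tracked hand's root pose onto the
// hand anchor in LateUpdate, but only while hand data is actually valid,
// so the anchor is not dragged to 0,0,0 when tracking is lost.
public class ForceHandAnchorUpdate : MonoBehaviour
{
    [SerializeField] private OVRHand hand;         // e.g. the left OVRHandPrefab
    [SerializeField] private Transform handAnchor; // e.g. LeftHandAnchor

    void LateUpdate()
    {
        if (hand != null && hand.IsDataValid && hand.IsTracked)
        {
            handAnchor.SetPositionAndRotation(
                hand.transform.position, hand.transform.rotation);
        }
    }
}
```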
Switching between hand tracking and controller and using both at the same time

Hi, I have two questions; if anyone can help, I would really appreciate it. 1. Is there a way to switch between hand tracking and controllers using a C# script in Unity at runtime? Right now it seems to me that the only way to switch the tracking mode is to go to OVRCameraRig -> Hand Tracking Support and set the tracking mode there. I want to know how I can do this from a script. 2. Is there a way to use hand tracking and controller tracking at the same time (for example, the left hand uses a controller and the right hand uses hand tracking)? Right now, even if I set the tracking mode to both hand tracking and controller tracking, they cannot work at the same time; only one of the tracking options works at a time. Thank you.
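Regarding question 1, I don't know of a single documented "switch mode" call, but you can at least query at runtime which input is currently active with OVRInput.GetActiveController() and branch your logic on it. A minimal sketch (the class name is made up):

```csharp
using UnityEngine;

// Sketch: detect each frame whether hands or Touch controllers are the
// currently active input source and react accordingly.
public class ActiveInputWatcher : MonoBehaviour
{
    void Update()
    {
        OVRInput.Controller active = OVRInput.GetActiveController();

        if (active == OVRInput.Controller.Hands)
            Debug.Log("Hand tracking is active");
        else if (active == OVRInput.Controller.Touch)
            Debug.Log("Touch controllers are active");
    }
}
```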
Detect Hand Tracking System gesture with OVRInput

Hi there, I'm making an app that uses Hand Tracking, and I want the pause menu to appear when the user does a System Gesture with their non-dominant hand. The docs say that I should "poll Button.Start", which I have tried. This works with the Touch controller but not with hand tracking. Has anyone had luck getting this system gesture to register? I'm posting some of my code below:

public UnityEvent onUse;

void Update()
{
    // Note: the enum needs the OVRInput qualifier unless the file has a
    // "using static OVRInput;" directive at the top.
    if (OVRInput.Get(OVRInput.Button.Start))
    {
        Debug.Log("System Gesture");
        onUse.Invoke();
    }
}

Also, here is the part of the documentation that talks about the system gestures:

Check System Gestures

The system gesture is a reserved gesture that allows users to transition to the Oculus universal menu. This behavior occurs when users place their dominant hand up with an open palm towards the headset and then pinch with their index finger. When the user performs the gesture with the non-dominant hand, it triggers the Button.Start event. You can poll Button.Start to integrate any action for the button press event in your app logic. To detect the dominant hand, call the IsDominantHand property from OVRHand.cs, and to check whether the user is performing a system gesture, call the IsSystemGestureInProgress property from OVRHand.cs.

We recommend that if the IsSystemGestureInProgress property returns true, the app should provide visual feedback to the user, such as rendering the hand material with a different color or a highlight, to indicate that a system gesture is in progress. The app should also suspend any custom gesture processing while the user is performing a system gesture. This allows apps to avoid triggering a gesture-based event when the user intends to transition to the Oculus universal menu.
[Hands Tracking] Interactable tools not working with custom hands

I'm using the Oculus Hands demo scene for hand tracking (HandsInteractionTrainScene), and it works correctly. However, when I try to replace the default hands with custom hands, following this process:

- Disable the left OVRHandPrefab
- Disable the right OVRHandPrefab
- Disable the left OVRControllerPrefab
- Disable the right OVRControllerPrefab
- Add OVRCustomHandPrefab_L and OVRCustomHandPrefab_R
- In HandsManager, replace the default references to the left and right hand prefabs with the new OVRCustomHandPrefab_L and OVRCustomHandPrefab_R

I can see the custom hands working fine, but the FingerTipPokeTools and the RayTool do not appear, and therefore I cannot interact with the buttons or other interactables in the scene. 😡 What am I missing? Please refer to the screenshots for additional details.