Is it possible to use hand tracking with just one hand?
Hi everyone! I’ve been experimenting with ways to give players a more lifelike, immersive experience. After attending a gesture recognition session yesterday, I started thinking about removing the hand controllers entirely, but there’s a problem: hand tracking alone gives players no way to move around, and my game includes boss fights that require precise movement to dodge attacks. So I’m wondering: is it currently possible to have one hand use gesture tracking while the other hand uses a controller joystick for movement?
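
A split setup like this looks feasible in principle: recent Meta XR SDK versions advertise concurrent hands-and-controllers tracking (worth confirming for your specific SDK and OS version), and OVRInput and OVRHand can be polled in the same frame. Below is a minimal Unity sketch of the idea, assuming an OVRCameraRig with an OVRHand component on the left hand anchor; the class name, CharacterController wiring, and moveSpeed value are illustrative, not Meta SDK API.

```csharp
using UnityEngine;

// Sketch: smooth locomotion from the right Touch controller while the left
// hand stays on camera-based hand tracking. Assumes concurrent hands and
// controllers are enabled for the project; class name, moveSpeed, and the
// CharacterController wiring are illustrative, not Meta SDK API.
public class MixedInputLocomotion : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;          // gesture hand
    [SerializeField] private CharacterController body;  // player body to move
    [SerializeField] private float moveSpeed = 3f;      // metres per second

    void Update()
    {
        // Locomotion: right controller thumbstick, only while it is connected.
        if (OVRInput.IsControllerConnected(OVRInput.Controller.RTouch))
        {
            Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick,
                                         OVRInput.Controller.RTouch);
            // Move relative to the player rig's facing; swap in head yaw if
            // head-relative movement feels better for dodging.
            Vector3 dir = transform.TransformDirection(new Vector3(stick.x, 0f, stick.y));
            body.SimpleMove(dir * moveSpeed);
        }

        // Gestures: tracked left hand, e.g. an index pinch as the action input.
        if (leftHand != null && leftHand.IsTracked &&
            leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            // Trigger the gesture-driven action here.
        }
    }
}
```

Note that the controller branch simply no-ops while the right controller is set down and disconnected, so the scheme degrades gracefully to pure hand tracking.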

How to Improve Hand Tracking Pinch Accuracy

I am creating a system that uses pinch gestures between the thumb and each finger, but the pinch accuracy is not good. In particular, pinches with the ring finger and the pinky are unreliable, and the middle-finger and ring-finger pinches often register simultaneously. Please let me know if you have any suggestions for improvement.
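
One mitigation worth trying is to resolve pinches yourself from the per-finger pinch strengths rather than relying on the boolean pinch events: require the strongest finger to beat the runner-up by a margin, and release at a lower threshold than you engage (hysteresis). A sketch follows, assuming an OVRHand reference; all thresholds are illustrative starting points, not SDK defaults.

```csharp
using UnityEngine;

// Sketch: disambiguate neighbouring-finger pinches by requiring the winning
// finger's pinch strength to beat the runner-up by a margin, with hysteresis
// so a pinch releases at a lower threshold than it engages.
public class PinchDisambiguator : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private float engageThreshold = 0.85f;
    [SerializeField] private float releaseThreshold = 0.6f;
    [SerializeField] private float margin = 0.2f;  // lead over the runner-up

    private static readonly OVRHand.HandFinger[] Fingers =
    {
        OVRHand.HandFinger.Index, OVRHand.HandFinger.Middle,
        OVRHand.HandFinger.Ring,  OVRHand.HandFinger.Pinky
    };

    private OVRHand.HandFinger? active;  // finger currently "latched"

    void Update()
    {
        if (hand == null || !hand.IsTracked) { active = null; return; }

        // Release the latched finger only when it drops below the lower bound.
        if (active.HasValue)
        {
            if (hand.GetFingerPinchStrength(active.Value) < releaseThreshold)
                active = null;
            return;
        }

        // Engage: the strongest finger must clear both the threshold and the
        // runner-up's strength by the configured margin.
        OVRHand.HandFinger best = Fingers[0];
        float bestV = 0f, secondV = 0f;
        foreach (var f in Fingers)
        {
            float v = hand.GetFingerPinchStrength(f);
            if (v > bestV) { secondV = bestV; bestV = v; best = f; }
            else if (v > secondV) { secondV = v; }
        }
        if (bestV >= engageThreshold && bestV - secondV >= margin)
            active = best;  // e.g. fire the ring-finger action here
    }
}
```

The margin test is what suppresses the middle/ring double-fire: only one finger can hold the lead at a time, so adjacent fingers with similar strengths never engage together.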

How to achieve recognition of a set of dynamic gesture movements rather than individual actions

Hello everyone, I’m wondering how to record a complete set of gesture movements for the Quest 3 to recognize, for example detecting when I stretch out my palm and wave, rather than only recognizing my palm held still in front of me. Could anyone help me with this? Thank you so much.

To be honest, the Quest 3’s gesture interaction experience has left me a bit frustrated. I have to rely on my thumb and index finger for extended periods to perform operations, like making a fist and extending my index finger to click, or pinching the objects or interfaces I want with my thumb and index finger. It was okay at first, but after prolonged use this puts significant strain on my fingers, and it isn’t smooth enough to integrate fully into my daily life.

So I want to make some attempts. I hope the Quest 3’s gesture interaction can become as smooth as an excellent web front end, seamlessly integrating into daily life like the interactions shown in AR concept videos, rather than just putting a few virtual screens in front of you. At the very least, it shouldn’t keep straining my index finger; my fingers are really sore. If you share the same thoughts or want to discuss this, feel free to email luoyiheng2005@outlook.com. I look forward to your emails.
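
On the wave-detection part of the question above: one common approach is to layer a little motion logic on top of the static pose data the SDK already provides, detecting the open palm with the usual pose check and then counting sideways direction reversals of the hand anchor within a time window. A sketch follows, assuming the hand anchor Transform from an OVRCameraRig; the speed, reversal count, and window values are illustrative.

```csharp
using UnityEngine;

// Sketch: detect a "wave" as repeated sideways direction reversals of the
// hand anchor within a short window, rather than as a static pose. Pair this
// with an open-palm pose check if you only want palm-extended waves.
public class WaveDetector : MonoBehaviour
{
    [SerializeField] private Transform handAnchor;
    [SerializeField] private float minSpeed = 0.4f;    // m/s to count as motion
    [SerializeField] private int reversalsNeeded = 3;  // left-right-left...
    [SerializeField] private float window = 1.5f;      // seconds

    private Vector3 lastPos;
    private float lastSign;
    private int reversals;
    private float windowStart;
    private bool initialized;

    void Update()
    {
        Vector3 pos = handAnchor.position;
        if (!initialized) { lastPos = pos; initialized = true; return; }

        // Sideways speed relative to the head's right axis.
        float lateral = Vector3.Dot((pos - lastPos) / Time.deltaTime,
                                    Camera.main.transform.right);
        lastPos = pos;
        if (Mathf.Abs(lateral) < minSpeed) return;  // too slow, ignore

        float sign = Mathf.Sign(lateral);
        if (sign == lastSign) return;               // still the same stroke
        lastSign = sign;

        // Expire stale reversals, then count this one.
        if (Time.time - windowStart > window) { reversals = 0; windowStart = Time.time; }
        if (++reversals >= reversalsNeeded)
        {
            Debug.Log("Wave detected");
            reversals = 0;
        }
    }
}
```

For richer gesture sets, the same idea generalizes to a small state machine, or a lightweight classifier over a sliding window of joint positions.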

System Menu appears on finger pinch – how to disable it?

Hi everyone, I’m developing a standalone VR app for Meta Quest 3 that uses hand tracking (no controllers). However, when users pinch their thumb and index finger, the System Menu unexpectedly appears and interrupts the experience. This seems to be the default system gesture for opening the Meta Menu in hand-tracking mode. For a VR application this is very uncomfortable and distracting, especially when users accidentally trigger the system UI during normal interaction. I’d like to ask:

1. Is there any official way or API to disable or suppress the system menu gestures (like the pinch or palm-up menu)?
2. Can this be done at runtime through Unity / OpenXR / Meta SDK settings?
3. If not, are there enterprise or MDM settings (e.g., ManageXR, ArborXR, etc.) that allow disabling these gestures?

Any confirmed solution or workaround would be greatly appreciated, even a partial one (for example, disabling only the pinch gesture while keeping hand input active). Thanks in advance!

Naetib

[Observation] [Quest OS v81.1034] Wrist Meta Button option missing – behavior differs from doc

After updating to Quest OS v81.1034 (Nov 05, 2025), I noticed that the Wrist Meta Button setting has been removed (or hidden) from the Advanced Settings menu. The official Wrist Buttons design doc is still available and unchanged, so I’m hoping this might be part of an ongoing revision or bug fix.

Before this update, the feature had a noticeable inconsistency with the official documentation. According to the design doc, the two-hand interaction (tapping or pressing the wrist button with the opposite hand’s index finger) is the intended default, while the single-hand pinch gesture is an accessibility fallback, to be used only when it’s not comfortable or possible to use both hands. In actual behavior, however, the OS triggered the System Menu with a pinch even when both hands were tracked, making the fallback gesture always active instead of conditional.

I’m hoping the removal of the Wrist Meta Button setting in v81.1034 is related to fixing or revising this behavior. If the change is intentional, it would be great to have an update or clarification in the documentation or release notes, since the feature had strong potential for accessibility and kiosk/demo experiences.

Environment:
Device: Meta Quest 3
OS Version: v81.1034 (Nov 05, 2025)
Hand Tracking: ON
Controllers: OFF

Reference: previous discussion and bug report about pinch behavior: System Menu appears on finger pinch – how to disable it?

How to show hands while using controller input to grab objects (Meta All-in-One SDK)

Hi, I'm currently developing a VR experience using Unity 2022.3.62f1 and the Meta All-in-One SDK (v78.0) in which users physically hold controllers but see hands instead of controllers, so that interactions appear to be done with hands even though the input comes from the controllers.

To support this, I've enabled controllerDrivenHandPosesType so that the virtual hands animate based on controller input. Additionally, I configured the controllerButtonUsage property of the GripButtonSelector inside the ControllerGrabInteractor (under [BuildingBlock] OVRInteractionComprehensive > Left/Right Interactions) to use the Primary Button, allowing users to grab objects via a button press.

However, in this setup the interaction still seems to be treated as hand-tracking-based. As a result, the HandGrabInteractable component (set via [BuildingBlock] HandGrabInstallationRoutine) follows the Pinch Grab Rules, so the user must still pinch specific fingers to grab an object even though they're using controller input.

What I’m trying to achieve:
- Show hands instead of controllers
- Allow grabbing with a controller button (e.g., Primary Button)
- Disable or bypass the pinch gesture requirements

My question: is there a supported way to simulate hand-based interactions visually but use controller buttons to grab, without requiring pinch gestures? If there's a recommended approach, workaround, or best practice for achieving this with the Meta All-in-One SDK, I would greatly appreciate any guidance. Thanks in advance!
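
Pending an official answer, one blunt workaround is to take the Interaction SDK's grab logic out of the loop for these objects and poll the controller directly: grab whatever is within reach while the grip button is held, keeping the controller-driven hand visuals as described above. A sketch follows; the Grabbable tag, grab radius, and kinematic handling are illustrative simplifications, and this bypasses HandGrabInteractable entirely rather than reconfiguring it.

```csharp
using UnityEngine;

// Sketch of a workaround that bypasses the Interaction SDK pinch rules:
// poll the controller grip button with OVRInput and parent the nearest
// grabbable to the hand anchor while the button is held. The "Grabbable"
// tag and grab radius are illustrative; the visual hands can still be
// animated via controllerDrivenHandPosesType.
public class ButtonGrab : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;
    [SerializeField] private Transform handAnchor;
    [SerializeField] private float grabRadius = 0.08f;

    private Rigidbody held;

    void Update()
    {
        bool pressed = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, controller);

        if (pressed && held == null)
        {
            // Pick up the first tagged rigidbody within reach of the hand.
            foreach (var c in Physics.OverlapSphere(handAnchor.position, grabRadius))
            {
                if (!c.CompareTag("Grabbable") || c.attachedRigidbody == null) continue;
                held = c.attachedRigidbody;
                held.isKinematic = true;  // simplification: assumes non-kinematic props
                held.transform.SetParent(handAnchor, worldPositionStays: true);
                break;
            }
        }
        else if (!pressed && held != null)
        {
            // Release: unparent and restore physics.
            held.transform.SetParent(null);
            held.isKinematic = false;
            held = null;
        }
    }
}
```

The trade-off is losing the SDK's grab poses and surface snapping, so this fits best for props where a simple attach/detach is acceptable.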

Retargeting Configuration Editor no longer works (Solved)

Following the directions for Body Tracking in the Movement SDK, everything works fine until you reach this section: https://developers.meta.com/horizon/documentation/unity/move-body-tracking/#import-a-character

I am using a MakeHuman character with a game-engine rig (basic rig); the settings appear correct per the videos and documentation. On the next screen, however, none of the preview sequences work, and several errors are thrown whenever the mouse moves across the 3D model view.

Errors:

1. IndexOutOfRangeException: Index -1 is out of range of '195' Length.
Unity.Collections.NativeArray`1[T].FailOutOfRangeError (System.Int32 index) (at <7b8172fcdd864e17924794813da71712>:0)
Meta.XR.Movement.Retargeting.SkeletonRetargeter.Update (Unity.Collections.NativeArray`1[T] sourcePose, System.String manifestation) (at ./Library/PackageCache/com.meta.xr.sdk.movement@b0553630bc45/Runtime/Native/Scripts/Retargeting/SkeletonRetargeter.cs:405)
Meta.XR.Movement.Retargeting.CharacterRetargeter.CalculatePose (Unity.Collections.NativeArray`1[T] sourcePose) (at ./Library/PackageCache/com.meta.xr.sdk.movement@b0553630bc45/Runtime/Native/Scripts/Retargeting/CharacterRetargeter.cs:292)
Meta.XR.Movement.Editor.MSDKUtilityEditorPreviewer.DrawPreviewCharacter () (at ./Library/PackageCache/com.meta.xr.sdk.movement@b0553630bc45/Editor/Native/Scripts/MSDKUtilityEditorPreviewer.cs:217)
Meta.XR.Movement.Editor.MSDKUtilityEditorWindow.UpdateSkeletonDraws () (at ./Library/PackageCache/com.meta.xr.sdk.movement@b0553630bc45/Editor/Native/Scripts/MSDKUtilityEditorWindow.cs:1248)
Meta.XR.Movement.Editor.MSDKUtilityEditorWindow.OnSceneGUI (UnityEditor.SceneView sceneView) (at ./Library/PackageCache/com.meta.xr.sdk.movement@b0553630bc45/Editor/Native/Scripts/MSDKUtilityEditorWindow.cs:517)
UnityEditor.SceneView.CallOnSceneGUI () (at <8081513dc2364383b8289d30d2169b2e>:0)
UnityEditor.SceneView.HandleSelectionAndOnSceneGUI () (at <8081513dc2364383b8289d30d2169b2e>:0)
UnityEditor.SceneView.DoOnGUI () (at <8081513dc2364383b8289d30d2169b2e>:0)
UnityEditor.SceneView.OnSceneGUI () (at <8081513dc2364383b8289d30d2169b2e>:0)
UnityEngine.UIElements.IMGUIContainer.DoOnGUI (UnityEngine.Event evt, UnityEngine.Matrix4x4 parentTransform, UnityEngine.Rect clippingRect, System.Boolean isComputingLayout, UnityEngine.Rect layoutSize, System.Action onGUIHandler, System.Boolean canAffectFocus) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.IMGUIContainer.HandleIMGUIEvent (UnityEngine.Event e, UnityEngine.Matrix4x4 worldTransform, UnityEngine.Rect clippingRect, System.Action onGUIHandler, System.Boolean canAffectFocus) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.IMGUIContainer.DoIMGUIRepaint () (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIR.RenderChainCommand.ExecuteNonDrawMesh (UnityEngine.UIElements.UIR.DrawParams drawParams, System.Single pixelsPerPoint, System.Exception& immediateException) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
Rethrow as ImmediateModeException
UnityEngine.UIElements.UIR.RenderTreeManager.RenderSingleTree (UnityEngine.UIElements.UIR.RenderTree renderTree, UnityEngine.RenderTexture nestedTreeRT, UnityEngine.RectInt nestedTreeViewport) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIR.RenderTreeManager.RenderRootTree () (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIRRepaintUpdater.Render () (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.BaseVisualElementPanel.Render () (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.Panel.Render () (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIElementsUtility.DoDispatch (UnityEngine.UIElements.BaseVisualElementPanel panel) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIElementsUtility.UnityEngine.UIElements.IUIElementsUtility.ProcessEvent (System.Int32 instanceID, System.IntPtr nativeEventPtr, System.Boolean& eventHandled) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIEventRegistration.ProcessEvent (System.Int32 instanceID, System.IntPtr nativeEventPtr) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.UIElements.UIEventRegistration+<>c.<.cctor>b__1_2 (System.Int32 i, System.IntPtr ptr) (at <58affde3b6cc47f39fa7e8b94d5890c0>:0)
UnityEngine.GUIUtility.ProcessEvent (System.Int32 instanceID, System.IntPtr nativeEventPtr, System.Boolean& result) (at <9fed903c750c40ad88e137acb27455b3>:0)

2. GUI Error: You are pushing more GUIClips than you are popping. Make sure they are balanced.
UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)

These errors persist after closing the configuration editor. Thank you for reading, and for all of these components and the hard work put into making a great developer experience!