'A' button on right Quest 2 controller has lag
Whenever I press the A button on my right Quest 2 controller, there is roughly a one-second delay between the press and the system registering it. I have tried changing the batteries, both rechargeable and non-rechargeable, but it makes no difference. Has anyone else experienced this, and how did you fix it? I'm using Unity, in case that's relevant.
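For reference, this is roughly how I'm reading the button on the Unity side; a minimal sketch using OVRInput from the Meta XR SDK, with a timestamp added only so I can measure where the delay happens (the class name is just mine):

```csharp
using UnityEngine;

// Minimal diagnostic: log when the A button (OVRInput.Button.One on the right
// Touch controller) is seen as pressed, with a timestamp, to check whether the
// lag occurs before or after OVRInput reports the press.
public class AButtonLatencyProbe : MonoBehaviour
{
    void Update()
    {
        // GetDown fires on the frame the button transitions to pressed.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
        {
            Debug.Log($"A pressed at {Time.realtimeSinceStartup:F3}s (frame {Time.frameCount})");
        }
    }
}
```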
How to show hands while using controller input to grab objects (Meta All-in-One SDK)
Hi, I'm currently developing a VR experience with Unity (2022.3.62f1) and the Meta All-in-One SDK (v78.0). Users physically hold controllers, but hands are rendered instead of controller models, so interactions appear to be done with hands even though the input comes from the controllers.

To support this, I've enabled controllerDrivenHandPosesType so that the virtual hands animate based on controller input. Additionally, I configured the controllerButtonUsage property of the GripButtonSelector inside the ControllerGrabInteractor (under [BuildingBlock] OVRInteractionComprehensive > Left/Right Interactions) to use the Primary Button, allowing users to grab objects via a button press.

However, in this setup the interaction still seems to be treated as hand-tracking-based. As a result, the HandGrabInteractable component (set up via [BuildingBlock] HandGrabInstallationRoutine) follows the Pinch Grab Rules, so the user must still pinch specific fingers to grab an object even though they're using controller input.

What I'm trying to achieve:
- Show hands instead of controllers
- Allow grabbing with a controller button (e.g., Primary Button)
- Disable or bypass the pinch gesture requirement

My question: is there a supported way to make interactions look hand-based while using controller buttons to grab, without requiring pinch gestures? If there's a recommended approach, workaround, or best practice for this with the Meta All-in-One SDK, I would greatly appreciate any guidance. Thanks in advance!
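For context, one direction I've been experimenting with is a small selector driven by a controller button instead of a pinch. This is only a rough sketch under my own assumptions: the class name and button mapping are mine, it implements the Oculus.Interaction.ISelector interface, and I'm not certain this is the supported way to drive HandGrab:

```csharp
using System;
using UnityEngine;
using Oculus.Interaction;

// Rough sketch: a selector that raises select/unselect from a controller button
// press instead of a pinch. Class name and button choice are my own; wiring this
// into the HandGrab pipeline may require more than simply swapping selectors.
public class ControllerButtonSelector : MonoBehaviour, ISelector
{
    [SerializeField]
    private OVRInput.Button _button = OVRInput.Button.One; // A / X button

    [SerializeField]
    private OVRInput.Controller _controller = OVRInput.Controller.RTouch;

    public event Action WhenSelected = delegate { };
    public event Action WhenUnselected = delegate { };

    private void Update()
    {
        if (OVRInput.GetDown(_button, _controller))
        {
            WhenSelected();
        }
        else if (OVRInput.GetUp(_button, _controller))
        {
            WhenUnselected();
        }
    }
}
```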
XR_SESSION_STATE_FOCUSED is not triggered
Hello! I have the following problem. In an application running on a Meta Quest 2 headset, opening a system menu (for example the Resume/Quit dialog via the "o" button, or an in-app system window) triggers the HandleSessionStateChangedEvent: state XR_SESSION_STATE_FOCUSED->XR_SESSION_STATE_VISIBLE event as expected. However, when the system menu is closed and control returns to the application, the OpenXRSession::HandleSessionStateChangedEvent: state XR_SESSION_STATE_VISIBLE->XR_SESSION_STATE_FOCUSED event is no longer triggered. As a result, hand tracking stops working. I'm using the latest OpenXR library in Unity3D. Best regards!
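To narrow it down, I added a small focus logger on the Unity side; a minimal sketch that only logs standard Unity focus/pause callbacks (it does not try to restart hand tracking), so I can see whether the engine itself ever regains focus when the system menu closes:

```csharp
using UnityEngine;

// Diagnostic: log Unity's application focus and pause callbacks alongside the
// OpenXR session state log, to see whether focus is ever reported as regained.
public class FocusStateLogger : MonoBehaviour
{
    private void OnEnable()
    {
        Application.focusChanged += OnFocusChanged;
    }

    private void OnDisable()
    {
        Application.focusChanged -= OnFocusChanged;
    }

    private void OnFocusChanged(bool hasFocus)
    {
        Debug.Log($"Application.focusChanged: {hasFocus} at {Time.realtimeSinceStartup:F2}s");
    }

    private void OnApplicationPause(bool paused)
    {
        Debug.Log($"OnApplicationPause: {paused}");
    }
}
```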