Quest 3 seemingly "random" recentering with palm up gesture
Just doing a sanity check here to see if other developers are experiencing issues with the Quest doing weird things recently in regard to hand interactions and the Meta reorientation button. Since around July this year I've been experiencing an undesired automatic system recenter whenever the hands face the camera palm up. At first I thought it was random, or due to testing the device over PCVR Link, because this "auto" recentering is not 100% consistent - it seemed random until recently. I ignored it up till now because I was too busy to address it.

Recently, though, I've seen a few others posting about it, but not enough to confirm whether it's just our implementations or a Meta OS thing. The more I test, the more I'm convinced it is some fundamental change in one of the many recent Meta OS updates. On top of all this, it doesn't recenter properly - at least not in my case.

I am making a training app that transitions the user through lessons with some freedom to move if they're able, but it is designed for schools with potentially small, confined spaces. I also have my own reorient button and algorithm that the user can use to reorient themselves if they need to face a specific direction to fit their space, so that the lesson items are always in front of them. It works perfectly. However, if they use the Meta button it will not always reorient them correctly - it ends up slightly off. Additionally, there is a seemingly random activation throughout my app whenever the user's hand happens to turn palm up. I happen to have a "palm" menu that activates on - you guessed it - a palm-up gesture, so this annoying random recentering is more noticeable than usual.

I am using hand interactions exclusively with the Floor tracking origin type - Unity 2022.3.19f1, Quest 3, with Meta's All-In-One SDK v64.

Just found this... Oculus hand tracking palm menu is automatically re...
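For reference, a manual reorient along the lines the poster describes (yawing the camera rig around the headset so lesson content lands in front of the user, without relying on the system recenter) can be sketched roughly like this. The rig layout is an assumption - `cameraRig` would be the root of an OVRCameraRig, `centerEye` its centerEyeAnchor - and `lessonForward` is a hypothetical target direction:

```csharp
using UnityEngine;

// Hypothetical sketch: rotate the tracking-space rig so the user faces
// a desired world direction, independent of the system recenter.
public class ManualReorient : MonoBehaviour
{
    [SerializeField] private Transform cameraRig;   // root of the OVRCameraRig (assumption)
    [SerializeField] private Transform centerEye;   // centerEyeAnchor, i.e. the HMD pose
    [SerializeField] private Vector3 lessonForward = Vector3.forward; // where lessons should appear

    public void Reorient()
    {
        // Current head yaw, flattened onto the floor plane.
        Vector3 headForward = Vector3.ProjectOnPlane(centerEye.forward, Vector3.up).normalized;
        float deltaYaw = Vector3.SignedAngle(headForward, lessonForward, Vector3.up);

        // Rotate the rig around the head position so the viewpoint doesn't
        // shift - only the facing direction changes.
        cameraRig.RotateAround(centerEye.position, Vector3.up, deltaYaw);
    }
}
```

Rotating around the head position (rather than the rig origin) keeps the user's apparent location fixed, which matters in the small, confined spaces the post describes.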
TransformRecognizerActiveState is never activating
I haven't been able to get a TransformRecognizerActiveState to register as active. I have no issues with ShapeRecognizerActiveState. Are there any additional steps required in order to get a TransformRecognizerActiveState to work? I have also examined it with an Active State Debug Tree UI, which confirms that the shape recognizer is activating but the transform recognizer is not. Here is my component setup: Can anyone see or guess what I might be doing wrong?
Prevent Hands from going through table.

Hi, I am using the Unity Movement SDK to control an Avatar in my Unity scene. I want the player to sit in front of a table in real life and place their hands on it. In VR, the hands should also rest on a table. I can roughly adjust it so that the hands are on the table. The only problem is that there are always small tracking inaccuracies, and the Avatar's hands keep disappearing into the table. Is there a way to prevent the hands from sliding through the table and keep them resting on it?
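One approach worth sketching (the component and field names here are assumptions, not Movement SDK API): clamp the *rendered* hand above a known table height in `LateUpdate`, so you override the final pose after tracking has written it, rather than fighting the tracking data itself:

```csharp
using UnityEngine;

// Hypothetical sketch: keep the visual hand above a known table surface.
// "handVisualRoot" and "tableTopY" are assumptions; a full solution would
// likely apply the clamp per wrist in the Movement SDK retargeting chain.
public class TableClamp : MonoBehaviour
{
    [SerializeField] private Transform handVisualRoot; // rendered (not tracked) hand
    [SerializeField] private float tableTopY = 0.75f;  // world-space table height
    [SerializeField] private float skin = 0.01f;       // keep the palm slightly above the top

    private void LateUpdate()
    {
        // Runs after tracking has updated the hand this frame.
        Vector3 p = handVisualRoot.position;
        if (p.y < tableTopY + skin)
        {
            p.y = tableTopY + skin;
            handVisualRoot.position = p;
        }
    }
}
```

Separating the tracked pose from the rendered pose this way absorbs the small tracking inaccuracies the post mentions without affecting tracking itself.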
How to throw with hand tracking

I want to throw objects in MR with hand tracking, but am unable to do so. I use the building blocks to add everything into place, but when I release an object it simply drops straight down. I already checked that all the components from the guide (https://developer.oculus.com/documentation/unity/unity-isdk-throw-object/) are in place, and this should be set up. Any tips are appreciated.
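If the SDK's throw support never fires, a manual fallback is to estimate the hand's velocity over the last few frames and apply it to the Rigidbody on release. This is a sketch, not the ISDK mechanism - `hand` and the `Release` hook are assumptions you would wire to your own grab events:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical fallback: average the hand's velocity over a short window
// and hand it to the Rigidbody when the object is let go.
public class ManualThrow : MonoBehaviour
{
    [SerializeField] private Transform hand;  // tracked hand / grab point (assumption)
    private readonly Queue<(Vector3 pos, float time)> samples = new();
    private const float Window = 0.1f;        // average over the last 100 ms

    private void Update()
    {
        samples.Enqueue((hand.position, Time.time));
        while (samples.Count > 0 && Time.time - samples.Peek().time > Window)
            samples.Dequeue();
    }

    // Call this from your release/unselect event.
    public void Release(Rigidbody rb)
    {
        if (samples.Count < 2) return;
        var oldest = samples.Peek();
        float dt = Time.time - oldest.time;
        if (dt <= 0f) return;
        rb.isKinematic = false;
        rb.velocity = (hand.position - oldest.pos) / dt; // average hand velocity
    }
}
```

Averaging over a window rather than using a single frame's delta smooths out hand-tracking jitter, which is usually why released objects "drop straight down".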
Bug report: Grabbable not positioning properly when force releasing then force selecting

Unity Version 2022.3.40f1, Meta XR Interaction SDK Version 69.0.2. I am currently using hand tracking with a Grabbable interactable. The interactable is a gun that I do not want the player to be able to drop, so I use ForceSelect on spawn to bring the gun to the player's hand. I want the player to be able to swap hands by grabbing the handle of the gun, so I have a second, mirrored Hand Grab Interactable with both max interactors set to 1. The main Grabbable script is also set to max grab points 1. I can grab the gun with my left hand and swap fine. It is when I then grab the gun back from my left hand that the Grabbable gets set to world point zero, and my gun object moves around on a pivot from world zero. The only way to fix this is to open the Oculus menu and then resume; the gun then teleports into the hand it should be in. I reset the local position and rotation of my grabbables because they break when the application loses focus and start pivoting from world zero. That fix works when I open the Oculus menu and return, but not when grabbing the gun back from my left hand.
private void SwapHands(eHand newHand)
{
    _secondHandleGrabbed = false;
    _interactorLeft.ForceRelease();
    _interactorRight.ForceRelease();

    if (newHand == eHand.Right)
    {
        _hand = eHand.Right;
        _interactorRight.ForceSelect(_handGrabbableRight, false);
    }
    else
    {
        _hand = eHand.Left;
        _interactorLeft.ForceSelect(_handGrabbableLeft, false);
    }

    _mainGrabbable.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
    _handGrabbableRight.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
    _handGrabbableLeft.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
}

private void OnGunGrabbedLeft(PointerEvent evnt)
{
    switch (evnt.Type)
    {
        case PointerEventType.Select:
            print("grabbed with left");
            SwapHands(eHand.Left);
            break;
    }
}

private void OnGunGrabbedRight(PointerEvent evnt)
{
    switch (evnt.Type)
    {
        case PointerEventType.Select:
            print("grabbed with right");
            SwapHands(eHand.Right);
            break;
    }
}

private void OnInputFocusAcquired()
{
    _interactorLeft.ForceRelease();
    _interactorRight.ForceRelease();

    if (_hand == eHand.Right)
    {
        _interactorRight.ForceSelect(_handGrabbableRight, false);
    }
    else
    {
        _interactorLeft.ForceSelect(_handGrabbableLeft, false);
    }

    _mainGrabbable.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
    _handGrabbableRight.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
    _handGrabbableLeft.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
}

Edit: Added images for extra context.
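One pattern worth trying here (an assumption, not a confirmed fix): defer the local-pose resets to the end of the frame via a coroutine, so the new ForceSelect has fully resolved before the transforms are rewritten. Resetting in the same call may race the interactor's own pose update, which would match the "pivoting from world zero" symptom:

```csharp
// Hypothetical workaround: let ForceSelect finish before resetting poses.
private System.Collections.IEnumerator ResetPosesEndOfFrame()
{
    yield return new WaitForEndOfFrame();
    _mainGrabbable.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
    _handGrabbableRight.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
    _handGrabbableLeft.transform.SetLocalPositionAndRotation(Vector3.zero, Quaternion.identity);
}

// ...and in SwapHands, replace the three direct resets with:
// StartCoroutine(ResetPosesEndOfFrame());
```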
How do I find the hand tracking cursor

I'm using the Interaction SDK and the Ray Interactor block shows the circle cursor from hand tracking. I think SurfaceHit and CollisionInfo would get me the Vector3 I'm looking for, but I'm not sure how to use it or if it's even what I need. Anything helps.
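Assuming the ISDK's `RayInteractor` component (where `CollisionInfo` exposes an optional `SurfaceHit` - the exact member names should be verified against your SDK version), reading the cursor's world position could look like:

```csharp
using Oculus.Interaction;
using UnityEngine;

// Sketch: read the world-space point under the ray cursor each frame.
public class CursorProbe : MonoBehaviour
{
    [SerializeField] private RayInteractor rayInteractor;

    private void Update()
    {
        SurfaceHit? hit = rayInteractor.CollisionInfo; // null when nothing is hit
        if (hit.HasValue)
        {
            Vector3 cursorPoint = hit.Value.Point;     // where the circle cursor sits
            Debug.Log($"Cursor at {cursorPoint}");
        }
    }
}
```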
Enable hand and controller tracking at the same time.

Hi, I have an Oculus Quest Pro and am working on a Unity project that needs hand tracking and controller tracking for a physical object, but I can't enable hand and controller tracking at the same time. So I wonder: is this possible, or is there any other way to track a physical object using Oculus?
Grab objects with weight

Hi, I would like to know if there is a way to make picked-up objects move according to their weight. For example, if I pick up a heavy object, when I move my hand the held object should take more time to reach the position of my real hand than if I were holding a lighter one. I have looked at multiple solutions but have not been able to achieve anything yet. I appreciate any help!
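A minimal sketch of the effect being described, assuming the held object is kinematic while grabbed and follows a tracked hand transform: scale a smoothing time by the Rigidbody's mass, so heavier objects trail the hand longer. All names and constants here are assumptions to tune per project:

```csharp
using UnityEngine;

// Sketch: lag the held object behind the tracked hand, scaled by mass.
public class WeightedFollow : MonoBehaviour
{
    [SerializeField] private Transform hand;         // tracked hand / grab anchor (assumption)
    [SerializeField] private Rigidbody held;         // object currently held, kinematic while grabbed
    [SerializeField] private float lagPerKg = 0.05f; // extra smoothing time per kilogram

    private Vector3 velocity;                        // internal state for SmoothDamp

    private void Update()
    {
        // Heavier objects get a longer smooth time, so they trail the hand more.
        float smoothTime = 0.02f + held.mass * lagPerKg;
        held.transform.position = Vector3.SmoothDamp(
            held.transform.position, hand.position, ref velocity, smoothTime);
        held.transform.rotation = Quaternion.Slerp(
            held.transform.rotation, hand.rotation, Time.deltaTime / smoothTime);
    }
}
```

`Vector3.SmoothDamp` gives a critically-damped follow, which reads as "heft" rather than the rubber-banding a plain `Lerp` tends to produce.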
Detecting if joint is covered up using XR Hands

I am using a Meta Quest 2 headset to track my hands with the XR Hands 1.4.1 package. Is it possible to get the indexes of joints that are currently unavailable for tracking (due to partially covered hands, hardware on the hands, etc.)? I tried using XRHandJoint.TryGetPose(), but after initial hand detection it always returns true with a Pose containing a predicted position and rotation (or an identity Pose filled with zeros if tracking of the whole hand is lost). My plan was to detect the currently covered-up joints and do predictions manually using the other joint positions. I know there exist events that can be subscribed to when whole-hand tracking is lost; is there something similar for specific joints? Any suggestions are appreciated.
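XR Hands does expose a per-joint `trackingState` flags field, which is the first thing to check; whether the Quest runtime actually clears those flags for occluded joints is another matter - the poster's observation with `TryGetPose()` suggests it may keep reporting predicted poses as tracked. A sketch of the check:

```csharp
using System.Collections.Generic;
using UnityEngine.XR.Hands;

// Sketch: list joints whose pose isn't flagged as tracked this frame.
// Caveat: runtimes that always predict poses may never clear these flags.
public static class OccludedJoints
{
    public static List<XRHandJointID> Find(XRHand hand)
    {
        var occluded = new List<XRHandJointID>();
        for (var id = XRHandJointID.Wrist; id < XRHandJointID.EndMarker; id++)
        {
            XRHandJoint joint = hand.GetJoint(id);
            if ((joint.trackingState & XRHandJointTrackingState.Pose) == 0)
                occluded.Add(id);
        }
        return occluded;
    }
}
```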
Need help with Meta All-in-One SDK in Unity (hand tracking)

Hello everyone, I have a question. I'm currently using the Meta Quest All-in-One SDK in Unity. What I'm trying to do is make piano keys interactable with hand tracking. I put the sounds and box colliders on the keys. The problem I'm facing is that when I press on them using hand tracking, it does not work - the hand doesn't collide with the keys. Is there a way to put a collider on the default hand so it collides with other objects?
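One common approach: enable "Physics Capsules" on the hand's OVRSkeleton component, which attaches capsule colliders (with Rigidbodies) to the finger bones, then detect them with a trigger on each key. The key script below is a sketch - the name-based filter in particular is a heuristic, and a layer mask would be cleaner:

```csharp
using UnityEngine;

// Sketch: a piano key that plays when a hand-bone capsule enters it.
// Assumes "Enable Physics Capsules" is checked on the OVRSkeleton.
[RequireComponent(typeof(BoxCollider))]
public class PianoKey : MonoBehaviour
{
    [SerializeField] private AudioSource note;

    private void Awake()
    {
        // The key's collider acts as a trigger so the fingertip can pass into it.
        GetComponent<BoxCollider>().isTrigger = true;
    }

    private void OnTriggerEnter(Collider other)
    {
        // Heuristic: OVRSkeleton's generated bone capsules have "Capsule"
        // in their object names; a dedicated layer is the robust option.
        if (other.name.Contains("Capsule"))
            note.Play();
    }
}
```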