Hand physics Capsules not following the fingers
Hello! I have a working project on the new version 13. When I tick "Enable Physics Capsules" in the OVRSkeleton script inside the OVRHandPrefab, the capsules behave weirdly. As soon as the hands are detected, the capsules appear basically at the tips of the selected fingers (via the InteractableToolsSDKDriver and the InteractableToolsCreator script), but as soon as I move my fingers, the capsules stay where they were: their position is not updated relative to the finger, although they do travel with the hand as a whole. I wonder if there is a bug in the parenting? Also, if I hide the hands, then as soon as they reappear the capsules show up at the correct fingertip positions, only to stay there again when the fingers move. I am guessing this is not the intended behavior, right? Thank you!

Hand Grab Tracking
Hey folks, I am designing a circular weapon (image attached below) that can be held by the player with two hands. When the player squeezes the weapon with both hands, it shoots. It works well, but I am running into an issue: I am using the two-hand grab free-transform hand grab component provided by the Meta SDK, and when the player performs a quick motion with the weapon, it rotates on the horizontal axis. Any help would be appreciated.

Hand grab interaction fails if OVRCameraRig is child of high speed moving object
Hi all, I'm facing an unexpected scenario while using the grab feature of the Meta Interaction SDK. It seems that if the OVRCameraRig object (with OVRHands and/or OVRControllerHands properly configured with HandGrabInteractors) is made a child of a moving object (together with the objects designed to be grabbed), then depending on the speed of that object the grab actions fail. The scenario I'm working on is a train cabin moving along rails laid out as a spline path, with the VR user/player (represented by the OVRCameraRig object) parented to the cabin, since the player must be inside the moving train. Inside the cabin there is a lever that must be grabbed to change the train's speed. At slow speeds the lever can be grabbed without problems, but when I increase the speed a little, the lever can't be grabbed anymore! I tested it multiple times, trying to understand the root cause of the failure. My guess is that it's related to the frequency of the hand-vs-lever intersection/collision checks made by the Meta scripts, but I couldn't find a solution. Is there something I'm missing in the documentation? Maybe some script property to set or fine-tune? NOTE: I don't think it would be acceptable, in a 3D application, to invert the scenario so that the train stands still and the whole environment moves towards it. Of course in that case both the objects to grab and the "player" would be stationary and grabbing would work like a charm, but I still consider that an ugly workaround. Does anyone have a clue how to fix this misbehaviour? Thanks in advance!

How to force Avatar Hand to blend as Interaction Rig's hand when grabbing object
I'm trying to get the Avatar hand (from the Avatar SDK) to take the shape of the Interaction rig's hand when grabbing an object. For example, if I put the avatar in the TouchGrabExamples sample scene, when I grab an object the interaction hand stops at the collider surface while the avatar's hand goes all the way through it. Does anyone have an idea how to solve this?

[Hand Tracking] FingerTipPokeTool Index does not follow my index
Hello everyone. I created a simple scene with hand tracking and added an "Interactable Tool Creator". It creates the finger tip poke tools so I can interact with buttons, but the tip does not follow my index fingertip. Here is a video recorded on the Oculus Quest and a screenshot of my project scene. Video: https://imgur.com/a/snM3ms6

Hands not appearing in-game post v37 update
We've had a couple of players report that they no longer see their hands in-game after the v37 OS update with the controller firmware applied. We have tested on numerous devices in the studio (pre-v37, old and new avatars enabled, no avatar set, firmware applied and not applied, etc.) and replication seems inconsistent, i.e. we can only replicate it on one device that has v37 and the controller firmware, while it's fine on most other devices in the same state. Logcat attached.

How to keep hands displayed when tracking is lost?
Hello, I'm developing a game for my master's thesis in Unity using Oculus hand tracking. I need to keep the hands displayed and just "freeze" them when tracking is lost. Is there a way to do this? I found the GetHandState method in the OVRHand.cs script and tried commenting out the section that checks whether tracking is lost to see what happens, but the hands still disappear when tracking is lost.

How to build multiplayer hand tracking with Oculus Quest?
Hi, I would appreciate it if someone could point me to a tutorial or documentation about multiplayer hand tracking with Oculus Quest. Daniel's video on SideQuest was the only app I have seen with a similar implementation, and I did not find any tutorials or documentation on it. I am a bit stuck on this task and couldn't find help. There are good tutorials on how to use VR controllers over the network, but not on using Oculus hand tracking in a networked setup. Any insight on this would be highly appreciated. Thanks heaps.

Converting Touch Controller Position To HandTracking Pinch Position
Hello, I have a script in one of my games that creates an object when you squeeze the trigger on the Touch controller. I want to do the same thing again, but this time with the new hand tracking controls. I have hand tracking fully implemented and button presses seem to work great, but I can't figure out pinch gestures or how to get the location of the tip of the index finger. I think it probably has something to do with this:

PointerPose.localPosition = _handState.PointerPose.Position.FromFlippedZVector3f();
PointerPose.localRotation = _handState.PointerPose.Orientation.FromFlippedZQuatf();

Example code showing my current method:

void Update()
{
    var hand = GetComponent<OVRHand>();
    //bool isIndexFingerPinching = hand.GetFingerIsPinching(HandFinger.Index);

    // These need to become "if left/right pinch" checks instead
    bool bDownLeft = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch);
    bool bDownRight = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

    if (bDownLeft)
    {
        // This needs to be the world position of Hand_IndexTip instead
        Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch));
        Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch).eulerAngles);
        Instantiate(myPrefab, position, Quaternion.Euler(rotation));
    }
    if (bDownRight)
    {
        Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch));
        Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch).eulerAngles);
        Instantiate(myPrefab, position, Quaternion.Euler(rotation));
    }
}
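For reference, here is one minimal sketch of what the pinch-based version of the script above could look like, using OVRHand.GetFingerIsPinching for the pinch gesture and the OVRSkeleton bone list to find the Hand_IndexTip transform. It assumes the OVRHand and OVRSkeleton components from the Oculus Integration sit on the same hand object; the class name PinchSpawner, the myPrefab field, and the edge-detection flag are placeholder names introduced for this sketch, not part of the original project.

```csharp
using UnityEngine;

// Sketch: spawn a prefab at the index fingertip when a pinch starts.
// One instance of this component would go on each OVRHand object.
public class PinchSpawner : MonoBehaviour
{
    public GameObject myPrefab; // placeholder, same role as in the original script

    private OVRHand _hand;
    private OVRSkeleton _skeleton;
    private bool _wasPinching;

    void Start()
    {
        _hand = GetComponent<OVRHand>();
        _skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        bool isPinching = _hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Rising edge of the pinch = the hand tracking equivalent of
        // OVRInput.GetDown on the index trigger.
        if (isPinching && !_wasPinching && _skeleton.IsDataValid)
        {
            foreach (var bone in _skeleton.Bones)
            {
                if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
                {
                    // The bone transform gives the fingertip pose directly,
                    // so no TransformPoint through trackingSpace is needed.
                    Instantiate(myPrefab, bone.Transform.position, bone.Transform.rotation);
                    break;
                }
            }
        }

        _wasPinching = isPinching;
    }
}
```

Since one PinchSpawner sits on each hand, the separate left/right branches of the original script collapse into a single code path, which is why no OVRInput.Controller parameter appears here.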