
michiyo0
Honored Guest
6 years ago

Force runtime poses on Hand Tracking with Quest

Hello, I am programming in Unity for the Oculus Quest and I am trying to overlay hand gestures on the tracked hands. By overlay I mean forcing the finger bones into a certain pose during Update, momentarily stopping finger tracking (or at least severely limiting it).

I managed to get gesture recognition working using Valem's tutorial, but when I try to move the hand into a pose by inverting the transform I used for detection, the mesh comes out all wrong. This is what I do per bone to set the pose right now:
 for (int i = 0; i < fingerBones.Count; i++)
     fingerBones[i].Transform.position = skeleton.transform.TransformPoint(target.fingerData[i]);
Sometimes the mesh looks correct, but not always; it seems the closer the tracked hand is to the actual gesture, the better the mesh looks. The bones appear to move to the correct positions, but the mesh doesn't follow them completely. I looked at how the avatar hands are posed and they use some cached version of the hands. I also saw that the hand-tracking hands have a cachedHandState in OVRPlugin.cs, but I can't really understand how to use it to fix this yet. Apart from trying to pose the hands with other transforms (see the sketch below), I don't have much of an idea how to get past this. Any help is appreciated!
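
For reference, here is a minimal, untested sketch of what I mean by posing with other transforms. It assumes a Gesture struct like the one in Valem's tutorial, extended with a fingerRotations list (my own hypothetical field), plus the skeleton/fingerBones setup from that tutorial. The idea is to capture and re-apply each bone's local rotation instead of its world position, since the skinned hand mesh seems to follow bone rotations, and to apply the pose in LateUpdate so it runs after OVRSkeleton has updated the bones from tracking. Treat this as a sketch, not a working solution.

 using System.Collections.Generic;
 using UnityEngine;

 [System.Serializable]
 public struct Gesture
 {
     public string name;
     public List<Vector3> fingerDatas;        // bone positions, as in the tutorial
     public List<Quaternion> fingerRotations; // hypothetical: bone local rotations
 }

 public class HandPoser : MonoBehaviour
 {
     public OVRSkeleton skeleton;   // the skeleton on the tracked hand
     private List<OVRBone> fingerBones;

     void Start()
     {
         // Same bone list the gesture detector uses in Valem's tutorial.
         // (The list may be empty until skeleton.IsInitialized is true.)
         fingerBones = new List<OVRBone>(skeleton.Bones);
     }

     // Call while recording a gesture to store each bone's local rotation.
     public void Capture(ref Gesture gesture)
     {
         gesture.fingerRotations = new List<Quaternion>();
         foreach (OVRBone bone in fingerBones)
             gesture.fingerRotations.Add(bone.Transform.localRotation);
     }

     // Re-apply the stored rotations. Calling this from LateUpdate should
     // override whatever OVRSkeleton wrote to the bones this frame.
     public void ApplyPose(Gesture gesture)
     {
         for (int i = 0; i < fingerBones.Count; i++)
             fingerBones[i].Transform.localRotation = gesture.fingerRotations[i];
     }
 }

If something like this works, switching between the tracked pose and a forced pose would just be a matter of calling ApplyPose conditionally from LateUpdate, but I don't know if this is the intended way to override hand tracking.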

Thank you.