02-13-2025 01:19 PM - edited 02-14-2025 12:29 PM
I set the transform of two arrow meshes to the Grip Position and Rotation of each hand and recorded this quick video. Is this normal?
UPDATE: Get Pointer Pose is looking like it might be usable. Not sure why I can't get finger bone, capsule collision, or palm positions.
I'm using the oculus-5.4 branch from github, and started with the MRTemplate project.
With motion controllers everything is great, and the grip position lands in a sensible spot. With hand tracking, however, the grip position is down at the wrist, far from where your grip actually is. I'm having trouble finding a way to get a hand position that matches where the grip should be.
I can offset the grip pose myself after getting it from the SDK whenever hand tracking is active, but it feels like I shouldn't have to. My attempts haven't worked out so far anyway: I end up using the Right Vector for one hand, the Forward Vector for the other, negating the vector in some cases depending on left versus right hand, and the result still isn't good enough.
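For what it's worth, here is a minimal engine-free sketch of the kind of correction described above: push a fixed local-space offset through the wrist rotation and mirror the lateral axis for the left hand. The offset numbers are placeholders I made up for illustration, not values from any SDK, and the math helpers are standalone rather than Unreal types.

```cpp
#include <cassert>
#include <cmath>

// Minimal vector/quaternion helpers, just enough to sketch the idea.
struct Vec3 { double x, y, z; };
struct Quat { double w, x, y, z; }; // assumed unit-length

// Rotate v by unit quaternion q: v' = v + 2*u x (u x v + w*v), u = (q.x, q.y, q.z).
static Vec3 Rotate(const Quat& q, const Vec3& v) {
    const Vec3 u{q.x, q.y, q.z};
    const Vec3 t{u.y * v.z - u.z * v.y + q.w * v.x,
                 u.z * v.x - u.x * v.z + q.w * v.y,
                 u.x * v.y - u.y * v.x + q.w * v.z};
    return Vec3{v.x + 2.0 * (u.y * t.z - u.z * t.y),
                v.y + 2.0 * (u.z * t.x - u.x * t.z),
                v.z + 2.0 * (u.x * t.y - u.y * t.x)};
}

// Approximate a controller-style grip pose from the hand-tracking wrist pose
// by pushing a fixed local-space offset through the wrist rotation. The
// offset values are illustrative placeholders (rough centimetres, not SDK
// constants); the lateral component is mirrored for the left hand.
static Vec3 GripFromWrist(bool bLeftHand, const Quat& WristRot, const Vec3& WristPos) {
    const Vec3 LocalOffset{8.0, bLeftHand ? -2.0 : 2.0, 0.0};
    const Vec3 World = Rotate(WristRot, LocalOffset);
    return Vec3{WristPos.x + World.x, WristPos.y + World.y, WristPos.z + World.z};
}
```

The mirrored axis is the part that was biting me in Blueprint: with a single local offset and a per-hand sign flip, you avoid juggling different basis vectors (Right vs. Forward) per hand.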
What am I missing? There is a Palm Position in MotionControllerData, but it's coming out as 0,0,0. I also tried Get Hand Joint Position, and it returns Value Found = False for every Joint Index I tested (0-25).
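In case it helps anyone reproduce the symptom, this is the probing pattern I was using, sketched as plain C++. `Query` here is a stand-in for a Get Hand Joint Position-style per-joint lookup that reports "value found" plus a position; it is a hypothetical callback, not a real engine function.

```cpp
#include <functional>
#include <optional>
#include <vector>

// A point type standing in for whatever the joint query returns.
struct JointPos { double x, y, z; };

// Diagnostic sketch: probe every joint index and record which ones report
// valid data. In my case this list came back empty for indices 0-25.
std::vector<int> FindValidJoints(
    const std::function<std::optional<JointPos>(int)>& Query, int JointCount) {
    std::vector<int> Valid;
    for (int i = 0; i < JointCount; ++i) {
        if (Query(i).has_value()) { // the equivalent of Value Found == true
            Valid.push_back(i);
        }
    }
    return Valid;
}
```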
See the screenshot below for my modified version of Scan For Grab Target in MRTemplate's MRPawn class, where I was trying to work with the grip info.
02-15-2025 04:39 PM
Hi there,
We're always happy to see more and more people taking advantage of the many developer tools available, and we'd love to help you out with this! You can find information on everything developer related in the Developer Support Center; we have left the link for you below.
02-17-2025 01:05 PM
Thanks for the reply! Although I had already been through the dev support center and tried samples from GitHub, your reply prompted a thorough revisit, and I found more samples and plugins that I need for my next steps.
The tools are great, this stuff is so fun!