Forum Discussion
Laramena
3 years ago · Explorer
Use Take Recorder to record animations from tracked hands
Hi everyone, I have hand tracking activated and am trying to record the hand components using the Take Recorder plugin. My goal is to capture hand animations which I can later apply to an NPC with...
- 3 years ago
Use Get Socket Transform to read the data; to apply the transforms to your proxy hands, use the Transform (Modify) Bone node in the AnimBP.
- 2 years ago
Hi everyone,
Sorry, I haven't been on this thread for quite a while. I cannot share the project, but I can share some screenshots with you. Hope this will be helpful to someone.
I assume you already have your pawn actor with Oculus hand tracking running.
1. Add a SkeletalMeshComponent to the actor and set its Skeletal Mesh to the OculusHand mesh.
2. Add a function that accesses the Oculus Hand's bone transforms and stores them in an array. I created a child class of the Oculus Hand Component, but you don't need to; instead, you can connect the Oculus Hand Component to the target pins:
3. Next, create an Animation Blueprint and assign it to the Skeletal Mesh. Within its EventGraph, get a reference to your actor and call the 'Get Bone Transforms' function.
4. Within the 'Split Bone Transform' function of the Animation Blueprint, assign the transforms to separate variables:
5. Finally, within the AnimGraph of the Animation Blueprint, use the transforms to modify the bones of the Skeletal Mesh:
These are the settings for the Transform (Modify) Bone node (adjust 'bone to modify' accordingly):
Hope I did not forget anything and explained everything in an understandable way. Let me know if you need further help. Maybe I'll find some time to create a YouTube tutorial in the near future.
Best, Laramena
Korhi
9 months ago · Honored Guest
Thank you very much, Laramena! Following your suggested approach, I successfully recorded hand tracking in the latest version of UE. Afterward, I explored methods to extend tracking to include the arms. Having experimented extensively with various skeletal setups and motion retargeting techniques, I discovered an efficient and highly effective method: the Meta Movement SDK (https://developers.meta.com/horizon/documentation/unreal/unreal-movement-getting-started).
This SDK, released by Meta in Q1 2024, offers a comprehensive full-body tracking solution. It not only includes robust hand tracking capabilities but also extends seamlessly to arm movements, with lower-body tracking powered by AI calculations. Currently, you can effortlessly implement this by downloading the Movement SDK Sample for Unreal directly from GitHub (https://github.com/oculus-samples/Unreal-Movement). After installing the Meta XR plugin, you can immediately start capturing animations using the Take Recorder without any complex configurations. Furthermore, this streamlined process enables simultaneous recording of body movements and facial capture into a single animation file.