Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Laramena
Explorer
3 years ago
Solved

Use Take Recorder to record animations from tracked hands

Hi everyone,

I have hand tracking activated and am trying to record the hand components using the Take Recorder plugin. My goal is to capture hand animations that I can later apply to an NPC without having to manually animate hand poses.

 

I was able to record animations from the official HandSample when using the controllers to manipulate the hand mesh. But when switching to hand tracking, the hand poses are not recorded; only the rotation and position of the hand component are stored in the sequence.

 

I hope I managed to express what I am trying to achieve in a comprehensible way. Has anyone tried recording hand animations from tracked hands using Take Recorder and gotten it running? Or does anyone have an idea what I am missing?

 

Thanks for your help!

  • Use Get Socket Transform to read the data; to apply the transform to your proxy hands, use the Transform (Modify) Bone node in the AnimBP.
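An engine-free sketch of that data flow, using stand-in types rather than the actual UE API (UE's FTransform and the component calls are simplified here to a plain struct and map lookups):

```cpp
#include <map>
#include <string>

// Minimal stand-in for a bone/socket transform (UE's FTransform also
// carries scale; position + rotation keeps the sketch short).
struct Transform {
    float Pos[3];
    float Rot[3]; // pitch, yaw, roll in degrees
};

// Stand-in for "Get Socket Transform" on the tracked hand component:
// read the live transform of one tracked bone/socket by name.
Transform GetSocketTransform(const std::map<std::string, Transform>& trackedHand,
                             const std::string& socketName) {
    return trackedHand.at(socketName);
}

// Stand-in for the "Transform (Modify) Bone" node in the proxy hand's
// AnimBP: overwrite the matching proxy bone with the tracked transform.
void TransformBone(std::map<std::string, Transform>& proxyHand,
                   const std::string& boneName, const Transform& t) {
    proxyHand[boneName] = t;
}
```

The proxy hand driven this way is an ordinary Skeletal Mesh, which is what makes it recordable by Take Recorder.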


11 Replies

Replies have been turned off for this discussion
  • Korhi
    Honored Guest

    Thank you very much, Laramena! Following your suggested approach, I successfully recorded hand tracking in the latest version of UE. Afterward, I explored ways to extend tracking to the arms. Having experimented extensively with various skeletal setups and motion-retargeting techniques, I found an efficient and highly effective method: the Meta Movement SDK (https://developers.meta.com/horizon/documentation/unreal/unreal-movement-getting-started).

    This SDK, released by Meta in Q1 2024, offers a comprehensive full-body tracking solution. It not only includes robust hand tracking capabilities but also extends seamlessly to arm movements, with lower-body tracking powered by AI calculations. Currently, you can effortlessly implement this by downloading the Movement SDK Sample for Unreal directly from GitHub (https://github.com/oculus-samples/Unreal-Movement). After installing the Meta XR plugin, you can immediately start capturing animations using the Take Recorder without any complex configurations. Furthermore, this streamlined process enables simultaneous recording of body movements and facial capture into a single animation file.

  • Hi everyone,

    Sorry, I haven't been on this thread for quite a while. I cannot share the project; however, I can share some screenshots. Hope they will be helpful to someone.

    I assume you already have a pawn actor with Oculus hand tracking running.

    1. Add a SkeletalMeshComponent to the actor, with its Skeletal Mesh set to the OculusHand mesh.

    2. Add a function that reads the Oculus Hand's bone transforms and stores them in an array. I created a child class of the Oculus Hand Component, but you don't need to; you can simply connect the Oculus Hand Component to the target pins:

    3. Create an Animation Blueprint and assign it to the Skeletal Mesh. In its EventGraph, get a reference to your actor and call the 'Get Bone Transforms' function.

    4. Within the 'Split Bone Transform' function of the Animation Blueprint I assigned the transforms to separate variables:

    5. Finally, in the AnimGraph of the Animation Blueprint, use the transforms to modify the bones of the Skeletal Mesh:

    These are the settings for the Transform (Modify) Bone node (adjust 'bone to modify' accordingly):

     

    Hope I did not forget anything and explained everything clearly. Let me know if you need further help. Maybe I'll find time to create a YouTube tutorial in the near future.

    Best, Laramena

    • slim_sheady19
      Explorer

      Great stuff!  Can't believe this is not implemented in one of the sample projects.  I would vote for a Youtube tutorial if you have the time!

    • Catbatrat
      Honored Guest

      Thank you so much, Laramena, this is exactly what I needed! I think I should be able to get things working with all this.

  • Hello Laramena, could you please share a GitHub project where you did this? I would be happy to buy you a coffee for sharing, or at least please make a YouTube tutorial. I am sure the community will be proud of you!

  • I still don't have experience with Quest 2 hand tracking, but I have a ton of experience with mocap in UE4, and I usually use Take Recorder to record any skeletal animation you want.

    If you select a BP that has a skeletal mesh inside it, the mesh is fully recorded/baked into the animation that gets saved.

    If not, open the Take Recorder details and, after selecting the actor you want, make sure the hands' skeletal mesh is also selected (expand the BP you chose to record).

    • Laramena
      Explorer

      Hi, thanks for your reply! The thing is, within Take Recorder I cannot access the Skeletal Mesh of the Oculus Hand Component (I guess it's private), and therefore there is no animation track. Do you know if I'm maybe missing a step to prepare the Oculus Hand Component for animation? I've seen Take Recorder run smoothly with MetaHumans, which seem much more complicated, so I suspect I'm missing something very basic and obvious.

      • EnterReality
        Protege

        Oh, the fact that you can't access the hands' skeletal mesh is very annoying... Can you at least access the component once it is spawned?
        The workaround would be to access the spawned hand skeletal mesh and grab the hand/finger rotation data, then use a separate skeletal mesh (maybe the hands of the Mannequin) and assign it the rotation data you grabbed from the Quest 2 hands.

        By doing so you're using a kind of "proxy" skeletal mesh that you can then record using Take Recorder.

         

        I know this kind of workaround works because I have a lot of experience with VR gloves and with retargeting their data to real-time characters, so in case you need this done, feel free to contact me via DM.