Use Take Recorder to record animations from tracked hands

Laramena
Explorer

Hi everyone,

I have hand tracking activated and am trying to record the hand components using the Take Recorder plugin. My goal is to capture hand animations that I can later apply to an NPC without having to manually animate hand poses.

 

I was able to record animations from the official HandSample when using the controllers to manipulate the hand mesh. But when I switch to hand tracking, the hand poses are not recorded; only the rotation and position of the hand component are stored in the sequence.

 

I hope I managed to express what I'm trying to achieve in a comprehensible way. Has anyone tried to record hand animations from tracked hands using Take Recorder before and gotten it running? Or does anyone have an idea what I'm missing?

 

Thanks for your help!


7 REPLIES

EnterReality
Protege

I don't have experience with Quest 2 hand tracking yet, but I have a ton of experience with mocap in UE4, and I usually use Take Recorder to record any skeletal animation I need.

If you select a BP that has a skeletal mesh inside, that mesh is fully recorded/baked into the animation that gets saved.

If not, then in the Take Recorder details, after you select the actor you want, make sure that the skeletal mesh of the hands is also selected (expand the BP you selected to be recorded).

Hi, thanks for your reply! The thing is, within Take Recorder I cannot access the skeletal mesh of the Oculus Hand Component, since I guess it's private, and therefore there is no animation track. Do you know if I'm maybe missing a step to prepare the Oculus Hand Component for animation? I've seen Take Recorder running smoothly with MetaHumans, which seem so much more complicated, so I'm thinking I'm missing something very basic and obvious.

Oh, the fact that you can't access the hands' skeletal mesh is very annoying... can you at least access the component once it is spawned?
The workaround would be to access the spawned hand skeletal mesh and grab the hand/finger rotation data, then use a separate skeletal mesh (maybe the hands of the Mannequin) and assign the rotation data from the Quest 2 hands to it.

By doing so you're using a kind of "proxy" skeletal mesh that you can then record using Take Recorder.
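
A minimal C++ sketch of that proxy idea, assuming the tracked hand component exposes its bones like a regular Poseable Mesh Component and that the proxy uses the same bone names (the class and property names below are illustrative, not from the Oculus sample; with different skeletons you would need a bone-name remap):

// ProxyHandMirror.h -- hypothetical component that copies bone transforms
// from a tracked hand mesh onto a proxy mesh every tick, so the proxy can
// be recorded with Take Recorder.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/PoseableMeshComponent.h"
#include "ProxyHandMirror.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UProxyHandMirror : public UActorComponent
{
    GENERATED_BODY()

public:
    UProxyHandMirror()
    {
        PrimaryComponentTick.bCanEverTick = true;
    }

    // Source: the spawned, tracked hand mesh (assumed to expose its bones
    // like a Poseable Mesh Component).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Hands")
    UPoseableMeshComponent* TrackedHand = nullptr;

    // Target: the proxy mesh on the actor that Take Recorder records.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Hands")
    UPoseableMeshComponent* ProxyHand = nullptr;

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
        if (!TrackedHand || !ProxyHand)
        {
            return;
        }

        // Copy every bone transform by name. This assumes both skeletons
        // share bone names; otherwise add a TMap<FName, FName> remap here.
        const int32 NumBones = TrackedHand->GetNumBones();
        for (int32 BoneIndex = 0; BoneIndex < NumBones; ++BoneIndex)
        {
            const FName BoneName = TrackedHand->GetBoneName(BoneIndex);
            const FTransform BoneTM =
                TrackedHand->GetBoneTransformByName(BoneName, EBoneSpaces::ComponentSpace);
            ProxyHand->SetBoneTransformByName(BoneName, BoneTM, EBoneSpaces::ComponentSpace);
        }
    }
};

Note that if the proxy itself is a Poseable Mesh Component it may not get an animation track in Take Recorder (that is exactly what comes up further down in this thread), in which case the same data can be pushed through the proxy's AnimBP instead, as in the accepted answer below.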

 

I know that this kind of workaround works because I have a lot of experience with VR gloves and retargeting their data onto real-time characters, so if you need help getting this done, feel free to contact me via DM.

Sorry, my bad, I am kind of new to this. I meant to say that I cannot access the skeletal mesh component within Take Recorder; it doesn't show up in the hierarchy.

 

I tried using "Get Bone Location by Name" on the Oculus hand, which works without any problem. But to pass the bone location data over to my "proxy" hand I am using another Poseable Mesh Component, which again doesn't give me an animation track. Can I pass the bone location data onto a Skeletal Mesh Component? Which function would I use to manipulate bone locations of a Skeletal Mesh Component?

Get Socket Transform is for getting the data, while to apply the transforms on your proxy hands you can use the Transform (Modify) Bone node in the AnimBP.
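
For anyone finding this later, here is a rough C++ sketch of that pattern, assuming the proxy hand is a Skeletal Mesh Component driven by its own AnimBP; the class and property names are made up for illustration. The idea is to read bone transforms from the tracked hand with Get Socket Transform, store the rotations on the proxy's Anim Instance, and let Transform (Modify) Bone nodes in the AnimGraph apply them:

// ProxyHandAnimInstance.h -- hypothetical Anim Instance for the proxy hand.
// The AnimGraph reads BoneRotations (e.g. via a Map Find) and plugs each
// value into a Transform (Modify) Bone node for the matching bone.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "Components/SceneComponent.h"
#include "ProxyHandAnimInstance.generated.h"

UCLASS()
class UProxyHandAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Set from the owning Blueprint: the spawned, tracked hand component.
    UPROPERTY(BlueprintReadWrite, Category = "Hand Tracking")
    USceneComponent* TrackedHand = nullptr;

    // The finger/hand bone names you want to drive on the proxy skeleton.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Hand Tracking")
    TArray<FName> BonesToDrive;

    // Latest rotation per bone, refreshed every animation update.
    UPROPERTY(BlueprintReadOnly, Category = "Hand Tracking")
    TMap<FName, FRotator> BoneRotations;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);
        if (!TrackedHand)
        {
            return;
        }

        // Get Socket Transform accepts bone names as well as socket names.
        for (const FName& BoneName : BonesToDrive)
        {
            const FTransform BoneTM = TrackedHand->GetSocketTransform(BoneName, RTS_Component);
            BoneRotations.Add(BoneName, BoneTM.GetRotation().Rotator());
        }
    }
};

In the AnimGraph you would then add one Transform (Modify) Bone node per driven bone, set its Rotation Mode to something like Replace Existing, and feed it the matching entry from BoneRotations. Since the proxy is a regular Skeletal Mesh Component with an AnimBP, Take Recorder can bake the result into an animation track.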

Thanks, just found the nodes 🙂

RenderNinja
Honored Guest

Hello @Laramena, could you please share a GitHub project where you got this working? I'd be happy to buy you a coffee for the share, or at least please add a YouTube tutorial. I am sure the community will be proud of you!