Synchronization Issue between TouchHand and OvrAvatarEntity

Beelysir
Honored Guest

I am using the Networked Avatar from Meta Building Blocks and have encountered an issue where the TouchHand functionality does not synchronize properly with the OvrAvatarEntity hand.

When I grasp an object, the TouchHand adjusts its grab pose to match the object, while the avatar's hand keeps following my real-world hand gestures exactly.

Could you please advise me on how to achieve synchronization and consistency between the two hands?

If you could answer my question, I would be very grateful.

(Attachment: IMG.png)

8 REPLIES

PaulBearerJr
Protege

This is the exact same issue I'm dealing with now, and it seems a lot of others have dealt with it / tried to get it working. I found in a few other posts that someone put together a GitHub repo explaining how to do it (apparently it works, but I haven't tried it yet): https://github.com/sigmondkukla/interaction-sdk-avatars2-integration/tree/main

PaulBearerJr
Protege

Maybe a Meta admin can confirm if this is still the best way to do it with the latest updates? @CaseyAtMeta 

Thanks! 🍻

CaseyAtMeta
Community Manager

Thanks for the ping, I'll investigate!

Just to confirm, this implementation does look to be in line with what we'd expect from a custom hand tracking input for Avatars!
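
For anyone who'd rather wire this up themselves than pull in the whole repo, the core idea (as I read it) is to feed the Interaction SDK's synthetic hand (the one that snaps to grab poses) into the avatar as the hand tracking input, instead of the raw tracked hand. Below is a rough, untested sketch of the reading side only; the class name and the PushToAvatar hook are placeholders of mine, and the actual hand-off depends on which hand-tracking delegate/context your Avatars SDK version exposes.

```csharp
// Untested sketch (assumes the Meta XR Interaction SDK): read the joint poses of the
// Interaction SDK's *synthetic* hand each frame so they can be fed into a custom
// hand-tracking input for the avatar, instead of the raw tracked hand.
// "SyntheticHandToAvatarBridge" and "PushToAvatar" are placeholder names, not SDK API.
using System.Collections.Generic;
using Oculus.Interaction;
using Oculus.Interaction.Input;
using UnityEngine;

public class SyntheticHandToAvatarBridge : MonoBehaviour
{
    // Assign the LeftHandSynthetic / RightHandSynthetic object here (anything exposing IHand).
    [SerializeField, Interface(typeof(IHand))]
    private UnityEngine.Object _syntheticHand;
    private IHand SyntheticHand => _syntheticHand as IHand;

    private readonly Dictionary<HandJointId, Pose> _jointPoses = new();

    private void LateUpdate()
    {
        if (SyntheticHand == null || !SyntheticHand.IsTrackedDataValid)
        {
            return;
        }

        // Wrist/root pose of the synthetic hand (already adjusted by the grab interaction).
        if (SyntheticHand.GetRootPose(out Pose rootPose))
        {
            _jointPoses[HandJointId.HandWristRoot] = rootPose;
        }

        // Collect every finger joint pose the synthetic hand reports.
        for (var joint = HandJointId.HandThumb0; joint < HandJointId.HandEnd; ++joint)
        {
            if (SyntheticHand.GetJointPose(joint, out Pose jointPose))
            {
                _jointPoses[joint] = jointPose;
            }
        }

        PushToAvatar(_jointPoses);
    }

    // Placeholder: convert the collected poses into whatever bone data your Avatars SDK
    // version expects for a custom hand-tracking input (conventions differ per version).
    private void PushToAvatar(Dictionary<HandJointId, Pose> jointPoses)
    {
        // TODO: write jointPoses into your custom hand-tracking delegate for OvrAvatarEntity.
    }
}
```

You'd attach one of these per hand and point it at the LeftHandSynthetic / RightHandSynthetic objects from your Interaction SDK rig; the avatar-side hand-off is the part that has to match your SDK version.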

Great, thanks so much for the confirmation!

PaulBearerJr
Protege

Just in case anyone in 2025 or beyond wants to try the above GitHub example: it uses the Oculus Integration SDK, which is now deprecated. So don't expect (like I did) that you can just fire up Unity 6 and follow the instructions on the GitHub page; you will be met with numerous compilation errors when adding that folder of scripts to an existing project that uses the Meta XR All-in-One SDK, since the scripts all reference things found only in the Oculus Integration SDK.

Now, I guess you could set up a new project using the Oculus Integration SDK (which isn't recommended, because it's outdated) and try to get everything working... but keep in mind how many things it has to interact with that have most likely changed across numerous SDKs/packages since this GitHub hotfix was put together nearly two years ago. Maybe @PicoPlanetDev can modify what he's done here and create a new repo using the now-standard Meta XR All-in-One SDK... I know I'm being a bit overzealous, but it's worth a shot. 😅

A recent video (3 months ago) from Valem tutorials highlights this issue, and he shows you how to get it working with a pre-built custom avatar (not the official Meta Avatars that you create for yourself). I'm wondering if I can take the same basic workflow from his video and apply it to Meta Avatars? I queued the video up so you can see -  https://youtu.be/_vtAyHOQRUg?si=TbQgw_gEgWDW_7tG&t=210
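
Here's roughly how I picture the video's workflow, as a sketch (untested, and the binding setup and rotation offsets are placeholders I made up): drive each finger bone of the rigged hand mesh from the matching synthetic-hand joint.

```csharp
// Untested sketch of the "drive a pre-built rigged hand from the synthetic hand" idea:
// copy each synthetic-hand joint rotation onto the matching finger bone of the avatar rig.
// The joint-to-bone mapping and rotation offsets are rig-specific and would need tuning.
using Oculus.Interaction;
using Oculus.Interaction.Input;
using UnityEngine;

public class SyntheticHandRetargeter : MonoBehaviour
{
    [System.Serializable]
    public struct JointBinding
    {
        public HandJointId Joint;         // e.g. HandIndex1
        public Transform Bone;            // matching finger bone on the avatar rig
        public Quaternion RotationOffset; // rig-specific correction, tuned by hand
    }

    // Assign the LeftHandSynthetic / RightHandSynthetic object (anything exposing IHand).
    [SerializeField, Interface(typeof(IHand))]
    private UnityEngine.Object _syntheticHand;
    private IHand Hand => _syntheticHand as IHand;

    [SerializeField] private JointBinding[] _bindings;

    private void LateUpdate()
    {
        if (Hand == null || !Hand.IsTrackedDataValid)
        {
            return;
        }

        foreach (var binding in _bindings)
        {
            if (Hand.GetJointPose(binding.Joint, out Pose pose))
            {
                // Drive the rig bone from the synthetic hand joint (world-space rotation).
                binding.Bone.rotation = pose.rotation * binding.RotationOffset;
            }
        }
    }
}
```

The open question is whether the same mapping can target the bones the Meta Avatars runtime drives, or whether it has to go through the custom hand-tracking input route instead.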

Needless to say, I'm hacking away at this, hoping I can get it working with the latest packages/SDKs.

(Attachments: HandsNoGood.png, HandsGood.png)

Thank you for your assistance! I have previously watched this YouTube tutorial and tried the methods shown, but they only apply to the offline (local) version. I'm looking for a way to implement it in the online (networked) version.

The tutorial mentioned adding a "skeleton process aggregator" to the character. I attempted to apply it to the networked characters, but I couldn't directly access the "LeftHandSynthetic" information. Do you have any alternative methods to resolve this issue? I really appreciate your help and response! 😊
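
For context, this is roughly what I attempted on the networked characters (a sketch with assumed object names, and it does not work): only my local interaction rig seems to contain a "LeftHandSynthetic" object, so the lookup fails for the other characters.

```csharp
// Rough sketch of my attempt (assumed names, not working code): find the synthetic hand
// object and hand it to the aggregator component from the tutorial on each networked
// character. On remote characters the lookup returns null, since only my local rig has a
// "LeftHandSynthetic" object; I assume the remote avatars are driven by streamed pose data,
// so maybe only the local avatar entity actually needs the override?
using UnityEngine;

public class AttachSyntheticHandToNetworkedAvatar : MonoBehaviour
{
    private void Start()
    {
        // Lookup by name; only the locally tracked interaction rig has this object.
        GameObject leftHandSynthetic = GameObject.Find("LeftHandSynthetic");
        if (leftHandSynthetic == null)
        {
            Debug.LogWarning($"{name}: could not access LeftHandSynthetic for this character.");
            return;
        }

        // TODO: register leftHandSynthetic with the "skeleton process aggregator" on this
        // character, as the tutorial does for the offline (local-only) setup.
    }
}
```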

I haven't had a chance to play around with this yet. I'm hoping to work on it tonight or tomorrow.