Correct way to use Avatars SDK with Interactions SDK?

beastus
Explorer

I have gone through the Interaction SDK and now understand how to create custom hand poses and behavior, custom interactions, with controllers and/or hand tracking, and so on. However, after having just gone through the Avatars 2 SDK docs and explored the samples it is not at all clear to me how the two SDKs are meant to be used together.

 

Does anyone know of a sample or anything that illustrates how to setup a player, camera rig, etc. to use Meta Avatars with Interaction SDK objects & patterns?

Thank you!

7 REPLIES

beastus
Explorer

P.S. As a first rough experiment I dropped a player controller rig from one of my Interaction SDK projects into my Avatars SDK project and got it working such that my avatar had two sets of hands: the avatar's hands plus the hands from my rig. It actually works to some extent in hand-tracking mode if I then disable the rendering of the extraneous hands, but clearly this isn't the correct way to do it. Also, the avatar's hands are smaller and don't line up at all with the rig's Hands when in controller mode (different hand sizes, different poses in controller mode, different positioning, and so on).
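For what it's worth, the "disable the rendering of the extraneous hands" part of this hack can be done with a small helper script. This is only a sketch of the workaround described above, using standard Unity APIs; `handVisualRoots` is a hypothetical field you would point at the Interaction SDK rig's hand visual objects in the Inspector:

```csharp
using UnityEngine;

// Hedged sketch: hide the rig's duplicate hand meshes so only the
// avatar's hands render. Tracking and interaction logic keep running;
// only the visuals are suppressed.
public class HideRigHands : MonoBehaviour
{
    // Assumed setup: assign the rig's hand visual roots here in the Inspector.
    [SerializeField] private GameObject[] handVisualRoots;

    private void Start()
    {
        foreach (var root in handVisualRoots)
        {
            // Disable every renderer under each hand visual root,
            // including inactive children.
            foreach (var r in root.GetComponentsInChildren<Renderer>(true))
            {
                r.enabled = false;
            }
        }
    }
}
```

This obviously doesn't fix the underlying mismatch in hand size and pose between the two systems; it just hides the duplicate meshes.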

But what is the correct way to set up an OVRPlayerController, CameraRig, AvatarEntity, and so on, such that it works with the Interaction SDK and one can create interactions and hand poses using that SDK's tools?

As a POC I'd like to get the Interaction SDK samples (i.e., Complex Grab, Direct Touch, and Pose Detection) working with an Avatar rather than the plain hands. I feel like this must already exist somewhere?

beastus
Explorer

Alternatively, is there an easier option for building custom hand poses and object interactions? Honestly, I find the XR Interaction Toolkit plus Unity animation better to work with than the Interaction SDK. It took me far too long to recreate my XR-based drawing tools using the Interaction SDK, and I had to resort to hacks.

It would be nice to have a reference mesh for the 2.0 avatars, at least for the hands, so that one could rig it in the normal Unity way and record animations from Unity. I've seen the CustomHandPose example, but it's extremely crude. The code also contains typos and comments like:

// HACK: Random rotations allow us to pass the BodyAPI "is in hand" check. Without it, BodyAPI overrides and goes into rest pose
// I tried to get the random value as small as possible, but at .01 variance, rest pose triggers again 😕


Curious to know what others are doing. Not getting much clarity from the Avatars SDK docs or examples.

Also, are these Meta SDKs for Unity just half-baked? From what I'm seeing, I don't think I can create the kind of experience I want, although, judging by Horizon Worlds and Horizon Workrooms, the avatar system should be capable of it. Perhaps it is simply much better in the Native SDK?

Just pinging this thread because I am also unclear on best practices / documentation for how to combine Avatars 2.0 and the Interaction SDK.

GiulianoDecesares
Honored Guest

Also pinging the thread

idrez
Explorer

Ping! I'd also like to see how to integrate these two technologies. 

PicoPlanetDev
Protege

Hi there, have any of you figured this out? I'm still stuck on the same problem, trying to use the synthetic hands as a data source for the avatar.

PicoPlanetDev
Protege

In case anyone is still searching:

A solution to the avatar synthetic hands problem: I was looking through the Unity-Decommissioned project for some reason yesterday and discovered a few avatar scripts, one of which was a replacement for the Sample Input Manager that lets you select a source for the HMD and hands. The code was old, and I had to replace a GetRootPose call with TryGetRootPose (though I didn't handle the case where it fails), so I thought I'd share my edits in a GitHub repo: https://github.com/PicoPlanetDev/interaction-sdk-avatars2-integration/tree/main
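For anyone adapting that script themselves, the GetRootPose-to-TryGetRootPose change looks roughly like the sketch below. This is an illustrative fragment only: the method names are taken from the post above and may differ between SDK versions, `handSource` stands in for whatever hand interface the input manager exposes, and the cached-pose fallback (which the repo admittedly doesn't do) is my own addition:

```csharp
using UnityEngine;

// Hedged sketch of the try-pattern fix. The old code assumed the root
// pose was always available; TryGetRootPose can fail (e.g. on tracking
// loss), so we cache the last valid pose and fall back to it.
public class SyntheticHandPoseSource : MonoBehaviour
{
    // Assumed: the Interaction SDK synthetic hand this script reads from.
    [SerializeField] private Oculus.Interaction.Input.Hand handSource;

    private Pose _lastKnownPose = Pose.identity;

    public Pose GetHandRootPose()
    {
        if (handSource.GetRootPose(out Pose pose))
        {
            _lastKnownPose = pose; // cache the most recent valid pose
        }
        // On failure, reuse the last known pose instead of returning
        // an uninitialized or identity pose every frame.
        return _lastKnownPose;
    }
}
```

The design point is simply that the failure branch should be explicit rather than ignored, so the avatar hand freezes in place on tracking loss instead of snapping to the origin.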

I included a few comments and screenshots to make it easy to follow.