Experimenting with wiring Oculus Interaction SDK to New Meta Avatars

GameSe7en
Explorer

Pleased with the results of positioning a Pinch Grab for a game piece in my project.

 

[Attached GIF: metaAvatarInteractionSDK.gif]


GameSe7en
Explorer

Lol, I got this working the same day a new version of the SDK dropped, which changes a lot of the components involved.  Just my luck 😩😜

beastus
Explorer

Can you share how you set things up to use the Avatars 2 SDK and Interaction SDK together? I'm not clear on how one should set things up. Thanks!

Hey, sorry I was slow to notice this request.

 

Here are some of the key concepts you'll need to connect these APIs.

 

First, if you've got the Interaction SDK working, you'll have access to a "HandVisual" script for each of the left and right hands.  This is where the visual representation of each hand is maintained, and it holds the rotations for each of the 17 finger joints that the Avatar SDK needs for finger tracking and skinning.
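 

To illustrate, here's a rough sketch (not lifted from my project, just the idea) of reading those joint rotations straight from a HandVisual.  Treat the component name and the serialized "handVisual" field as placeholders for however you want to consume the data:

using Oculus.Interaction;
using UnityEngine;

// Hypothetical helper: logs the finger joint rotations that HandVisual maintains.
public class HandJointInspector : MonoBehaviour
{
    [SerializeField] private HandVisual handVisual;  // assign in the Inspector

    void Update()
    {
        // Joints[2]..Joints[18] are the 17 finger joints whose local rotations
        // the Avatar SDK will ultimately need.
        for (int i = 2; i <= 18; ++i)
        {
            Quaternion localRotation = handVisual.Joints[i].localRotation;
            Debug.Log($"Joint {i}: {localRotation.eulerAngles}");
        }
    }
}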

 

The key thing you will need is to register a HandTrackingDelegate on the Avatar2 SDK's OvrAvatarInputManager, via its BodyTracking property.  Something like this:

 

OvrAvatarInputManager _inputManager;  // Get a reference to an OvrAvatarInputManager, e.g. "SampleInputManager" will suffice if you are using it

_inputManager.BodyTracking.HandTrackingDelegate = new InteractionSdkHandTracker(_inputManager.BodyTracking.HandTrackingDelegate, mappedBoneRotations);

 

The code above replaces the standard hand-tracking delegate with a custom one that you define.

 

The custom Hand Delegate will be called from the Avatar2 SDK's update code.  You may notice that I pass the current delegate into the constructor: that's because I'm chaining the call and still want certain information (e.g. isHandData) from the existing implementation.  This might not be the most efficient approach, but it served my needs.  YMMV.

 

Here is the framework for the custom Hand Delegate:

 

    internal class InteractionSdkHandTracker : IOvrAvatarHandTrackingDelegate
    {
        private IOvrAvatarHandTrackingDelegate baseHandTracking;
        private Transform[] mappedBoneRotations;

        internal InteractionSdkHandTracker(IOvrAvatarHandTrackingDelegate handTrackingDelegate, Transform[] mappedBoneRotations)
        {
            this.baseHandTracking = handTrackingDelegate;
            this.mappedBoneRotations = mappedBoneRotations;
        }

        public bool GetHandData(OvrAvatarTrackingHandsState handData)
        {
            bool isHandData = baseHandTracking.GetHandData(handData);

            // This could be a for loop, but I unrolled it since the Avatar2 SDK does something similar, which is possibly more efficient
            if(isHandData) {
                handData.boneRotations[0] = ConvertQuat(mappedBoneRotations[0].localRotation);
                handData.boneRotations[1] = ConvertQuat(mappedBoneRotations[1].localRotation);
                handData.boneRotations[2] = ConvertQuat(mappedBoneRotations[2].localRotation);
                [ . . . ]
                handData.boneRotations[32] = ConvertQuat(mappedBoneRotations[32].localRotation);
                handData.boneRotations[33] = ConvertQuat(mappedBoneRotations[33].localRotation);
            }

            return isHandData;
        }

        private ovrAvatar2Quatf ConvertQuat(Quaternion q)
        {
            // The Avatar2 SDK uses a different axis orientation.  This converts from Interaction SDK to Avatar SDK conventions.
            ovrAvatar2Quatf result = new ovrAvatar2Quatf();
            result.x = q.x;
            result.y = -q.y;
            result.z = -q.z;
            result.w = q.w;
            return result;
        }
    }

 

The last thing you need is the "mappedBoneRotations" array.  These are references to the 34 finger bones (17 from each hand) that the Avatar SDK needs, taken from the Interaction SDK's HandVisual class.

 

These happen to be the Transforms from HandVisual's "Joints" property, from Joints[2] through Joints[18] for each hand, which correspond to the joints HandThumb0 through HandPinky3.
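 

Concretely, the mapping boils down to the pair of loops you'll see in MapLeftHandBones / MapRightHandBones in the class below (the left hand fills slots 0-16, the right hand fills 17-33):

// Distilled from the full class below; leftHand/rightHand are the HandVisual references.
for (int i = 0; i < 17; ++i)
{
    mappedBoneRotations[i] = leftHand.Joints[i + 2];        // left hand: HandThumb0 .. HandPinky3
    mappedBoneRotations[17 + i] = rightHand.Joints[i + 2];  // right hand: HandThumb0 .. HandPinky3
}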

 

Here is the whole class.  You can attach this script alongside the AvatarSdkManagerHorizon class, or wherever else it suits your project.  Wire in the HandVisual and OvrAvatarInputManager references and this should get you started in the right direction.

 

Apologies for any rough edges.  I put the code aside after going down a DOTS/ECS rabbit hole, so I haven't been as active with it for a few weeks.

 

using Oculus.Avatar2;
using Oculus.Interaction;
using Oculus.Interaction.Input;
using System;
using System.Collections;
using UnityEngine;
using static Oculus.Avatar2.CAPI;

public class AvatarHandTrackerDelegate : MonoBehaviour
{
    // References to wire up in the Inspector
    [SerializeField] private OvrAvatarInputManager _inputManager;
    public HandVisual leftHand, rightHand;

    // Finger bone Transforms: left hand in slots 0-16, right hand in slots 17-33
    private Transform[] mappedBoneRotations;

    private void Awake()
    {
        mappedBoneRotations = new Transform[34];
    }

    void Start()
    {
        MapLeftHandBones(mappedBoneRotations, leftHand);
        MapRightHandBones(mappedBoneRotations, rightHand);

        // Swap in our delegate, chaining the existing hand-tracking delegate behind it
        _inputManager.BodyTracking.HandTrackingDelegate = new InteractionSdkHandTracker(_inputManager.BodyTracking.HandTrackingDelegate, mappedBoneRotations);
    }

    private void MapRightHandBones(Transform[] mappedBoneRotations, HandVisual rightHand)
    {
        for (int i = 0; i < 17; ++i)
            mappedBoneRotations[17+i] = rightHand.Joints[i + 2];
    }

    private void MapLeftHandBones(Transform[] mappedBoneRotations, HandVisual leftHand)
    {
        for (int i = 0; i < 17; ++i)
            mappedBoneRotations[i] = leftHand.Joints[i + 2];
    }

    internal class InteractionSdkHandTracker : IOvrAvatarHandTrackingDelegate
    {
        private IOvrAvatarHandTrackingDelegate baseHandTracking;
        private Transform[] mappedBoneRotations;
        internal InteractionSdkHandTracker(IOvrAvatarHandTrackingDelegate handTrackingDelegate, Transform[] mappedBoneRotations)
        {
            this.baseHandTracking = handTrackingDelegate;
            this.mappedBoneRotations = mappedBoneRotations;
        }
        public bool GetHandData(OvrAvatarTrackingHandsState handData)
        {
            bool isHandData = baseHandTracking.GetHandData(handData);

            if(isHandData) {
                handData.boneRotations[0] = ConvertQuat(mappedBoneRotations[0].localRotation);
                handData.boneRotations[1] = ConvertQuat(mappedBoneRotations[1].localRotation);
                handData.boneRotations[2] = ConvertQuat(mappedBoneRotations[2].localRotation);
                handData.boneRotations[3] = ConvertQuat(mappedBoneRotations[3].localRotation);
                handData.boneRotations[4] = ConvertQuat(mappedBoneRotations[4].localRotation);
                handData.boneRotations[5] = ConvertQuat(mappedBoneRotations[5].localRotation);
                handData.boneRotations[6] = ConvertQuat(mappedBoneRotations[6].localRotation);
                handData.boneRotations[7] = ConvertQuat(mappedBoneRotations[7].localRotation);
                handData.boneRotations[8] = ConvertQuat(mappedBoneRotations[8].localRotation);
                handData.boneRotations[9] = ConvertQuat(mappedBoneRotations[9].localRotation);
                handData.boneRotations[10] = ConvertQuat(mappedBoneRotations[10].localRotation);
                handData.boneRotations[11] = ConvertQuat(mappedBoneRotations[11].localRotation);
                handData.boneRotations[12] = ConvertQuat(mappedBoneRotations[12].localRotation);
                handData.boneRotations[13] = ConvertQuat(mappedBoneRotations[13].localRotation);
                handData.boneRotations[14] = ConvertQuat(mappedBoneRotations[14].localRotation);
                handData.boneRotations[15] = ConvertQuat(mappedBoneRotations[15].localRotation);
                handData.boneRotations[16] = ConvertQuat(mappedBoneRotations[16].localRotation);
                handData.boneRotations[17] = ConvertQuat(mappedBoneRotations[17].localRotation);
                handData.boneRotations[18] = ConvertQuat(mappedBoneRotations[18].localRotation);
                handData.boneRotations[19] = ConvertQuat(mappedBoneRotations[19].localRotation);
                handData.boneRotations[20] = ConvertQuat(mappedBoneRotations[20].localRotation);
                handData.boneRotations[21] = ConvertQuat(mappedBoneRotations[21].localRotation);
                handData.boneRotations[22] = ConvertQuat(mappedBoneRotations[22].localRotation);
                handData.boneRotations[23] = ConvertQuat(mappedBoneRotations[23].localRotation);
                handData.boneRotations[24] = ConvertQuat(mappedBoneRotations[24].localRotation);
                handData.boneRotations[25] = ConvertQuat(mappedBoneRotations[25].localRotation);
                handData.boneRotations[26] = ConvertQuat(mappedBoneRotations[26].localRotation);
                handData.boneRotations[27] = ConvertQuat(mappedBoneRotations[27].localRotation);
                handData.boneRotations[28] = ConvertQuat(mappedBoneRotations[28].localRotation);
                handData.boneRotations[29] = ConvertQuat(mappedBoneRotations[29].localRotation);
                handData.boneRotations[30] = ConvertQuat(mappedBoneRotations[30].localRotation);
                handData.boneRotations[31] = ConvertQuat(mappedBoneRotations[31].localRotation);
                handData.boneRotations[32] = ConvertQuat(mappedBoneRotations[32].localRotation);
                handData.boneRotations[33] = ConvertQuat(mappedBoneRotations[33].localRotation);
            }

            return isHandData;
        }

        private ovrAvatar2Quatf ConvertQuat(Quaternion q)
        {
            // Convert from the Interaction SDK's axis orientation to the Avatar2 SDK's
            ovrAvatar2Quatf result = new ovrAvatar2Quatf();
            result.x = q.x;
            result.y = -q.y;
            result.z = -q.z;
            result.w = q.w;
            return result;
        }
    }
}

 

Hey, thanks for posting the above. I was able to add in the hand visuals and the input manager, but I'm not sure where/how to plug the poke/grab interactors into the hands, since I only use the visuals. The hands in InputOVR have other objects that tie into the InputOVR camera rig, which the Avatar Horizon manager doesn't have. What did you do for that?

andrew2106
Explorer

Awesome help, but could you share some short video tutorials? Thanks!

Hi GameSe7en,
Would you mind sharing the Unity package for reference so that we can see how you did it?

This is great, thanks for sharing. I was wondering if you ever made it work with controllers? I was looking into the Input Controller delegate, but I can only set button/trigger states. I wish I could feed the synthetic hand from the controller into the avatar's hand just as you did!