
Animating legs with Meta avatars

TriplePete
Protege

Hi!

We are using Meta Avatars in some projects, and now we are researching how to activate the legs on the avatars. Both locally and remotely we can render full-body avatars (with legs) perfectly fine, but I can't figure out how to animate the legs.

In the creation info I have activated Leg IK, but I can't find anything more about how to actually enable the IK beyond that.

[Attachment: Leg.png]

17 REPLIES

No progress from me - still looking for solutions.

teawa
Protege

Any update on this? I would’ve assumed a mod would’ve seen the post by now

We had some luck using the Unity skinning rig for the avatars, which exposes the avatars' bones. We saved a copy of those bones, created an animation rig and avatar from their bone layout, used the FinalIK asset to drive that rig with IK, and then copied the transforms of the IK'd rig's bones back onto the avatar's bones at runtime.

It took a while, but we eventually got there. In the end it turned out to be too performance-intensive for our multiplayer application, though, so we scrapped it and will just wait for Meta's leg IK implementation, whenever that comes.
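The transform-copy step described above can be sketched roughly like this. This is a minimal illustration, not the poster's actual code: `_ikBones` and `_avatarBones` are hypothetical, index-aligned arrays you would populate once from the FinalIK-driven duplicate rig and from the avatar's exposed skinning-rig bones.

```csharp
using UnityEngine;

// Sketch of the approach above: FinalIK solves a duplicate rig each frame,
// and we copy the solved bone transforms onto the Meta avatar's skeleton.
public class IkTransformCopier : MonoBehaviour
{
    [SerializeField] private Transform[] _ikBones;     // bones of the FinalIK-driven rig
    [SerializeField] private Transform[] _avatarBones; // matching bones on the avatar's skinning rig

    // LateUpdate runs after FinalIK has solved this frame, so we copy the final pose.
    private void LateUpdate()
    {
        int count = Mathf.Min(_ikBones.Length, _avatarBones.Length);
        for (int i = 0; i < count; i++)
        {
            _avatarBones[i].SetPositionAndRotation(
                _ikBones[i].position,
                _ikBones[i].rotation);
        }
    }
}
```

Note that copying every bone every frame for every avatar is exactly the per-frame cost that made this too expensive for the multiplayer case mentioned above.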

billy0620
Honored Guest

I would appreciate it if you could inform me of any recent developments. Furthermore, are there any other feasible options to animate the avatar's leg?

Sorry to say, but still no progress on this. I haven't found any solution to this. Now I am researching other avatar solutions like Avaturn (https://avaturn.me/) instead.

romi.fauzi
Honored Guest

Scratch that, I just found out that the inputTrackingState doesn't expose feet properties to override.
Here is how I managed to override the hand IK; I believe the same approach should work for foot IK as well, though I haven't actually tried it. In short, I implemented a custom class derived from OvrAvatarInputManager, plus a custom input tracking class that implements IOvrAvatarInputTrackingDelegate, while keeping the default SampleInputControlDelegate for the input control part. Here is the full code of my implementation:

#if USING_XR_MANAGEMENT && USING_XR_SDK_OCULUS && !OVRPLUGIN_UNSUPPORTED_PLATFORM
#define USING_XR_SDK
#endif

using Cysharp.Threading.Tasks;
using Oculus.Avatar2;
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using Node = UnityEngine.XR.XRNode;

public class CustomAvatarInputManager : OvrAvatarInputManager
{
    [SerializeField] private DependencyServer dependencyServer;
    [SerializeField] private GrabEventDispatcher grabEvent;

    private OVRCameraRig _thisOvrCameraRig;

    private const string logScope = "sampleInput";

    private async void Start()
    {
        await UniTask.WaitUntil(() => OVRManager.instance != null);
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
        OvrAvatarLog.LogInfo("Setting Tracking Origin to FloorLevel", logScope, this);

        var instances = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(instances);
        foreach (var instance in instances)
        {
            instance.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);
        }
    }

    protected override void OnTrackingInitialized()
    {
        _thisOvrCameraRig = FindAnyObjectByType<OVRCameraRig>();

        if (BodyTrackingContext != null && BodyTrackingContext is OvrAvatarBodyTrackingContext bodyTracking)
        {
            Debug.Log("Input Tracking Delegate attached");

            bodyTracking.InputTrackingDelegate = new AvatarCustomInputTracking(_thisOvrCameraRig, dependencyServer, grabEvent);
            bodyTracking.InputControlDelegate = new SampleInputControlDelegate();
        }
    }

    public void SetWristOffset(Vector3 leftOffset, Vector3 rightOffset)
    {
        if (BodyTrackingContext != null && BodyTrackingContext is OvrAvatarBodyTrackingContext bodyTracking)
        {
            if (bodyTracking.InputTrackingDelegate is not AvatarCustomInputTracking customTracking) return;

            customTracking.SetWristOffset(leftOffset, rightOffset);
        }
    }
}

internal class AvatarCustomInputTracking : IOvrAvatarInputTrackingDelegate
{
    private Transform _lHandTarget;
    private Transform _rHandTarget;

    private Vector3 _leftWristOffset;
    private Vector3 _rightWristOffset;

    private Transform _trackingSpace;
    private readonly OVRCameraRig _ovrCameraRig;

    public AvatarCustomInputTracking(OVRCameraRig ovrCameraRig, DependencyServer dependencyServer, GrabEventDispatcher grabEvent)
    {
        dependencyServer.AvatarTrackingSpace.OnModelUpdate += t => _trackingSpace = t;
        _ovrCameraRig = ovrCameraRig;
        grabEvent.OnGrabEvent += OnGrabControl;
    }

    public void SetWristOffset(Vector3 leftOffset, Vector3 rightOffset)
    {
        _leftWristOffset = leftOffset;
        _rightWristOffset = rightOffset;
    }

    private void OverrideLeftArmPose(Transform lHandTarget)
    {
        _lHandTarget = lHandTarget;
    }

    private void OverrideRightArmPose(Transform rHandTarget)
    {
        _rHandTarget = rHandTarget;
    }

    // Feeds headset and controller poses to the avatar; returns false when no tracking source is available.
    public bool GetInputTrackingState(out OvrAvatarInputTrackingState inputTrackingState)
    {
        inputTrackingState = default;

        bool leftControllerActive = false;
        bool rightControllerActive = false;
        if (OVRInput.GetActiveController() != OVRInput.Controller.Hands)
        {
            leftControllerActive = OVRInput.GetControllerOrientationTracked(OVRInput.Controller.LTouch);
            rightControllerActive = OVRInput.GetControllerOrientationTracked(OVRInput.Controller.RTouch);
        }

        if (_ovrCameraRig is not null)
        {
            inputTrackingState.headsetActive = true;
            inputTrackingState.leftControllerActive = leftControllerActive;
            inputTrackingState.rightControllerActive = rightControllerActive;
            inputTrackingState.leftControllerVisible = false;
            inputTrackingState.rightControllerVisible = false;
            inputTrackingState.headset = (CAPI.ovrAvatar2Transform)_ovrCameraRig.centerEyeAnchor;
            inputTrackingState.leftController = (CAPI.ovrAvatar2Transform)_ovrCameraRig.leftHandAnchor;
            inputTrackingState.rightController = (CAPI.ovrAvatar2Transform)_ovrCameraRig.rightHandAnchor;

            inputTrackingState = CheckHandGrab(inputTrackingState);

            inputTrackingState = AdjustHandOffset(inputTrackingState);

            return true;
        }
        else if (OVRNodeStateProperties.IsHmdPresent())
        {
            inputTrackingState.headsetActive = true;
            inputTrackingState.leftControllerActive = leftControllerActive;
            inputTrackingState.rightControllerActive = rightControllerActive;
            inputTrackingState.leftControllerVisible = true;
            inputTrackingState.rightControllerVisible = true;

            if (OVRNodeStateProperties.GetNodeStatePropertyVector3(Node.CenterEye, NodeStatePropertyType.Position,
                OVRPlugin.Node.EyeCenter, OVRPlugin.Step.Render, out var headPos))
            {
                inputTrackingState.headset.position = headPos;
            }
            else
            {
                inputTrackingState.headset.position = Vector3.zero;
            }

            if (OVRNodeStateProperties.GetNodeStatePropertyQuaternion(Node.CenterEye, NodeStatePropertyType.Orientation,
                OVRPlugin.Node.EyeCenter, OVRPlugin.Step.Render, out var headRot))
            {
                inputTrackingState.headset.orientation = headRot;
            }
            else
            {
                inputTrackingState.headset.orientation = Quaternion.identity;
            }

            inputTrackingState.headset.scale = Vector3.one;

            inputTrackingState.leftController.position = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
            inputTrackingState.leftController.orientation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch);
            inputTrackingState.rightController.position = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
            inputTrackingState.rightController.orientation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);

            inputTrackingState = CheckHandGrab(inputTrackingState);

            inputTrackingState = AdjustHandOffset(inputTrackingState);

            inputTrackingState.leftController.scale = Vector3.one;
            inputTrackingState.rightController.scale = Vector3.one;
            return true;
        }

        return false;
    }

    // While a grab is active, replace the controller pose with the grab target's pose (in tracking space).
    private OvrAvatarInputTrackingState CheckHandGrab(OvrAvatarInputTrackingState inputTrackingState)
    {
        if (_lHandTarget != null)
        {
            Pose leftHandPose = GetTrackingSpacePose(_lHandTarget.position, _lHandTarget.rotation);
            inputTrackingState.leftController.position = leftHandPose.position;
            inputTrackingState.leftController.orientation = leftHandPose.rotation;
        }

        if (_rHandTarget != null)
        {
            Pose rightHandPose = GetTrackingSpacePose(_rHandTarget.position, _rHandTarget.rotation);
            
            inputTrackingState.rightController.position = rightHandPose.position;
            inputTrackingState.rightController.orientation = rightHandPose.rotation;
        }

        return inputTrackingState;
    }

    private OvrAvatarInputTrackingState AdjustHandOffset(OvrAvatarInputTrackingState inputTrackingState)
    {
        inputTrackingState.leftController.position += _leftWristOffset;
        inputTrackingState.rightController.position += _rightWristOffset;

        return inputTrackingState;
    }

    private Pose GetTrackingSpacePose(Vector3 worldPosition, Quaternion worldRotation)
    {
        Vector3 position = _ovrCameraRig.trackingSpace.InverseTransformPoint(worldPosition);
        Quaternion rotation = Quaternion.Inverse(_ovrCameraRig.trackingSpace.rotation) * worldRotation;

        return new Pose(position, rotation);
    }

    private void OnGrabControl(GrabEventStatus status, ControllerSide side, Transform transform)
    {
        switch (side)
        {
            case ControllerSide.Left:
                OverrideLeftArmPose(status != GrabEventStatus.GrabEnd ? transform : null);
                break;
            case ControllerSide.Right:
                OverrideRightArmPose(status != GrabEventStatus.GrabEnd ? transform : null);
                break;
        }
    }
}

Hope this helps

the_sammer
Honored Guest

If your avatar prefab doesn't have the MecanimLegsAnimationController component added, then I'd recommend trying that. You might also need Leg IK toggled on; not sure.

OK, I ended up getting the legs animating by adding OvrAvatarAnimationBehaviour to the same object as my local avatar's AvatarEntity, with the customRigPrefab field referencing the HumanoidAvatarRigVariant prefab. I found the Leg IK flag in creationInfo.features not to be necessary.
Also be careful: this only animates the legs of avatars that are controlled by your avatar, so if you turn on a third-person perspective for your own avatar, this method will not work on your own avatar, or at least I don't know how to make it work.
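For reference, the setup described above could be sketched from code along these lines. This is a hedged sketch, not a confirmed API: `humanoidRigVariant` is an assumed serialized reference to the SDK's HumanoidAvatarRigVariant prefab, the `customRigPrefab` field name is taken from the post above, and whether that field is assignable from code depends on your SDK version (if it is only serialized, wire it up in the Inspector instead).

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Sketch of the setup above: put OvrAvatarAnimationBehaviour on the same
// GameObject as the local AvatarEntity and point its customRigPrefab field
// at the HumanoidAvatarRigVariant prefab. Field names follow the post;
// verify their accessibility in your version of the Meta Avatars SDK.
[RequireComponent(typeof(OvrAvatarEntity))]
public class LegAnimationSetup : MonoBehaviour
{
    // Assumed reference to the SDK's HumanoidAvatarRigVariant prefab.
    [SerializeField] private GameObject humanoidRigVariant;

    private void Awake()
    {
        var animBehaviour = gameObject.AddComponent<OvrAvatarAnimationBehaviour>();
        // Hypothetical assignment; if customRigPrefab is not public in your
        // SDK version, assign it on the component in the Inspector instead.
        animBehaviour.customRigPrefab = humanoidRigVariant;
    }
}
```

As noted above, this only animates the legs of avatars driven by your entity, so it may not apply when viewing your own avatar in third person.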