
Meta Avatars: how to feed custom input?

Anonymous
Not applicable

Hi,

 

I'm prototyping with the new Meta Avatars system and am using the SampleAvatarEntity script to instantiate fake avatars. I'd like to replay head pose, controller states, and/or hand states (positions) in order to drive these avatars from pre-recorded data in my scene.

 

Using SampleInputManager (included with the avatar examples) as a starting point, I assumed all I had to do was provide my own InputTrackingDelegate, InputControlDelegate, and (for hand tracking) HandTrackingDelegate. I started with the code below, feeding in fake controller inputs and marking the controllers as active, but it has no effect. The avatar spawns in a T-pose and does nothing.

 

public class ReplayAvatarInputManager : OvrAvatarInputManager
{
    protected void Awake()
    {
    }

    private void Start()
    {
        if (BodyTracking != null)
        {
            BodyTracking.InputTrackingDelegate = new ReplayInputTrackingDelegate();
            BodyTracking.InputControlDelegate = new SampleInputControlDelegate();
        }
    }

    protected override void OnDestroyCalled()
    {
        base.OnDestroyCalled();
    }
}

public class ReplayInputTrackingDelegate : OvrAvatarInputTrackingDelegate
{
    public ReplayInputTrackingDelegate()
    {
    }

    public override bool GetRawInputTrackingState(out OvrAvatarInputTrackingState inputTrackingState)
    {
        inputTrackingState = new OvrAvatarInputTrackingState();
        inputTrackingState.headsetActive = true;
        inputTrackingState.leftControllerActive = true;
        inputTrackingState.rightControllerActive = true;
        inputTrackingState.leftControllerVisible = true;
        inputTrackingState.rightControllerVisible = true;
        inputTrackingState.headset.position = Vector3.zero;
        inputTrackingState.headset.orientation = Quaternion.identity;
        inputTrackingState.headset.scale = Vector3.one;
        inputTrackingState.leftController.position = 2*Vector3.one;
        inputTrackingState.rightController.position = Vector3.one;
        inputTrackingState.leftController.orientation = Quaternion.identity;
        inputTrackingState.rightController.orientation = Quaternion.identity;
        inputTrackingState.leftController.scale = Vector3.one;
        inputTrackingState.rightController.scale = Vector3.one;
        return true;
    }
}

 

I've confirmed that the delegate method is called, but the data it returns is not being used. What am I doing wrong? I've made sure to connect this ReplayAvatarInputManager behavior to the Tracking Input property on SampleAvatarEntity.

 

Thanks,

 

Bart

14 REPLIES

Did you get anywhere with this? I'm having similar issues.

Anonymous
Not applicable

I sure did. Here is my replay input manager (it also does a couple of other things). I didn't include the replay samples, but they are just time-stamped structs containing Matrix4x4s of all the relevant nodes and bones. Note that the hand bone matrices are *not* localToWorldMatrix. Rather, they are just the local transform, which is constructed as:

 

Matrix4x4.TRS(bone.transform.localPosition, bone.transform.localRotation, bone.transform.localScale)

 

Everything else is just the localToWorldMatrix (head and hands).
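
To make that concrete, here is a short illustrative fragment (headAnchor and leftHandBoneTransforms are placeholder names, not from my actual recorder):

```
// Head and hand anchors: full world-space matrices.
Matrix4x4 headPose = headAnchor.localToWorldMatrix;

// Hand bones: local transforms only, each relative to its parent bone.
var leftHandBones = new List<Matrix4x4>();
foreach (Transform bone in leftHandBoneTransforms)
{
    leftHandBones.Add(Matrix4x4.TRS(bone.localPosition, bone.localRotation, bone.localScale));
}
```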

 

Also, avatarAnchor is something you don't need. It's just the transform of an object I want to parent my avatars to. If the avatar anchor local space option is deselected, everything will be in world space.

 

```

/*
 * ReplayAvatarInputManager.cs
 *
 * Implementation of a body tracking input manager for Meta Avatars. Reads pre-
 * recorded avatar motion data (stored as head, hand, and hand bone transforms
 * over time) and feeds them into the IK system.
 * 
 * This script should be attached to the top-level avatar object. To preserve
 * the spatial relationship between the avatar anchor and the avatar as it was
 * during recording, make the avatar a child of the scene's avatar anchor and
 * enable the avatar anchor local space option.
 * 
 * At the time of this writing, hands appear to be handled as follows:
 * 
 * 1. The hand anchors are used as the wrist position transforms. In the
 *    typical OVR camera rig, the hand anchor is the parent node of the
 *    hand prefab. The hierarchy looks like this:
 *    
 *    OVRCameraRig
 *    |
 *    +-- TrackingSpace
 *        |
 *        +-- LeftHandAnchor
 *            |
 *            +-- LeftControllerAnchor
 *            |
 *            +-- OVRHandPrefabLeft
 *                |
 *                +-- Bones
 *                    |
 *                    +-- Wrist
 *                        |
 *                        ... Various bones
 *
 *     The wrist transform is actually ignored; LeftHandAnchor is used as the
 *     left wrist.
 *
 *  2. The coordinate space of the wrist transform is converted to a right-
 *     handed system by negating the z component of the position and the x and
 *     y components of the rotation.
 *  3. The OVR hand prefab has a non-unit scaling that is based on the user's
 *     hand size and varies over time. This does not appear to be used by the
 *     avatars. Therefore, either the localToWorldMatrix of the node "Bones"
 *     or just "LeftHandAnchor" is safe to use. Neither the hand prefab node
 *     nor the bones root node apply any translation or rotation.
 *  4. The various individual hand bones that the avatar system accepts are
 *     specified only as rotations and must be converted to the appropriate
 *     space as well. Strangely, for reasons I don't understand, the conversion
 *     is different than the ConvertSpace() function present in the
 *     Oculus.Avatars2 code. Rather than negating x and y (for a conversion
 *     between LHS and RHS), it is y and z that must be negated.
 *
 * This information was discovered by examining what the default hand
 * tracking delegate feeds into the system. Unfortunately, the hand delegate is
 * not implemented in C# and implements an interface
 * IOvrAvatarNativeHandDelegate that indicates to the system there is a native
 * callback present. In OvrBodyTrackingContext, this interface is detected and
 * a native context is set up. Fortunately, HandTrackingDelegate 
 * (OvrPluginTracking.cs) implements the GetHandData() method, which appears to
 * call the native system. By removing the conformance to 
 * IOvrAvatarNativeHandDelegate, we can obtain the IK input values and compare
 * them to the OVR rig to deduce the transformations required.
 */

using System;
using System.IO;
using System.Collections.Generic;
using Oculus.Avatar2;
using UnityEngine;

public class ReplayAvatarInputManager : OvrAvatarInputManager, IReplayTimeProvider
{
    public enum DebugTextMode
    {
        None,           // display nothing
        PlaybackTime    // playback time in seconds
    }

    [Tooltip("Avatar pose recording to play back.")]
    [SerializeField]
    private TextAsset _poseRecording;

    [Tooltip("Avatar audio recording to play back.")]
    [SerializeField]
    private AudioClip _audioRecording;

    [Tooltip("Avatar pose recording to play back (loads from file system; used only if resource path is empty and will only work in Editor).")]
    [SerializeField]
    private string _poseRecordingFilepath = "c:\\Users\\Bart\\AppData\\LocalLow\\Cambrian Moment\\Test-MRTK\\AvatarPoseRecording.txt";

    [Tooltip("Avatar audio recording to play back (loads from file system; used only if resource path is empty and will only work in Editor).")]
    [SerializeField]
    private string _audioRecordingFilepath = "c:\\Users\\Bart\\AppData\\LocalLow\\Cambrian Moment\\Test-MRTK\\MicrophoneAudio.wav";

    [Tooltip("If checked, converts all poses into the local coordinate space of the avatar anchor stored in the recording. Make the avatar game object this script is attached to a child of the local avatar anchor and the relative positioning will be preserved.")]
    [SerializeField]
    private bool _useAvatarAnchorLocalSpace = true;

    [SerializeField]
    private TMPro.TextMeshPro _debugText;

    [SerializeField]
    private DebugTextMode _debugTextMode = DebugTextMode.None;

    // Recording state
    private List<AvatarRecordingSample> _samples = new List<AvatarRecordingSample>();
    private int _playbackIdx = -1;
    private float _playbackStartedAt = 0;

    // As recording is played back, latest IK inputs are written here
    private OvrAvatarInputTrackingState _currentInputTracking = new OvrAvatarInputTrackingState();
    private OvrAvatarTrackingHandsState _currentTrackingHands = new OvrAvatarTrackingHandsState();

    // Audio component for playback
    private AudioSource _audio;

    public float ReplayTime
    {
        get { return (_playbackIdx >= 0 && _playbackIdx < _samples.Count) ? (Time.time - _playbackStartedAt) : -1; }
    }

    private class ReplayHandTrackingDelegate : IOvrAvatarHandTrackingDelegate
    {
        private Func<OvrAvatarTrackingHandsState> _GetCurrentTrackingHandsState;

        public ReplayHandTrackingDelegate(Func<OvrAvatarTrackingHandsState> GetCurrentTrackingHandsState)
        {
            _GetCurrentTrackingHandsState = GetCurrentTrackingHandsState;
        }

        public bool GetHandData(OvrAvatarTrackingHandsState handData)
        {
            OvrAvatarTrackingHandsState currentTrackingHands = _GetCurrentTrackingHandsState();
            handData.isConfidentLeft = currentTrackingHands.isConfidentLeft;
            handData.isConfidentRight = currentTrackingHands.isConfidentRight;
            handData.isTrackedLeft = currentTrackingHands.isTrackedLeft;
            handData.isTrackedRight = currentTrackingHands.isTrackedRight;
            handData.wristPosLeft = currentTrackingHands.wristPosLeft;
            handData.wristPosRight = currentTrackingHands.wristPosRight;
            if (handData.boneRotations.Length == currentTrackingHands.boneRotations.Length)
            {
                for (int i = 0; i < handData.boneRotations.Length; i++)
                {
                    handData.boneRotations[i] = currentTrackingHands.boneRotations[i];
                }
            }
            return true;
        }
    }

    private class ReplayInputTrackingDelegate : OvrAvatarInputTrackingDelegate
    {
        private Func<OvrAvatarInputTrackingState> _GetCurrentInputTrackingState;

        public ReplayInputTrackingDelegate(Func<OvrAvatarInputTrackingState> GetCurrentInputTrackingState)
        {
            _GetCurrentInputTrackingState = GetCurrentInputTrackingState;
        }

        public override bool GetRawInputTrackingState(out OvrAvatarInputTrackingState inputTrackingState)
        {
            inputTrackingState = _GetCurrentInputTrackingState();
            return true;
        }
    }

    private class ReplayInputControlDelegate : OvrAvatarInputControlDelegate
    {
        public override bool GetInputControlState(out OvrAvatarInputControlState inputControlState)
        {
            inputControlState = new OvrAvatarInputControlState();
            inputControlState.type = GetControllerType();
            inputControlState.leftControllerState.isActive = false;
            inputControlState.leftControllerState.isVisible = false;
            inputControlState.rightControllerState.isActive = false;
            inputControlState.rightControllerState.isVisible = false;
            return true;
        }
    }

    private void Start()
    {
        _audio = GetComponent<AudioSource>();

        LoadRecording();
        if (_useAvatarAnchorLocalSpace)
        {
            ConvertRecordingToAvatarAnchorLocalSpace();
        }

        if (BodyTracking != null)
        {
            BodyTracking.InputTrackingDelegate = new ReplayInputTrackingDelegate(() => _currentInputTracking);
            BodyTracking.InputControlDelegate = new ReplayInputControlDelegate();
            BodyTracking.HandTrackingDelegate = new ReplayHandTrackingDelegate(() => _currentTrackingHands);
        }
    }

    private void OnEnable()
    {
    }

    private void OnDisable()
    {
        _audio?.Stop();
    }

    protected override void OnDestroyCalled()
    {
        base.OnDestroyCalled();
    }

    private void LateUpdate()
    {
        // Must have samples to play back
        if (_samples.Count <= 0)
        {
            return;
        }

        if (_playbackIdx >= _samples.Count || _playbackIdx < 0)
        {
            // We have finished playback (or have not yet started). Reset.
            ResetPlayback();
        }

        float currentTime = ReplayTime;
        UpdateDebugText(currentTime);

        // Update audio
        if (_audio && !_audio.isPlaying)
        {
            // We are within the recording time frame but audio is not playing.
            // This indicates we have returned to a channel mid-playback.
            _audio.time = currentTime < _audio.clip.length ? currentTime : 0;
            _audio.Play();
        }

        // Hand pose sample
        if (currentTime < _samples[_playbackIdx].time)
        {
            // Not ready to play back next sample yet
            return;
        }
        while (_playbackIdx < _samples.Count && currentTime >= _samples[_playbackIdx].time)
        {
            // Fast forward through completed time samples
            _playbackIdx += 1;
        }
        SetPose(_playbackIdx - 1);
    }

    private void ResetPlayback()
    {
        _playbackIdx = 0;
        _playbackStartedAt = Time.time;
        if (_audio && _audio.clip != null)
        {
            _audio.Stop();
            _audio.time = 0;
            _audio.Play();
        }
    }

    private void LoadRecording()
    {
        // Pose recording
        try
        {
            string[] lines = _poseRecording == null 
                ? File.ReadAllLines(_poseRecordingFilepath)
                : _poseRecording.text.Split(new string[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries);
            foreach (string line in lines)
            {
                _samples.Add(new AvatarRecordingSample(line));
            }
            Debug.LogFormat("Loaded pose recording from {0}", _poseRecording == null ? _poseRecordingFilepath : "text resource");
        }
        catch (Exception e)
        {
            Debug.LogException(e);
        }

        // Audio
        if (_audio)
        {
            if (_audioRecording)
            {
                _audio.clip = _audioRecording;
                Debug.Log("Loaded audio clip from resources");
            }
            else
            {
                LoadAudioClipFromFileSystem();
            }
        }
    }

    private void LoadAudioClipFromFileSystem()
    {
        try
        {
            _audio.clip = WavFile.Load(_audioRecordingFilepath);
            Debug.LogFormat("Loaded audio clip from {0}", _audioRecordingFilepath);
        }
        catch (Exception e)
        {
            Debug.LogException(e);
        }
    }

    private void UpdateDebugText(float currentTime)
    {
        if (_debugText == null)
        {
            return;
        }

        switch (_debugTextMode)
        {
            default:
            case DebugTextMode.None:
                _debugText.enabled = false;
                break;
            case DebugTextMode.PlaybackTime:
                _debugText.enabled = true;
                _debugText.text = string.Format("{0:f2}", currentTime);
                break;
        }
    }

    private void SetPose(int idx)
    {
        UpdateInputTracking(idx);
        UpdateHandTracking(idx);
    }

    private void UpdateHandTracking(int idx)
    {
        /*
         * Hand bones are recorded as:
         *
         *   0. wrist
         *   1. left thumb 0 (meta)
         *   2. left thumb 1 (proximal)
         *   3. left thumb 2 (intermediate)
         *   4. left thumb 3 (distal)
         *   5. left index 1 (proximal)
         *   6. left index 2 (intermediate)
         *   7. left index 3 (distal)
         *   8. left middle 1 (proximal)
         *   9. left middle 2 (intermediate)
         *  10. left middle 3 (distal)
         *  11. left ring 1 (proximal)
         *  12. left ring 2 (intermediate)
         *  13. left ring 3 (distal)
         *  14. left pinky 0 (meta)
         *  15. left pinky 1 (proximal)
         *  16. left pinky 2 (intermediate)
         *  17. left pinky 3 (distal)
         *
         * The bone rotations passed to the avatars IK system are stored in an
         * array of 34 values consisting of left hand and right hand bones, in
         * the same order but excluding wrists. That is:
         * 
         *   0. left thumb 0 (meta)
         *   1. left thumb 1 (proximal)
         *  ...
         *  16. left pinky 3 (distal)
         *  17. right thumb 0 (meta)
         *  ...
         *  33. right pinky 3 (distal)
         */

        // Assume hand tracking is enabled (if we want to support controller-
        // only recordings, these should be set false)
        _currentTrackingHands.isConfidentLeft = true;
        _currentTrackingHands.isConfidentRight = true;
        _currentTrackingHands.isTrackedLeft = true;
        _currentTrackingHands.isTrackedRight = true;

        // Wrist positions (these are just the hand anchors). These can be
        // converted using the built-in RHS<->LHS conversion (negate z
        // translation component, negate quaternion x and y).
        _currentTrackingHands.wristPosLeft = new CAPI.ovrAvatar2Transform(_samples[idx].leftHandPose.Translation(), _samples[idx].leftHandPose.Rotation(), _samples[idx].leftHandPose.Scale()).ConvertSpace();
        _currentTrackingHands.wristPosRight = new CAPI.ovrAvatar2Transform(_samples[idx].rightHandPose.Translation(), _samples[idx].rightHandPose.Rotation(), _samples[idx].rightHandPose.Scale()).ConvertSpace();

        // Hand bones
        bool correctNumberOfBones = 
            (_samples[idx].leftHandBones.Count == _samples[idx].rightHandBones.Count) &&
            (_currentTrackingHands.boneRotations.Length == _samples[idx].leftHandBones.Count + _samples[idx].rightHandBones.Count - 2); // all bones excluding wrists
        if (correctNumberOfBones)
        {
            int numJoints = _samples[idx].leftHandBones.Count - 1;
            for (int i = 0; i < numJoints; i++)
            {
                Quaternion leftJoint = ConvertJointRotationToCAPI(_samples[idx].leftHandBones[i + 1].Rotation());   // shifted over by 1 because we ignore recorded index 0 (wrist)
                Quaternion rightJoint = ConvertJointRotationToCAPI(_samples[idx].rightHandBones[i + 1].Rotation()); // shifted over by 1 because we ignore recorded index 0 (wrist)
                _currentTrackingHands.boneRotations[i] = leftJoint;
                _currentTrackingHands.boneRotations[numJoints + i] = rightJoint;
            }
        }
        else
        {
            for (int i = 0; i < _currentTrackingHands.boneRotations.Length; i++)
            {
                _currentTrackingHands.boneRotations[i] = Quaternion.identity;
            }
        }
    }
    
    private void UpdateInputTracking(int idx)
    {
        _currentInputTracking.headsetActive = true;
        _currentInputTracking.leftControllerActive = false;
        _currentInputTracking.rightControllerActive = false;
        _currentInputTracking.leftControllerVisible = false;
        _currentInputTracking.rightControllerVisible = false;
        _currentInputTracking.headset.position = _samples[idx].headPose.Translation();
        _currentInputTracking.headset.orientation = _samples[idx].headPose.Rotation();
        _currentInputTracking.headset.scale = Vector3.one;
        _currentInputTracking.leftController.position = _samples[idx].leftHandPose.Translation();
        _currentInputTracking.rightController.position = _samples[idx].rightHandPose.Translation();
        _currentInputTracking.leftController.orientation = _samples[idx].leftHandPose.Rotation();
        _currentInputTracking.rightController.orientation = _samples[idx].rightHandPose.Rotation();
        _currentInputTracking.leftController.scale = _samples[idx].leftHandPose.Scale();
        _currentInputTracking.rightController.scale = _samples[idx].rightHandPose.Scale();
    }

    private void ConvertRecordingToAvatarAnchorLocalSpace()
    {
        for (int i = 0; i < _samples.Count; i++)
        {
            var sample = _samples[i];
            sample.headPose = ConvertPoseFromWorldToAvatarAnchorSpace(sample.headPose, sample.anchorPose);
            sample.leftHandPose = ConvertPoseFromWorldToAvatarAnchorSpace(sample.leftHandPose, sample.anchorPose);
            sample.rightHandPose = ConvertPoseFromWorldToAvatarAnchorSpace(sample.rightHandPose, sample.anchorPose);
            _samples[i] = sample;
        }
    }

    /// <summary>
    ///  Converts a joint rotation to the Meta Avatar C API coordinate space.
    ///  It is unknown why this conversion is different from the ConvertSpace()
    ///  function in Oculus.Avatars2.
    /// </summary>
    /// <param name="q">
    ///     Quaternion representing a hand joint rotation in its
    ///     local frame.
    /// </param>
    /// <returns>Converted rotation.</returns>
    private Quaternion ConvertJointRotationToCAPI(Quaternion q)
    {
        return new Quaternion(q.x, -q.y, -q.z, q.w);
    }

    /// <summary>
    /// Converts a transform matrix from world space to the local coordinate
    /// space of the avatar anchor.
    /// </summary>
    /// <param name="pose">Transform (local to world) to convert.</param>
    /// <param name="anchorPose">
    ///     Avatar anchor transform(local to world) into whose space we will
    ///     convert the pose.
    /// </param>
    /// <returns>Pose matrix in the anchor's local coordinate space.</returns>
    private Matrix4x4 ConvertPoseFromWorldToAvatarAnchorSpace(Matrix4x4 pose, Matrix4x4 anchorPose)
    {
        return anchorPose.inverse * pose;
    }
}
```

 

 

@Anonymous This is super useful, thanks for sharing.

 

My use case is avatar control (including multiplayer avatar control) for camera rigs other than the OVRCameraRig. I'm guessing I can just create a new tracking delegate and feed it the correct poses, the same way you have done for your replay system.

Anonymous
Not applicable

Yeah, this should help get you going for sure! Hopefully my code makes sense. The most important bits are in the functions called by SetPose().

ahmad.alsaleem.58
Honored Guest

This is great. Can you please share the data recording code if you have it?

Anonymous
Not applicable

Yes, I can, but it's very specific to my project and you'll have to do some work on your own to port it to your case. Some things to keep in mind:

 

1) I emit text files. These are super slow to load and will lock up your application for many seconds. I recommend moving to a binary format and also streaming the file in as chunks (see the sketch after these notes).

 

2) I use Normcore avatars in my project as well (legacy leftover stuff I didn't remove), so you will see me looking for objects called "Head", "Left Hand", and "Right Hand". These are just driven directly by the OVRCameraRig's camera and controller components.
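
If you do go binary, here is a rough, untested sketch of writing one sample with BinaryWriter. It assumes the AvatarRecordingSample struct shown below; WriteMatrix and WriteSample are illustrative helpers, not part of my recorder:

```
using System.IO;
using UnityEngine;

// Hypothetical sketch only: serialize one sample as raw binary instead of text.
public static class BinaryRecordingWriter
{
    private static void WriteMatrix(BinaryWriter writer, Matrix4x4 m)
    {
        for (int i = 0; i < 16; i++)
        {
            writer.Write(m[i]);  // Unity's Matrix4x4 indexer: 16 floats, column-major
        }
    }

    public static void WriteSample(BinaryWriter writer, AvatarRecordingSample sample)
    {
        writer.Write(sample.time);
        writer.Write(sample.leftHandVisible);
        writer.Write(sample.rightHandVisible);
        WriteMatrix(writer, sample.anchorPose);
        WriteMatrix(writer, sample.headPose);
        WriteMatrix(writer, sample.leftHandPose);
        WriteMatrix(writer, sample.rightHandPose);
        writer.Write(sample.leftHandBones.Count);
        foreach (var bone in sample.leftHandBones) WriteMatrix(writer, bone);
        writer.Write(sample.rightHandBones.Count);
        foreach (var bone in sample.rightHandBones) WriteMatrix(writer, bone);
    }
}
```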

 

RecordAvatarClickHandler.cs:

 

/*
 * RecordAvatarClickHandler.cs
 */
using System.IO;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

using Normal.Realtime;

public class RecordAvatarClickHandler : MonoBehaviour
{
    [SerializeField]
    private KeyCode _editorSimulateRecordButtonClickedKey = KeyCode.R;

    private List<AvatarRecordingSample> _samples = new List<AvatarRecordingSample>();
    private bool _recording = false;
    private float _recordingStartedAt = 0;
    private Transform _anchor;
    private Transform _head;
    private Transform _leftHand;    // these come from the local avatar's synced transforms but must correspond to the Oculus hand anchor positions, otherwise Meta Avatars playback may be wrong
    private Transform _rightHand;

    private OVRSkeleton _ovrLeftHand;
    private OVRSkeleton _ovrRightHand;

    private void Start()
    {
        // Get Oculus hands
        foreach (var skeleton in FindObjectsOfType<OVRSkeleton>())
        {
            if (!skeleton.name.ToLower().Contains("hand"))
            {
                continue;
            }
            if (skeleton.name.ToLower().Contains("left"))
            {
                _ovrLeftHand = skeleton;
            }
            if (skeleton.name.ToLower().Contains("right"))
            {
                _ovrRightHand = skeleton;
            }
        }
    }

    public void OnButtonClicked()
    {
        _recording = !_recording;
#if UNITY_EDITOR
        if (_recording)
        {
            StartRecording();
        }
        else
        {
            StopRecording();
        }
#endif
    }

    private void StartRecording()
    {
        Debug.Log("Recording started");
        _samples.Clear();
        _recordingStartedAt = Time.time;
        _anchor = null;
        _head = null;
        _leftHand = null;
        _rightHand = null;
        FindLocalAvatarAndAnchor();
    }

    private void StopRecording()
    {
        //Trim();   // trim first and last second
        SaveToFile();
        _samples.Clear();
        Debug.Log("Recording stopped");
    }

    private void FindLocalAvatarAndAnchor()
    {
        Betterverse.RealtimeAvatarManager avatarManager = FindObjectOfType<Betterverse.RealtimeAvatarManager>();
        if (!avatarManager)
        {
            Debug.LogError("RealtimeAvatarManager not found in scene");
            return;
        }

        Betterverse.RealtimeAvatar ourAvatar = avatarManager.localAvatar;
        if (!ourAvatar)
        {
            Debug.LogError("No local avatar");
            return;
        }

        var anchor = FindObjectOfType<AvatarAnchor>();
        if (!anchor)
        {
            Debug.LogError("No avatar anchor in scene");
            return;
        }
        _anchor = anchor.transform;

        // For now, we assume avatar components are: SyncedToRemoteTransforms/{Head,LeftHand,RightHand}
        // These objects are driven by Normcore.
        Transform remoteTransforms = ourAvatar.transform.FindChildRecursive("SyncedToRemoteTransforms");
        if (!remoteTransforms)
        {
            Debug.LogError("Local avatar lacks expected SyncedToRemoteTransforms node");
            return;
        }

        _head = remoteTransforms.transform.FindChildRecursive("Head");
        _leftHand = remoteTransforms.transform.FindChildRecursive("Left Hand");
        _rightHand = remoteTransforms.transform.FindChildRecursive("Right Hand");
    }

    private void SaveToFile()
    {
        string filepath = Path.Combine(Application.persistentDataPath, "AvatarPoseRecording.txt");
        var lines = _samples.Select(sample => sample.Serialize());
        try
        {
            File.WriteAllLines(filepath, lines);
            Debug.LogFormat("Wrote recording to file: {0}", filepath);
        }
        catch (System.Exception e)
        {
            Debug.LogException(e);
        }
    }

    private void LateUpdate()
    {
#if UNITY_EDITOR
        if (Input.GetKeyDown(_editorSimulateRecordButtonClickedKey))
        {
            OnButtonClicked();
        }

        if (_recording)
        {
            Matrix4x4 anchorPose = Matrix4x4.identity;
            Matrix4x4 headPose = Matrix4x4.identity;
            Matrix4x4 leftHandPose = Matrix4x4.identity;
            Matrix4x4 rightHandPose = Matrix4x4.identity;
            
            if (_anchor)
            {
                anchorPose = _anchor.localToWorldMatrix;
            }

            if (_head)
            {
                headPose = _head.localToWorldMatrix;
            }

            if (_leftHand)
            {
                leftHandPose = _leftHand.localToWorldMatrix;
            }

            if (_rightHand)
            {
                rightHandPose = _rightHand.localToWorldMatrix;
            }

            AvatarRecordingSample sample = new AvatarRecordingSample();
            sample.time = Time.time - _recordingStartedAt;
            sample.leftHandVisible = _leftHand && _leftHand.gameObject.activeSelf;
            sample.rightHandVisible = _rightHand && _rightHand.gameObject.activeSelf;
            sample.anchorPose = anchorPose;
            sample.headPose = headPose;
            sample.leftHandPose = leftHandPose;
            sample.rightHandPose = rightHandPose;
            sample.leftHandBones = GetHandBonePoses(_ovrLeftHand);
            sample.rightHandBones = GetHandBonePoses(_ovrRightHand);
            _samples.Add(sample);
        }
#endif
    }

    private List<Matrix4x4> GetHandBonePoses(OVRSkeleton skeleton)
    {
        List<Matrix4x4> poses = new List<Matrix4x4>();
        if (skeleton != null)
        {
            OVRSkeleton.BoneId[] boneOrder = new OVRSkeleton.BoneId[]
            {
                // Make sure this is consistent with avatar playback code
                // (ReplayAvatarInputManager.cs)
                OVRSkeleton.BoneId.Hand_WristRoot,  // 0
                OVRSkeleton.BoneId.Hand_Thumb0,     // 1
                OVRSkeleton.BoneId.Hand_Thumb1,     // 2
                OVRSkeleton.BoneId.Hand_Thumb2,     // 3
                OVRSkeleton.BoneId.Hand_Thumb3,     // 4
                OVRSkeleton.BoneId.Hand_Index1,     // 5
                OVRSkeleton.BoneId.Hand_Index2,     // 6
                OVRSkeleton.BoneId.Hand_Index3,     // 7
                OVRSkeleton.BoneId.Hand_Middle1,    // 8
                OVRSkeleton.BoneId.Hand_Middle2,    // 9
                OVRSkeleton.BoneId.Hand_Middle3,    // 10
                OVRSkeleton.BoneId.Hand_Ring1,      // 11
                OVRSkeleton.BoneId.Hand_Ring2,      // 12
                OVRSkeleton.BoneId.Hand_Ring3,      // 13
                OVRSkeleton.BoneId.Hand_Pinky0,     // 14
                OVRSkeleton.BoneId.Hand_Pinky1,     // 15
                OVRSkeleton.BoneId.Hand_Pinky2,     // 16
                OVRSkeleton.BoneId.Hand_Pinky3      // 17
            };

            // Create transform matrices for each bone. We cannot use localToWorld because each bone is
            // relative to its parent in a hierarchy and we must preserve that. That is, Hand_Thumb1 is
            // a child of Hand_Thumb0, which is a child of Hand_WristRoot, etc.
            for (int i = 0; i < boneOrder.Length; i++)
            {
                Transform boneTransform = skeleton.Bones[(int)boneOrder[i]].Transform;
                Matrix4x4 transformMatrix = Matrix4x4.TRS(boneTransform.localPosition, boneTransform.localRotation, boneTransform.localScale);
                poses.Add(transformMatrix);
            }
        }
        return poses;
    }
}

 

 

AvatarRecordingSample.cs:

 

 

/*
 * AvatarRecordingSample.cs
 */

using System.Collections.Generic;
using UnityEngine;

public struct AvatarRecordingSample
{
    public float time;
    public bool leftHandVisible;
    public bool rightHandVisible;
    public Matrix4x4 anchorPose;
    public Matrix4x4 headPose;
    public Matrix4x4 leftHandPose;
    public Matrix4x4 rightHandPose;
    public List<Matrix4x4> leftHandBones;
    public List<Matrix4x4> rightHandBones;

    private void SerializeMatrix(List<string> output, Matrix4x4 matrix)
    {
        for (int i = 0; i < 4; i++)
        {
            output.Add(matrix.GetRow(i).x.ToString());
            output.Add(matrix.GetRow(i).y.ToString());
            output.Add(matrix.GetRow(i).z.ToString());
            output.Add(matrix.GetRow(i).w.ToString());
        }
    }

    private static Matrix4x4 DeserializeMatrix(string[] components, int startIdx)
    {
        Matrix4x4 matrix = Matrix4x4.zero;
        for (int i = 0; i < 4; i++)
        {
            float x = float.Parse(components[startIdx + i * 4 + 0]);
            float y = float.Parse(components[startIdx + i * 4 + 1]);
            float z = float.Parse(components[startIdx + i * 4 + 2]);
            float w = float.Parse(components[startIdx + i * 4 + 3]);
            matrix.SetRow(i, new Vector4(x, y, z, w));
        }
        return matrix;
    }

    public string Serialize()
    {
        List<string> components = new List<string>();
        components.Add(time.ToString());
        components.Add((leftHandVisible ? 1 : 0).ToString());
        components.Add((rightHandVisible ? 1 : 0).ToString());
        SerializeMatrix(components, anchorPose);
        SerializeMatrix(components, headPose);
        SerializeMatrix(components, leftHandPose);
        SerializeMatrix(components, rightHandPose);
        components.Add(leftHandBones.Count.ToString()); // number of left hand bones serialized
        foreach (Matrix4x4 pose in leftHandBones)
        {
            SerializeMatrix(components, pose);
        }
        components.Add(rightHandBones.Count.ToString()); // number of right hand bones serialized
        foreach (Matrix4x4 pose in rightHandBones)
        {
            SerializeMatrix(components, pose);
        }
        return string.Join(" ", components);
    }

    public AvatarRecordingSample(string serialized)
    {
        string[] components = serialized.Trim().Split(' ');
        time = float.Parse(components[0]);
        leftHandVisible = int.Parse(components[1]) != 0;
        rightHandVisible = int.Parse(components[2]) != 0;
        anchorPose = DeserializeMatrix(components, 3 + 0 * 16);
        headPose = DeserializeMatrix(components, 3 + 1 * 16);
        leftHandPose = DeserializeMatrix(components, 3 + 2 * 16);
        rightHandPose = DeserializeMatrix(components, 3 + 3 * 16);

        leftHandBones = new List<Matrix4x4>();
        rightHandBones = new List<Matrix4x4>();
        int idx = 3 + 4 * 16 + 0;

        if (idx < components.Length)
        {
            int numLeftHandBones = int.Parse(components[idx++]);
            for (int i = 0; i < numLeftHandBones; i++)
            {
                leftHandBones.Add(DeserializeMatrix(components, idx));
                idx += 16;
            }
        }

        if (idx < components.Length)
        {
            int numRightHandBones = int.Parse(components[idx++]);
            for (int i = 0; i < numRightHandBones; i++)
            {
                rightHandBones.Add(DeserializeMatrix(components, idx));
                idx += 16;
            }
        }

        // If the number of bones is not 18, clear
        if (leftHandBones.Count != 18)
        {
            leftHandBones.Clear();
        }
        if (rightHandBones.Count != 18)
        {
            rightHandBones.Clear();
        }
    }
}

 

 

Matrix4x4Extensions.cs:

 

using UnityEngine;

public static class Matrix4x4Extensions
{
    public static Vector3 Translation(this Matrix4x4 matrix)
    {
        Vector4 position = matrix.GetColumn(3);
        return new Vector3(position.x, position.y, position.z);
    }

    public static Vector3 Scale(this Matrix4x4 matrix)
    {
        Vector3 scale;
        scale.x = new Vector4(matrix.m00, matrix.m10, matrix.m20, matrix.m30).magnitude;
        scale.y = new Vector4(matrix.m01, matrix.m11, matrix.m21, matrix.m31).magnitude;
        scale.z = new Vector4(matrix.m02, matrix.m12, matrix.m22, matrix.m32).magnitude;
        return scale;
    }

    public static Quaternion Rotation(this Matrix4x4 matrix)
    {
        Vector3 forward;
        forward.x = matrix.m02;
        forward.y = matrix.m12;
        forward.z = matrix.m22;

        Vector3 up;
        up.x = matrix.m01;
        up.y = matrix.m11;
        up.z = matrix.m21;

        return Quaternion.LookRotation(forward, up);
    }

    public static Vector3 Forward(this Matrix4x4 matrix)
    {
        Vector4 column = matrix.GetColumn(2);
        return new Vector3(column.x, column.y, column.z).normalized;
    }

    public static Vector3 Up(this Matrix4x4 matrix)
    {
        Vector4 column = matrix.GetColumn(1);
        return new Vector3(column.x, column.y, column.z).normalized;
    }

    public static Vector3 Right(this Matrix4x4 matrix)
    {
        Vector4 column = matrix.GetColumn(0);
        return new Vector3(column.x, column.y, column.z).normalized;
    }
}

 

Good luck.

Anonymous
Not applicable

Oh yes and one more thing: all these references to an "avatar anchor" are probably confusing. The "anchor" is basically a point in space that I consider to be the origin for the avatar and I record things relative to this anchor point so that on the remote side, I can render the avatars relative to a movable anchor point. You won't need this and I think the code should be clear enough to modify to your use case.
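
In code terms, the round trip is roughly this (recordAnchor, playbackAnchor, and headWorldPose are illustrative names):

```
// Record side: store each pose relative to the anchor's frame.
Matrix4x4 anchorLocalPose = recordAnchor.worldToLocalMatrix * headWorldPose;

// Playback side: re-express the stored pose relative to wherever the anchor is
// now (or simply parent the avatar object under the playback anchor).
Matrix4x4 playbackWorldPose = playbackAnchor.localToWorldMatrix * anchorLocalPose;
```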

Thanks, yeah, I figured as much. I was able to get this working in the end. For my use case, I just wanted to use a custom VR rig (not the OVRCameraRig) and to be able to use a VR emulator, so essentially what I wanted was to pass in a transform as a reference for each of the head, left hand, and right hand, and have the body tracking follow these transforms.

 

Here's what I ended up with:

using Oculus.Avatar2;
using UnityEngine;
using Touch = OVRInput.Touch;

public class CustomAvatarInputManager : OvrAvatarInputManager
{
    [SerializeField] private Transform _trackingSpace;
    [SerializeField] private Transform _headTarget;
    [SerializeField] private Transform _leftHandTarget;
    [SerializeField] private Transform _rightHandTarget;


    private class CustomInputTrackingDelegate : OvrAvatarInputTrackingDelegate
    {
        private Transform _headTarget;
        private Transform _leftHandTarget;
        private Transform _rightHandTarget;
        private Transform _trackingSpace;

        public CustomInputTrackingDelegate(Transform headTarget, Transform leftHandTarget, Transform rightHandTarget, Transform trackingSpace)
        {
            _headTarget = headTarget;
            _leftHandTarget = leftHandTarget;
            _rightHandTarget = rightHandTarget;
            _trackingSpace = trackingSpace;
        }

        public override bool GetRawInputTrackingState(out OvrAvatarInputTrackingState inputTrackingState)
        {
            inputTrackingState = new OvrAvatarInputTrackingState();
            inputTrackingState.headsetActive = true;
            inputTrackingState.leftControllerActive = true;
            inputTrackingState.rightControllerActive = true;
            inputTrackingState.leftControllerVisible = false;
            inputTrackingState.rightControllerVisible = false;

            Pose headPose = GetTrackingSpacePose(_headTarget.position, _headTarget.rotation);
            Pose leftHandPose = GetTrackingSpacePose(_leftHandTarget.position, _leftHandTarget.rotation);
            Pose rightHandPose = GetTrackingSpacePose(_rightHandTarget.position, _rightHandTarget.rotation);

            inputTrackingState.headset.position = headPose.position;
            inputTrackingState.headset.orientation = headPose.rotation;
            inputTrackingState.headset.scale = Vector3.one;
            inputTrackingState.leftController.position = leftHandPose.position;
            inputTrackingState.rightController.position = rightHandPose.position;
            inputTrackingState.leftController.orientation = leftHandPose.rotation;
            inputTrackingState.rightController.orientation = rightHandPose.rotation;
            inputTrackingState.leftController.scale = Vector3.one;
            inputTrackingState.rightController.scale = Vector3.one;

            return true;
        }

        private Pose GetTrackingSpacePose(Vector3 worldPosition, Quaternion worldRotation)
        {
            Vector3 position = _trackingSpace.InverseTransformPoint(worldPosition);
            Quaternion rotation = Quaternion.Inverse(_trackingSpace.rotation) * worldRotation;

            return new Pose(position, rotation);
        }
    }

    private class CustomInputControlDelegate : OvrAvatarInputControlDelegate
    {
        public override bool GetInputControlState(out OvrAvatarInputControlState inputControlState)
        {
            inputControlState = new OvrAvatarInputControlState();
            inputControlState.type = GetControllerType();

            var input = BNG.InputBridge.Instance;

            //Button Press
            if (input.AButton)
                inputControlState.leftControllerState.buttonMask |= CAPI.ovrAvatar2Button.One;
            if (input.BButton)
                inputControlState.leftControllerState.buttonMask |= CAPI.ovrAvatar2Button.Two;
            if (input.XButton)
                inputControlState.rightControllerState.buttonMask |= CAPI.ovrAvatar2Button.One;
            if (input.YButton)
                inputControlState.rightControllerState.buttonMask |= CAPI.ovrAvatar2Button.Two;
            if (input.LeftThumbstick)
                inputControlState.leftControllerState.buttonMask |= CAPI.ovrAvatar2Button.Joystick;
            if (input.RightThumbstick)
                inputControlState.rightControllerState.buttonMask |= CAPI.ovrAvatar2Button.Joystick;

            // Left Controller Button Touch
            if (OVRInput.Get(Touch.One, OVRInput.Controller.LTouch))
            {
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.One;
            }
            if (OVRInput.Get(Touch.Two, OVRInput.Controller.LTouch))
            {
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.Two;
            }
            if (OVRInput.Get(Touch.PrimaryThumbstick, OVRInput.Controller.LTouch))
            {
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.Joystick;
            }
            if (OVRInput.Get(Touch.PrimaryThumbRest, OVRInput.Controller.LTouch))
            {
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.ThumbRest;
            }

            // Right Controller Button Touch
            if (OVRInput.Get(Touch.One, OVRInput.Controller.RTouch))
            {
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.One;
            }
            if (OVRInput.Get(Touch.Two, OVRInput.Controller.RTouch))
            {
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.Two;
            }
            if (OVRInput.Get(Touch.PrimaryThumbstick, OVRInput.Controller.RTouch))
            {
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.Joystick;
            }
            if (OVRInput.Get(Touch.PrimaryThumbRest, OVRInput.Controller.RTouch))
            {
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.ThumbRest;
            }
            

            // Left Trigger
            inputControlState.leftControllerState.indexTrigger = input.LeftTrigger;
            
            if (input.LeftTriggerNear)
            {
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.Index;
            }
            else if (input.LeftTrigger <= 0f)
            {
                // TODO: Not sure if this is the correct way to do this
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.Pointing;
            }

            // Right Trigger
            inputControlState.rightControllerState.indexTrigger = input.RightTrigger;
            
            if (input.RightTriggerNear)
            {
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.Index;
            }
            else if (input.RightTrigger <= 0f)
            {
                // TODO: Not sure if this is the correct way to do this
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.Pointing;
            }

            //Left Grip
            inputControlState.leftControllerState.handTrigger = input.LeftGrip;
            if(!input.LeftThumbNear)
                inputControlState.leftControllerState.touchMask |= CAPI.ovrAvatar2Touch.ThumbUp;
            
            //Right Grip
            inputControlState.rightControllerState.handTrigger = input.RightGrip;
            if(!input.RightThumbNear)
                inputControlState.rightControllerState.touchMask |= CAPI.ovrAvatar2Touch.ThumbUp;


            return true;
        }
    }

    private void Start()
    {
        if (BodyTracking != null)
        {
            BodyTracking.InputTrackingDelegate = new CustomInputTrackingDelegate(_headTarget, _leftHandTarget, _rightHandTarget, _trackingSpace);
            BodyTracking.InputControlDelegate = new CustomInputControlDelegate();
        }
    }

    protected override void OnDestroyCalled()
    {
        base.OnDestroyCalled();
    }
}

 

 

There's some stuff in there specific to the BNG Framework, which is only useful if you're using that asset. But you could just replace the input control delegate with your own version, or with the sample one, and everything else will work fine.
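
For reference, a bare-bones replacement not tied to BNG could be as simple as this (untested sketch, same idea as the minimal control delegate earlier in the thread):

```
using Oculus.Avatar2;

// Minimal stand-in for the BNG-specific control delegate: just report the
// controller type and leave buttons/triggers at their defaults, or wire in
// your own input system here.
public class MinimalInputControlDelegate : OvrAvatarInputControlDelegate
{
    public override bool GetInputControlState(out OvrAvatarInputControlState inputControlState)
    {
        inputControlState = new OvrAvatarInputControlState();
        inputControlState.type = GetControllerType();
        return true;
    }
}
```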

Hey, sorry to necro this, but I am doing something super similar to you (BNG and Meta Avatars). I was wondering if you'd have any more pointers or scripts you'd be willing to release to help out with that? The above is a super helpful starting point, but when using the XR Advanced rig, the Meta Avatars arms still don't follow the emulator.