Forum Discussion
Anonymous
3 years ago
Meta Avatars: how to feed custom input?
Hi, I'm prototyping with the new Meta Avatars system and am using the SampleAvatarEntity script to instantiate fake avatars. I'd like to replay head pose, controller states, and/or hand states (p...
Evans_Taylor_Digital (Protege)
3 years ago
Did you get anywhere with this? I'm having similar issues.
Anonymous
3 years ago
I sure did. Here is my replay input manager (it also does a couple of other things). I didn't include the replay samples, but they are just time-stamped structs containing Matrix4x4s of all the relevant nodes and bones. Note that the hand bone matrices are *not* localToWorldMatrix. Rather, they are just the local transform, constructed as:
Matrix4x4.TRS(bone.transform.localPosition, bone.transform.localRotation, bone.transform.localScale)
Everything else is just the localToWorldMatrix (head and hands).
Also, avatarAnchor is something you don't need; it's just the transform of an object to which I parent my avatars. If the local-space option is deselected, everything will be in world space.
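The samples themselves aren't included, but judging from what the manager below consumes, a struct roughly like this would fit. This is a sketch under assumptions: the field and helper names match what the code references, while the line parser and on-disk format are whatever your recorder produced.

```
// Hypothetical sketch of the sample struct the replay manager consumes.
// Field names mirror what the code below references; the serialization
// format is not part of the original post.
using System.Collections.Generic;
using UnityEngine;

public struct AvatarRecordingSample
{
    public float time;                      // seconds since recording start
    public Matrix4x4 anchorPose;            // avatar anchor, localToWorldMatrix
    public Matrix4x4 headPose;              // head, localToWorldMatrix
    public Matrix4x4 leftHandPose;          // LeftHandAnchor, localToWorldMatrix
    public Matrix4x4 rightHandPose;         // RightHandAnchor, localToWorldMatrix
    public List<Matrix4x4> leftHandBones;   // local TRS per bone (wrist + 17 joints)
    public List<Matrix4x4> rightHandBones;

    public AvatarRecordingSample(string line) : this()
    {
        // Parse one time-stamped line of serialized matrices here, in
        // whatever format your recorder wrote them out.
        throw new System.NotImplementedException("format-specific parser");
    }
}

// Helpers to decompose a TRS matrix back into its components, matching the
// Translation()/Rotation()/Scale() calls made by the replay manager.
public static class Matrix4x4Extensions
{
    public static Vector3 Translation(this Matrix4x4 m) => m.GetColumn(3);
    public static Quaternion Rotation(this Matrix4x4 m) => m.rotation;
    public static Vector3 Scale(this Matrix4x4 m) => m.lossyScale;
}
```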
```
/*
* ReplayAvatarInputManager.cs
*
* Implementation of a body tracking input manager for Meta Avatars. Reads pre-
* recorded avatar motion data (stored as head, hand, and hand bone transforms
* over time) and feeds them into the IK system.
*
* This script should be attached to the top-level avatar object. To preserve
* the spatial relationship between the avatar anchor and the avatar as it was
* during recording, make the avatar a child of the scene's avatar anchor and
* enable the avatar anchor local space option.
*
* At the time of this writing, hands appear to be handled as follows:
*
* 1. The hand anchors are used as the wrist position transforms. In the
* typical OVR camera rig, the hand anchor is the parent node of the
* hand prefab. The hierarchy looks like this:
*
* OVRCameraRig
* |
* +-- TrackingSpace
* |
* +-- LeftHandAnchor
* |
* +-- LeftControllerAnchor
* |
* +-- OVRHandPrefabLeft
* |
* +-- Bones
* |
* +-- Wrist
* |
* ... Various bones
*
* The wrist transform is actually ignored; LeftHandAnchor is used as the
* left wrist.
*
* 2. The coordinate space of the wrist transform is converted to a right-
* handed system by negating the z component of the position and the x and
* y components of the rotation.
* 3. The OVR hand prefab has a non-unit scaling that is based on the user's
* hand size and varies over time. This does not appear to be used by the
* avatars. Therefore, either the localToWorldMatrix of the node "Bones"
* or just "LeftHandAnchor" is safe to use. Neither the hand prefab node
* nor the bones root node applies any translation or rotation.
* 4. The various individual hand bones that the avatar system accepts are
* specified only as rotations and must be converted to the appropriate
* space as well. Strangely, for reasons I don't understand, the conversion
* is different than the ConvertSpace() function present in the
* Oculus.Avatars2 code. Rather than negating x and y (for a conversion
* between LHS and RHS), it is y and z that must be negated.
*
* This information was discovered by examining what the default hand
* tracking delegate feeds into the system. Unfortunately, the hand delegate is
* not implemented in C# and implements an interface
* IOvrAvatarNativeHandDelegate that indicates to the system there is a native
* callback present. In OvrBodyTrackingContext, this interface is detected and
* a native context is set up. Fortunately, HandTrackingDelegate
* (OvrPluginTracking.cs) implements the GetHandData() method, which appears to
* call the native system. By removing the conformance to
* IOvrAvatarNativeHandDelegate, we can obtain the IK input values and compare
* them to the OVR rig to deduce the transformations required.
*/
using System;
using System.IO;
using System.Collections.Generic;
using Oculus.Avatar2;
using UnityEngine;
public class ReplayAvatarInputManager : OvrAvatarInputManager, IReplayTimeProvider
{
public enum DebugTextMode
{
None, // display nothing
PlaybackTime // playback time in seconds
}
[Tooltip("Avatar pose recording to play back.")]
[SerializeField]
private TextAsset _poseRecording;
[Tooltip("Avatar audio recording to play back.")]
[SerializeField]
private AudioClip _audioRecording;
[Tooltip("Avatar pose recording to play back (loads from file system; used only if resource path is empty and will only work in Editor).")]
[SerializeField]
private string _poseRecordingFilepath = "c:\\Users\\Bart\\AppData\\LocalLow\\Cambrian Moment\\Test-MRTK\\AvatarPoseRecording.txt";
[Tooltip("Avatar audio recording to play back (loads from file system; used only if resource path is empty and will only work in Editor).")]
[SerializeField]
private string _audioRecordingFilepath = "c:\\Users\\Bart\\AppData\\LocalLow\\Cambrian Moment\\Test-MRTK\\MicrophoneAudio.wav";
[Tooltip("If checked, converts all poses into the local coordinate space of the avatar anchor stored in the recording. Make the avatar game object this script is attached to a child of the local avatar anchor and the relative positioning will be preserved.")]
[SerializeField]
private bool _useAvatarAnchorLocalSpace = true;
[SerializeField]
private TMPro.TextMeshPro _debugText;
[SerializeField]
private DebugTextMode _debugTextMode = DebugTextMode.None;
// Recording state
private List<AvatarRecordingSample> _samples = new List<AvatarRecordingSample>();
private int _playbackIdx = -1;
private float _playbackStartedAt = 0;
// As recording is played back, latest IK inputs are written here
private OvrAvatarInputTrackingState _currentInputTracking = new OvrAvatarInputTrackingState();
private OvrAvatarTrackingHandsState _currentTrackingHands = new OvrAvatarTrackingHandsState();
// Audio component for playback
private AudioSource _audio;
public float ReplayTime
{
get { return (_playbackIdx >= 0 && _playbackIdx < _samples.Count) ? (Time.time - _playbackStartedAt) : -1; }
}
private class ReplayHandTrackingDelegate : IOvrAvatarHandTrackingDelegate
{
private Func<OvrAvatarTrackingHandsState> _GetCurrentTrackingHandsState;
public ReplayHandTrackingDelegate(Func<OvrAvatarTrackingHandsState> GetCurrentTrackingHandsState)
{
_GetCurrentTrackingHandsState = GetCurrentTrackingHandsState;
}
public bool GetHandData(OvrAvatarTrackingHandsState handData)
{
OvrAvatarTrackingHandsState currentTrackingHands = _GetCurrentTrackingHandsState();
handData.isConfidentLeft = currentTrackingHands.isConfidentLeft;
handData.isConfidentRight = currentTrackingHands.isConfidentRight;
handData.isTrackedLeft = currentTrackingHands.isTrackedLeft;
handData.isTrackedRight = currentTrackingHands.isTrackedRight;
handData.wristPosLeft = currentTrackingHands.wristPosLeft;
handData.wristPosRight = currentTrackingHands.wristPosRight;
if (handData.boneRotations.Length == currentTrackingHands.boneRotations.Length)
{
for (int i = 0; i < handData.boneRotations.Length; i++)
{
handData.boneRotations[i] = currentTrackingHands.boneRotations[i];
}
}
return true;
}
}
private class ReplayInputTrackingDelegate : OvrAvatarInputTrackingDelegate
{
private Func<OvrAvatarInputTrackingState> _GetCurrentInputTrackingState;
public ReplayInputTrackingDelegate(Func<OvrAvatarInputTrackingState> GetCurrentInputTrackingState)
{
_GetCurrentInputTrackingState = GetCurrentInputTrackingState;
}
public override bool GetRawInputTrackingState(out OvrAvatarInputTrackingState inputTrackingState)
{
inputTrackingState = _GetCurrentInputTrackingState();
return true;
}
}
private class ReplayInputControlDelegate : OvrAvatarInputControlDelegate
{
public override bool GetInputControlState(out OvrAvatarInputControlState inputControlState)
{
inputControlState = new OvrAvatarInputControlState();
inputControlState.type = GetControllerType();
inputControlState.leftControllerState.isActive = false;
inputControlState.leftControllerState.isVisible = false;
inputControlState.rightControllerState.isActive = false;
inputControlState.rightControllerState.isVisible = false;
return true;
}
}
private void Start()
{
_audio = GetComponent<AudioSource>();
LoadRecording();
if (_useAvatarAnchorLocalSpace)
{
ConvertRecordingToAvatarAnchorLocalSpace();
}
if (BodyTracking != null)
{
BodyTracking.InputTrackingDelegate = new ReplayInputTrackingDelegate(() => _currentInputTracking);
BodyTracking.InputControlDelegate = new ReplayInputControlDelegate();
BodyTracking.HandTrackingDelegate = new ReplayHandTrackingDelegate(() => _currentTrackingHands);
}
}
private void OnEnable()
{
}
private void OnDisable()
{
_audio?.Stop();
}
protected override void OnDestroyCalled()
{
base.OnDestroyCalled();
}
private void LateUpdate()
{
// Must have samples to play back
if (_samples.Count <= 0)
{
return;
}
if (_playbackIdx >= _samples.Count || _playbackIdx < 0)
{
// We have finished playback (or have not yet started). Reset.
ResetPlayback();
}
float currentTime = ReplayTime;
UpdateDebugText(currentTime);
// Update audio
if (_audio && _audio.clip != null && !_audio.isPlaying)
{
// We are within the recording time frame but audio is not playing.
// This indicates we have returned to a channel mid-playback.
_audio.time = currentTime < _audio.clip.length ? currentTime : 0;
_audio.Play();
}
// Hand pose sample
if (currentTime < _samples[_playbackIdx].time)
{
// Not ready to play back next sample yet
return;
}
while (_playbackIdx < _samples.Count && currentTime >= _samples[_playbackIdx].time)
{
// Fast forward through completed time samples
_playbackIdx += 1;
}
SetPose(_playbackIdx - 1);
}
private void ResetPlayback()
{
_playbackIdx = 0;
_playbackStartedAt = Time.time;
if (_audio && _audio.clip != null)
{
_audio.Stop();
_audio.time = 0;
_audio.Play();
}
}
private void LoadRecording()
{
// Pose recording
try
{
string[] lines = _poseRecording == null
? File.ReadAllLines(_poseRecordingFilepath)
: _poseRecording.text.Split(new string[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries);
foreach (string line in lines)
{
_samples.Add(new AvatarRecordingSample(line));
}
Debug.LogFormat("Loaded pose recording from {0}", _poseRecording == null ? _poseRecordingFilepath : "text resource");
}
catch (Exception e)
{
Debug.LogException(e);
}
// Audio
if (_audio)
{
if (_audioRecording)
{
_audio.clip = _audioRecording;
Debug.Log("Loaded audio clip from resources");
}
else
{
LoadAudioClipFromFileSystem();
}
}
}
private void LoadAudioClipFromFileSystem()
{
try
{
_audio.clip = WavFile.Load(_audioRecordingFilepath);
Debug.LogFormat("Loaded audio clip from {0}", _audioRecordingFilepath);
}
catch (Exception e)
{
Debug.LogException(e);
}
}
private void UpdateDebugText(float currentTime)
{
if (_debugText == null)
{
return;
}
switch (_debugTextMode)
{
default:
case DebugTextMode.None:
_debugText.enabled = false;
break;
case DebugTextMode.PlaybackTime:
_debugText.enabled = true;
_debugText.text = string.Format("{0:f2}", currentTime);
break;
}
}
private void SetPose(int idx)
{
UpdateInputTracking(idx);
UpdateHandTracking(idx);
}
private void UpdateHandTracking(int idx)
{
/*
* Hand bones are recorded as:
*
* 0. wrist
* 1. left thumb 0 (meta)
* 2. left thumb 1 (proximal)
* 3. left thumb 2 (intermediate)
* 4. left thumb 3 (distal)
* 5. left index 1 (proximal)
* 6. left index 2 (intermediate)
* 7. left index 3 (distal)
* 8. left middle 1 (proximal)
* 9. left middle 2 (intermediate)
* 10. left middle 3 (distal)
* 11. left ring 1 (proximal)
* 12. left ring 2 (intermediate)
* 13. left ring 3 (distal)
* 14. left pinky 0 (meta)
* 15. left pinky 1 (proximal)
* 16. left pinky 2 (intermediate)
* 17. left pinky 3 (distal)
*
* The bone rotations passed to the avatars IK system are stored in an
* array of 34 values consisting of left hand and right hand bones, in
* the same order but excluding wrists. That is:
*
* 0. left thumb 0 (meta)
* 1. left thumb 1 (proximal)
* ...
* 16. left pinky 3 (distal)
* 17. right thumb 0 (meta)
* ...
* 33. right pinky 3 (distal)
*/
// Assume hand tracking is enabled (if we want to support controller-
// only recordings, these should be set false)
_currentTrackingHands.isConfidentLeft = true;
_currentTrackingHands.isConfidentRight = true;
_currentTrackingHands.isTrackedLeft = true;
_currentTrackingHands.isTrackedRight = true;
// Wrist positions (these are just the hand anchors). These can be
// converted using the built-in RHS<->LHS conversion (negate z
// translation component, negate quaternion x and y).
_currentTrackingHands.wristPosLeft = new CAPI.ovrAvatar2Transform(_samples[idx].leftHandPose.Translation(), _samples[idx].leftHandPose.Rotation(), _samples[idx].leftHandPose.Scale()).ConvertSpace();
_currentTrackingHands.wristPosRight = new CAPI.ovrAvatar2Transform(_samples[idx].rightHandPose.Translation(), _samples[idx].rightHandPose.Rotation(), _samples[idx].rightHandPose.Scale()).ConvertSpace();
// Hand bones
bool correctNumberOfBones =
(_samples[idx].leftHandBones.Count == _samples[idx].rightHandBones.Count) &&
(_currentTrackingHands.boneRotations.Length == _samples[idx].leftHandBones.Count + _samples[idx].rightHandBones.Count - 2); // all bones excluding wrists
if (correctNumberOfBones)
{
int numJoints = _samples[idx].leftHandBones.Count - 1;
for (int i = 0; i < numJoints; i++)
{
Quaternion leftJoint = ConvertJointRotationToCAPI(_samples[idx].leftHandBones[i + 1].Rotation()); // shifted over by 1 because we ignore recorded index 0 (wrist)
Quaternion rightJoint = ConvertJointRotationToCAPI(_samples[idx].rightHandBones[i + 1].Rotation()); // shifted over by 1 because we ignore recorded index 0 (wrist)
_currentTrackingHands.boneRotations[i] = leftJoint;
_currentTrackingHands.boneRotations[numJoints + i] = rightJoint;
}
}
else
{
for (int i = 0; i < _currentTrackingHands.boneRotations.Length; i++)
{
_currentTrackingHands.boneRotations[i] = Quaternion.identity;
}
}
}
private void UpdateInputTracking(int idx)
{
_currentInputTracking.headsetActive = true;
_currentInputTracking.leftControllerActive = false;
_currentInputTracking.rightControllerActive = false;
_currentInputTracking.leftControllerVisible = false;
_currentInputTracking.rightControllerVisible = false;
_currentInputTracking.headset.position = _samples[idx].headPose.Translation();
_currentInputTracking.headset.orientation = _samples[idx].headPose.Rotation();
_currentInputTracking.headset.scale = Vector3.one;
_currentInputTracking.leftController.position = _samples[idx].leftHandPose.Translation();
_currentInputTracking.rightController.position = _samples[idx].rightHandPose.Translation();
_currentInputTracking.leftController.orientation = _samples[idx].leftHandPose.Rotation();
_currentInputTracking.rightController.orientation = _samples[idx].rightHandPose.Rotation();
_currentInputTracking.leftController.scale = _samples[idx].leftHandPose.Scale();
_currentInputTracking.rightController.scale = _samples[idx].rightHandPose.Scale();
}
private void ConvertRecordingToAvatarAnchorLocalSpace()
{
for (int i = 0; i < _samples.Count; i++)
{
var sample = _samples[i];
sample.headPose = ConvertPoseFromWorldToAvatarAnchorSpace(sample.headPose, sample.anchorPose);
sample.leftHandPose = ConvertPoseFromWorldToAvatarAnchorSpace(sample.leftHandPose, sample.anchorPose);
sample.rightHandPose = ConvertPoseFromWorldToAvatarAnchorSpace(sample.rightHandPose, sample.anchorPose);
_samples[i] = sample;
}
}
/// <summary>
/// Converts a joint rotation to the Meta Avatar C API coordinate space.
/// It is unknown why this conversion is different from the ConvertSpace()
/// function in Oculus.Avatars2.
/// </summary>
/// <param name="q">
/// Quaternion representing a hand joint rotation in its
/// local frame.
/// </param>
/// <returns>Converted rotation.</returns>
private Quaternion ConvertJointRotationToCAPI(Quaternion q)
{
return new Quaternion(q.x, -q.y, -q.z, q.w);
}
/// <summary>
/// Converts a transform matrix from world space to the local coordinate
/// space of the avatar anchor.
/// </summary>
/// <param name="pose">Transform (local to world) to convert.</param>
/// <param name="anchorPose">
/// Avatar anchor transform(local to world) into whose space we will
/// convert the pose.
/// </param>
/// <returns>Pose matrix in the anchor's local coordinate space.</returns>
private Matrix4x4 ConvertPoseFromWorldToAvatarAnchorSpace(Matrix4x4 pose, Matrix4x4 anchorPose)
{
return anchorPose.inverse * pose;
}
}
```
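For completeness, here is a hedged sketch of what the recording side might look like. Nothing below comes from the Avatars SDK or the original post: the anchor references, output filename, and the Serialize() helper are assumptions, and hand bone capture (local TRS per bone, as described above) is omitted for brevity.

```
// Hypothetical recorder counterpart: writes one time-stamped line per frame
// from the OVR camera rig transforms so the replay manager can consume it.
using System.IO;
using UnityEngine;

public class AvatarPoseRecorder : MonoBehaviour
{
    [SerializeField] private Transform _avatarAnchor;     // same anchor used at replay time
    [SerializeField] private Transform _centerEyeAnchor;  // OVRCameraRig/TrackingSpace/CenterEyeAnchor
    [SerializeField] private Transform _leftHandAnchor;
    [SerializeField] private Transform _rightHandAnchor;
    private StreamWriter _writer;
    private float _startTime;

    private void Start()
    {
        _writer = new StreamWriter(
            Path.Combine(Application.persistentDataPath, "AvatarPoseRecording.txt"));
        _startTime = Time.time;
    }

    private void LateUpdate()
    {
        // Anchor, head, and hand matrices are localToWorldMatrix; hand bones
        // (not shown here) would be recorded as local TRS matrices instead.
        _writer.WriteLine(Serialize(
            Time.time - _startTime,
            _avatarAnchor.localToWorldMatrix,
            _centerEyeAnchor.localToWorldMatrix,
            _leftHandAnchor.localToWorldMatrix,
            _rightHandAnchor.localToWorldMatrix));
    }

    private void OnDestroy() => _writer?.Dispose();

    // Serialization format of your choice; must match what
    // AvatarRecordingSample's line parser expects.
    private string Serialize(float t, params Matrix4x4[] ms)
    {
        throw new System.NotImplementedException("format-specific writer");
    }
}
```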