Add animation to Avatar in Unity
I want to use avatars from the Meta Avatars SDK as NPCs in an experience I'm developing, using some Mixamo animations to make them look more alive. However, I can't find a way to add an Animator controller to an OvrAvatarEntity. Has anyone done this?

Meta Avatar Joint Transforms Are In Incorrect Position After Calling LoadUser() Again
I think I've found a bug in Meta Avatars SDK version 17.2. After calling LoadUser() on an avatar entity that has already been loaded before (which may happen if you want to update an avatar's look), the transforms Joint Head, Joint LeftHandWrist, and Joint RightHandWrist stop being in the correct positions and simply stay fixed at (0, 0, 0). Here are the steps to reproduce it:

1. In a blank scene, add an AvatarSdkManagerHorizon object and an empty GameObject with a SampleAvatarEntity component.
2. Set the SampleAvatarEntity's BodyTracking input to be the AvatarSdkManagerHorizon's SampleInputManager.
3. Add some code to the SampleAvatarEntity that lets you call LoadUser() at runtime.
4. Ensure you have UseStandalonePlatform checked in your OculusPlatformSettings so that your own avatar loads.
5. Connect your headset with Quest Link and run the scene to let your avatar load.
6. In the hierarchy, note that Joint Head is in the correct place.
7. Now manually call LoadUser() and see how Joint Head is no longer in the correct place.

Help with Meta avatar playback animation (Unity 2022.3.12, Oculus v57, URP)
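One possible workaround (a sketch, not a confirmed fix: the cache-and-restore idea and the JointPoseCache name are my own, and real code would read and write Unity Transform positions rather than System.Numerics vectors) is to snapshot the critical joint positions just before the second LoadUser() call, then restore any joint the reload has reset to the origin:

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

// Hypothetical helper: cache joint world positions before reloading an
// avatar entity, then restore any joint that the reload reset to (0, 0, 0).
// In Unity you would read/write Transform.position instead of raw Vector3s.
public class JointPoseCache
{
    private readonly Dictionary<string, Vector3> m_cached =
        new Dictionary<string, Vector3>();

    // Call just before LoadUser(): remember where each joint currently is.
    public void Capture(string jointName, Vector3 worldPosition)
    {
        m_cached[jointName] = worldPosition;
    }

    // Call after the avatar finishes reloading. Returns the position the
    // joint should have: the cached one if the reload zeroed it out,
    // otherwise the position the SDK produced.
    public Vector3 Restore(string jointName, Vector3 positionAfterReload)
    {
        bool wasZeroed = positionAfterReload == Vector3.Zero;
        if (wasZeroed && m_cached.TryGetValue(jointName, out var cached))
        {
            return cached;
        }
        return positionAfterReload;
    }
}
```

With Unity types you would capture Joint Head, Joint LeftHandWrist, and Joint RightHandWrist before the reload, then restore them in the frame after the entity reports it has finished loading.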
So I am using the script from the loopback sample provided by Meta, and I have managed to record the animation and play it back, but the animation is quite desynced from the audio that was recorded at the same time. For the animation, I just save a list of processed PacketData, then reinsert the packets into the avatar I want to play back, releasing each one at the end. For audio recording, I just use the default Unity recording. Everything comes out: the animation plays well and so does the audio. But as time passes, maybe after 10 seconds, the animation is no longer aligned with the audio that was recorded at the same time; the animation seems to play a bit slow. Is there any way to fix this? It feels very sad, since I had already figured out a way to make it work (it may still not be the right way, though).

Meta Avatars Loading Black and White without any textures
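One likely cause of drift like this is applying exactly one recorded packet per rendered frame, so any mismatch between the recording interval and the playback frame rate accumulates over time. A sketch of a fix (the PacketTimeline class and its API are my own invention, not part of the Avatars SDK) is to timestamp each packet at record time and select the packet to apply from the elapsed audio time, so the audio clock drives the animation:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper: stores recorded avatar packets with the time they
// were captured, then selects the packet to apply from the current audio
// playback time, keeping the animation locked to the audio clock.
public class PacketTimeline
{
    private readonly List<double> m_timestamps = new List<double>();
    private readonly List<byte[]> m_packets = new List<byte[]>();

    // Record-side: store the packet with its capture time in seconds since
    // recording started (in Unity, derived from AudioSettings.dspTime).
    public void Add(double captureTime, byte[] packet)
    {
        m_timestamps.Add(captureTime);
        m_packets.Add(packet);
    }

    // Playback-side: return the latest packet at or before the elapsed
    // audio time (in Unity, AudioSource.time), or null before the first.
    public byte[] Sample(double elapsedAudioTime)
    {
        int index = m_timestamps.BinarySearch(elapsedAudioTime);
        if (index < 0)
        {
            index = ~index - 1; // latest timestamp before elapsedAudioTime
        }
        return index >= 0 ? m_packets[index] : null;
    }
}
```

Calling Sample() once per frame with the playing AudioSource's time means a slow frame simply skips ahead to the right packet instead of falling further behind.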
I am trying to load Meta avatars as the local and remote players in a multiplayer environment built with Photon. I am able to load the avatars as local and remote players in a room, but my avatars are displayed black and white (i.e., without any shaders or textures). I instantiate the Meta avatar as the player when a player joins the room, using their user ID. Below is my code:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Oculus.Avatar2;
using Oculus.Platform;
using Photon.Pun;

public class RemotePlayer : OvrAvatarEntity
{
    [SerializeField] int m_avatarToUseInZipFolder = 2;
    [SerializeField] ulong m_instantiationData;

    PhotonView m_photonView;
    List<byte[]> m_streamedDataList = new List<byte[]>();
    int m_maxBytesToLog = 15;
    float m_cycleStartTime = 0;
    float m_intervalToSendData = 0.08f;

    protected override void Awake()
    {
        ConfigureAvatarEntity();
        base.Awake();
    }

    private void Start()
    {
        m_instantiationData = GetUserIdFromPhotonInstantiationData();
        _userId = m_instantiationData;
        StartCoroutine(TryToLoadUser());
    }

    void ConfigureAvatarEntity()
    {
        m_photonView = GetComponent<PhotonView>();
        bool isMine = m_photonView.IsMine;

        SetIsLocal(isMine);
        _creationInfo.features = isMine
            ? Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Default
            : Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Remote;

        // Body tracking input
        SampleInputManager sampleInputManager =
            OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
        SetBodyTracking(sampleInputManager);

        // Lip sync input
        OvrAvatarLipSyncContext lipSyncInput =
            GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
        SetLipSync(lipSyncInput);

        // Face pose driver
        SampleFacePoseBehavior sampleFacePoseBehavior =
            OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
        SetFacePoseProvider(sampleFacePoseBehavior);

        // Eye pose driver
        SampleEyePoseBehavior sampleEyePoseBehavior =
            OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
        SetEyePoseProvider(sampleEyePoseBehavior);

        gameObject.name = isMine ? "MyAvatar" : "OtherAvatar";
    }

    IEnumerator TryToLoadUser()
    {
        var hasAvatarRequest = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
        while (hasAvatarRequest.IsCompleted == false)
        {
            yield return null;
        }
        LoadUser();
    }

    private void LateUpdate()
    {
        float elapsedTime = Time.time - m_cycleStartTime;
        if (elapsedTime > m_intervalToSendData)
        {
            RecordAndSendStreamDataIfMine();
            m_cycleStartTime = Time.time;
        }
    }

    void RecordAndSendStreamDataIfMine()
    {
        if (m_photonView.IsMine)
        {
            byte[] bytes = RecordStreamData(activeStreamLod);
            m_photonView.RPC("ReceiveStreamData", RpcTarget.Others, bytes);
        }
    }

    [PunRPC]
    public void ReceiveStreamData(byte[] bytes)
    {
        m_streamedDataList.Add(bytes);
    }

    void LogFirstFewBytesOf(byte[] bytes)
    {
        for (int i = 0; i < m_maxBytesToLog; i++)
        {
            string bytesString = Convert.ToString(bytes[i], 2).PadLeft(8, '0');
            Debug.Log(bytesString);
        }
    }

    private void Update()
    {
        if (m_streamedDataList.Count > 0 && IsLocal == false)
        {
            byte[] firstBytesInList = m_streamedDataList[0];
            if (firstBytesInList != null)
            {
                ApplyStreamData(firstBytesInList);
            }
            m_streamedDataList.RemoveAt(0);
        }
    }

    ulong GetUserIdFromPhotonInstantiationData()
    {
        PhotonView photonView = GetComponent<PhotonView>();
        object[] instantiationData = photonView.InstantiationData;
        Int64 dataAsInt = (Int64)instantiationData[0];
        return Convert.ToUInt64(dataAsInt);
    }
}
```

Sync Meta Avatar with Hand controller
Hello, I am using the OVR Player Controller with OVR Hands, tracked by the device controllers rather than by the hand tracking system. As a next step, I add the new Meta avatar under the OVR Player Controller (directly below the TrackingSpace transform), keeping the OVR Hands logic because I need to interact with other objects. However, the OVR Hand model is not synced with (does not overlap) the Meta avatar hand. Is there a way to fix or fine-tune this? I cannot use the hand tracking system; I need to use the device controllers for tracking hand rotation and position, while replacing the controller model with a hand model. Thanks in advance for any help.

Using Meta Avatar with Normcore
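One way to line the two rigs up (a sketch only: the HandAlignment helper is my own, and in Unity you would use UnityEngine.Vector3/Quaternion on the controller hand anchor and the avatar wrist transforms rather than System.Numerics types) is to measure the constant local offset between the OVR hand anchor and the avatar wrist once, while both are in a known matching pose, and then reapply that offset every frame:

```csharp
using System;
using System.Numerics;

// Hypothetical helper: given the world pose of the OVR hand anchor and the
// world pose of the avatar's wrist, compute the fixed offset between them
// once, then reapply it each frame so the two hands stay overlapped.
public static class HandAlignment
{
    // Measure once, while both hands are in a known matching pose.
    public static (Vector3 posOffset, Quaternion rotOffset) ComputeAlignmentOffset(
        Vector3 anchorPos, Quaternion anchorRot,
        Vector3 wristPos, Quaternion wristRot)
    {
        // Express the wrist pose in the anchor's local space.
        Quaternion invAnchor = Quaternion.Inverse(anchorRot);
        Vector3 posOffset = Vector3.Transform(wristPos - anchorPos, invAnchor);
        Quaternion rotOffset = invAnchor * wristRot;
        return (posOffset, rotOffset);
    }

    // Apply every frame (e.g. in LateUpdate): compute the wrist's target
    // world pose from the current anchor pose plus the stored offset.
    public static (Vector3 pos, Quaternion rot) ApplyOffset(
        Vector3 anchorPos, Quaternion anchorRot,
        Vector3 posOffset, Quaternion rotOffset)
    {
        Vector3 pos = anchorPos + Vector3.Transform(posOffset, anchorRot);
        Quaternion rot = anchorRot * rotOffset;
        return (pos, rot);
    }
}
```

In practice you would snap the avatar hand to the pose returned by ApplyOffset each LateUpdate, which keeps the visible hand model glued to the controller-tracked anchor.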
Hello, I'm currently trying to use Normcore for a multiplayer game with the Meta Avatar SDK. Is it possible to network my Meta avatars with eye and face tracking using Normcore (I'm using a Meta Quest Pro)? Or should I use a different solution like Photon instead?

Unable to network meta avatars using Photon
I was able to successfully use the ApplyStreamData(byte[]) method locally to transfer body and lip sync tracking data from one avatar to another. Therefore, I thought it would be simple to repeat this process using Photon as a means to transfer the data to a remote avatar. For some reason it's not working. I've traced the data being received and passed through ApplyStreamData(byte[]). After that, all the right functions within the SDK seem to be getting called and I get no errors. However, the arms and lips of my avatar are still not moving! Has anyone else run into this problem? For context, I'm testing with an Oculus Quest 2 and the Unity Editor. Here is the receiving side of my code for reference:

```csharp
List<byte[]> m_streamedDataList = new List<byte[]>();

// Tracking data bytes are received here
[PunRPC]
public void ReceiveStreamData(byte[] bytes)
{
    m_streamedDataList.Add(bytes);
}

// Runs through the data list every frame, applying streamed data
private void Update()
{
    if (m_streamedDataList.Count > 0 && IsLocal == false)
    {
        byte[] firstBytesInList = m_streamedDataList[0];
        if (firstBytesInList != null)
        {
            ApplyStreamData(firstBytesInList);
        }
        m_streamedDataList.RemoveAt(0);
    }
}
```
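One thing worth checking, independent of the SDK itself, is the receive queue: applying one packet per rendered frame lets the backlog grow without bound whenever packets arrive faster than frames render, so the remote avatar falls further and further behind or appears frozen. A bounded queue that drops the oldest packets keeps playback near real time (the StreamDataQueue class below is my own sketch, not SDK API):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper: a bounded FIFO for streamed avatar packets. If the
// sender outpaces the receiver's frame rate, the oldest packets are dropped
// so ApplyStreamData always sees recent data instead of a growing backlog.
public class StreamDataQueue
{
    private readonly Queue<byte[]> m_queue = new Queue<byte[]>();
    private readonly int m_maxDepth;

    public StreamDataQueue(int maxDepth)
    {
        m_maxDepth = maxDepth;
    }

    public int Count => m_queue.Count;

    // Called from the [PunRPC] receive handler.
    public void Enqueue(byte[] packet)
    {
        m_queue.Enqueue(packet);
        while (m_queue.Count > m_maxDepth)
        {
            m_queue.Dequeue(); // drop oldest to bound latency
        }
    }

    // Called once per Update(); returns null when there is nothing to apply.
    public byte[] TryDequeue()
    {
        return m_queue.Count > 0 ? m_queue.Dequeue() : null;
    }
}
```

A depth of two or three packets is usually enough slack for network jitter while keeping the remote avatar's motion close to live.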