Meta avatar movements not updating on client/client connection
Hi, I'm making a local multiplayer app for Meta Quest 3, and I'm using Meta Avatars. Avatar movements work fine over the host/client connection. But with 3 players in the room, the client players can see the host avatar's movements, while the other client's avatar stays fixed in the open-armed "T" pose and its movements never update. Any solution?

Meta Avatar load fails with 'Failed to retrieve Avatar Specification' when using Federated User Ids
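For the client-to-client question above: Meta avatar poses are driven by stream packets, and a common cause of remote avatars freezing in T-pose is that each client's packets only reach the host instead of every peer. Below is a hedged sketch of the broadcast pattern, not a definitive fix — `SendToAllOtherPlayers` and the wiring around it are hypothetical stand-ins for your networking layer, while `RecordStreamData`/`ApplyStreamData` are the SDK's streaming calls.

```csharp
using UnityEngine;
using Oculus.Avatar2;

// Hedged sketch: each client must broadcast its own avatar's stream data to
// every other client (not just to the host), and apply whatever it receives.
public class AvatarStreamer : MonoBehaviour
{
    [SerializeField] OvrAvatarEntity localAvatar;   // the avatar this client owns
    float _nextSendTime;
    const float SendInterval = 0.08f;               // ~12 updates per second

    void LateUpdate()
    {
        if (Time.time < _nextSendTime) return;
        _nextSendTime = Time.time + SendInterval;

        byte[] bytes = localAvatar.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);
        SendToAllOtherPlayers(bytes);               // must reach every peer, not only the host
    }

    // Called when a packet arrives for a given remote player's avatar.
    public void OnStreamDataReceived(OvrAvatarEntity remoteAvatar, byte[] bytes)
    {
        remoteAvatar.ApplyStreamData(bytes);
    }

    void SendToAllOtherPlayers(byte[] bytes) { /* hand off to your transport here */ }
}
```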
As of the 15th of February 2023, Meta avatars loaded using Federated User Ids, as specified in the Cross-Play section of the documentation, will not load. I get the following errors:

    [ovrAvatar2 native] stats::url file transfer failed: http response code: 500
    [ovrAvatar2 native] specification::Failed to retrieve Avatar Specification for user id 535713845439867

This was working fine just yesterday, so I assume something has changed in the backend to cause this problem. This isn't the first time we've had problems using the Federated User Id system, and it's making it hard to justify keeping Meta Avatars in our product.

Meta Avatar legs not moving like in the Quest Home Page
I want the legs to move like they do in the Quest Home Page: the avatar automatically shuffles and takes small steps depending on how the user moves their head. I am using Unity and have been able to spawn the legs, but they are stiff like wooden logs.

Add animation to Avatar in Unity
I want to use avatars from the Meta SDK as NPCs in an experience that I'm developing, using some Mixamo animations to make them look more alive. However, I can't find a way to add an Animator controller to the OvrAvatarEntities. Has anyone done this?

Meta Avatar Joint Transforms Are in an Incorrect Position After Calling LoadUser() Again
I think I've found a bug in the Meta Avatars SDK version 17.2. After calling LoadUser() on an avatar entity that has already been loaded before (which may happen if you want to update an avatar's look), the transforms Joint Head, Joint LeftHandWrist and Joint RightHandWrist stop being in the correct positions and simply fix themselves at (0, 0, 0). Here are the steps to reproduce it:

1. In a blank scene, add an AvatarSdkManagerHorizon object and an empty GameObject with a SampleAvatarEntity component.
2. Set the SampleAvatarEntity's BodyTracking input to be the AvatarSdkManagerHorizon's SampleInputManager.
3. Add some code to the SampleAvatarEntity which will enable you to call LoadUser() at runtime.
4. Ensure you have UseStandalonePlatform checked in your OculusPlatformSettings so that your own avatar loads.
5. Connect your headset with Quest Link and run the scene to let your avatar load.
6. In the hierarchy, see how Joint Head is in the correct place.
7. Now manually call LoadUser() and see how Joint Head is no longer in the correct place.

Help with Meta Avatar playback animation (Unity 2022.3.12, Oculus v57, URP)
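Step 3 of the repro above ("add some code … to call LoadUser() at runtime") might look like this minimal sketch. The key binding is arbitrary, and if SampleAvatarEntity already defines Update() the body would be merged into the existing method:

```csharp
// Hypothetical addition inside SampleAvatarEntity (which derives from
// OvrAvatarEntity, so the protected LoadUser() is callable from here).
private void Update()
{
    // Press R while running over Quest Link to re-trigger a load on the
    // already-loaded entity and observe the joints snapping to (0, 0, 0).
    if (Input.GetKeyDown(KeyCode.R))
    {
        Debug.Log("Re-calling LoadUser() on an already-loaded avatar entity");
        LoadUser();
    }
}
```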
So I am using the script from the loopback sample provided by Meta, and I have managed to record the animation and play it back, but the animation is quite desynced from the audio that was recorded at the same time. For the animation, I just save a list of processed PacketData, then reinsert the packets into the avatar that I want to play back, releasing each one at the end. For the audio, I just use the default Unity recording. Everything comes out: the animation plays well and so does the audio. But as time passes, maybe after 10 seconds, the animation is no longer aligned with the audio that was recorded at the same time; the animation seems to play a bit slowly. Is there any way to fix this? It feels very sad, since I had already figured out a way to make it work (it may still not be the right way, though).

Meta Avatars Loading Black and White Without Any Textures
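For the playback-desync question above, one common approach (a hedged sketch, not the loopback sample's own method) is to stamp each recorded packet with the audio clock at capture time and schedule playback against that same clock, instead of replaying packets at a fixed per-frame rate. `OnPacketRecorded` and `ApplyPacket` are hypothetical hooks standing in for the sample's record/apply calls:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hedged sketch: pair each avatar packet with the audio DSP time at capture,
// then apply a packet only once the playback clock has caught up to it.
public class TimestampedPlayback : MonoBehaviour
{
    struct StampedPacket { public double time; public byte[] data; }

    readonly Queue<StampedPacket> _packets = new Queue<StampedPacket>();
    double _recordStart, _playbackStart;

    public void BeginRecording() { _recordStart = AudioSettings.dspTime; }

    // Call this wherever the sample records a packet (hypothetical hook).
    public void OnPacketRecorded(byte[] packetBytes)
    {
        _packets.Enqueue(new StampedPacket
        {
            time = AudioSettings.dspTime - _recordStart,
            data = packetBytes
        });
    }

    public void BeginPlayback() { _playbackStart = AudioSettings.dspTime; }

    void Update()
    {
        double elapsed = AudioSettings.dspTime - _playbackStart;
        // Apply every packet whose timestamp has passed; draining by time
        // lets the animation catch up instead of drifting behind the audio.
        while (_packets.Count > 0 && _packets.Peek().time <= elapsed)
        {
            ApplyPacket(_packets.Dequeue().data); // hypothetical apply call
        }
    }

    void ApplyPacket(byte[] data) { /* hand off to the avatar entity here */ }
}
```

Because both sides are measured against `AudioSettings.dspTime`, a slow frame delays packets but does not accumulate drift the way a fixed replay interval does.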
I am trying to load Meta avatars as local and remote players in a multiplayer environment established by Photon. I am able to load the Meta avatars as local and remote players in one room, but my Meta avatars are displayed black and white (i.e. without any shaders or textures). I am trying to instantiate the Meta avatar as the player when a player joins the room, using the user id. Below is my code:

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using Oculus.Avatar2;
    using Oculus.Platform;
    using Photon.Pun;
    using System;

    public class RemotePlayer : OvrAvatarEntity
    {
        [SerializeField] int m_avatarToUseInZipFolder = 2;
        PhotonView m_photonView;
        List<byte[]> m_streamedDataList = new List<byte[]>();
        int m_maxBytesToLog = 15;
        [SerializeField] ulong m_instantiationData;
        float m_cycleStartTime = 0;
        float m_intervalToSendData = 0.08f;

        protected override void Awake()
        {
            ConfigureAvatarEntity();
            base.Awake();
        }

        private void Start()
        {
            m_instantiationData = GetUserIdFromPhotonInstantiationData();
            _userId = m_instantiationData;
            StartCoroutine(TryToLoadUser());
        }

        void ConfigureAvatarEntity()
        {
            m_photonView = GetComponent<PhotonView>();
            if (m_photonView.IsMine)
            {
                SetIsLocal(true);
                _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Default;

                // setting body tracking input
                SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
                SetBodyTracking(sampleInputManager);

                // setting lip sync input
                OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
                SetLipSync(lipSyncInput);

                // setting face pose driver
                SampleFacePoseBehavior sampleFacePoseBehaviour = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
                SetFacePoseProvider(sampleFacePoseBehaviour);

                // setting eye pose driver
                SampleEyePoseBehavior sampleEyePoseBehavior = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
                SetEyePoseProvider(sampleEyePoseBehavior);

                gameObject.name = "MyAvatar";
            }
            else
            {
                SetIsLocal(false);
                _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Remote;

                // setting body tracking input
                SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
                SetBodyTracking(sampleInputManager);

                // setting lip sync input
                OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
                SetLipSync(lipSyncInput);

                // setting face pose driver
                SampleFacePoseBehavior sampleFacePoseBehaviour = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
                SetFacePoseProvider(sampleFacePoseBehaviour);

                // setting eye pose driver
                SampleEyePoseBehavior sampleEyePoseBehavior = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
                SetEyePoseProvider(sampleEyePoseBehavior);

                gameObject.name = "OtherAvatar";
            }
        }

        IEnumerator TryToLoadUser()
        {
            var hasAvatarRequest = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
            while (hasAvatarRequest.IsCompleted == false)
            {
                yield return null;
            }
            LoadUser();
        }

        private void LateUpdate()
        {
            float elapsedTime = Time.time - m_cycleStartTime;
            if (elapsedTime > m_intervalToSendData)
            {
                RecordAndSendStreamDataIfMine();
                m_cycleStartTime = Time.time;
            }
        }

        void RecordAndSendStreamDataIfMine()
        {
            if (m_photonView.IsMine)
            {
                byte[] bytes = RecordStreamData(activeStreamLod);
                m_photonView.RPC("RecieveStreamData", RpcTarget.Others, bytes);
            }
        }

        [PunRPC]
        public void RecieveStreamData(byte[] bytes)
        {
            m_streamedDataList.Add(bytes);
        }

        void LogFirstFewBytesOf(byte[] bytes)
        {
            for (int i = 0; i < m_maxBytesToLog; i++)
            {
                string bytesString = Convert.ToString(bytes[i], 2).PadLeft(8, '0');
            }
        }

        private void Update()
        {
            if (m_streamedDataList.Count > 0)
            {
                if (IsLocal == false)
                {
                    byte[] firstBytesInList = m_streamedDataList[0];
                    if (firstBytesInList != null)
                    {
                        ApplyStreamData(firstBytesInList);
                    }
                    m_streamedDataList.RemoveAt(0);
                }
            }
        }

        ulong GetUserIdFromPhotonInstantiationData()
        {
            PhotonView photonView = GetComponent<PhotonView>();
            object[] instantiationData = photonView.InstantiationData;
            Int64 data_as_int = (Int64)instantiationData[0];
            return Convert.ToUInt64(data_as_int);
        }
    }

Sync Meta Avatar with Hand controller
Hello, I am using the OVR Player Controller with OVR Hands, tracked by the device controller, not by the hand tracking system. As a next step, I add the new Meta avatar below the OVR Player Controller (exactly below the TrackingSpace transform), keeping the OVR Hands logic, because I need to interact with other objects. However, the OVR Hands hand model is not synced with (does not overlap) the Meta avatar hand. Is there a way to fix, or fine-tune, that issue? I cannot use the hand tracking system; I need to use the device controller for tracking hand rotation and position, but replace the controller model with the hand one. Thanks in advance for any help.

Using Meta Avatar with Normcore
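For the hand-alignment question above, one hedged approach (a sketch, not an official Meta recipe) is to snap the visible OVR hand model onto the avatar's wrist joint each frame and expose a tunable offset to absorb the differing bone conventions. Both transform references and the offset values are placeholders to be assigned and tuned in the Inspector:

```csharp
using UnityEngine;

// Hedged sketch: apply a hand-tuned local offset so the OVR hand model
// lines up with (overlaps) the avatar's wrist each frame.
public class HandModelAligner : MonoBehaviour
{
    [SerializeField] Transform ovrHandModel;      // the OVR Hands visual
    [SerializeField] Transform avatarWrist;       // the avatar's wrist joint
    [SerializeField] Vector3 positionOffset;      // tune in the Inspector
    [SerializeField] Vector3 rotationOffsetEuler; // tune in the Inspector

    void LateUpdate()
    {
        if (ovrHandModel == null || avatarWrist == null) return;
        // Snap the visible hand onto the avatar wrist, then nudge it by the
        // tuned offset to account for differing bone orientations.
        ovrHandModel.SetPositionAndRotation(
            avatarWrist.TransformPoint(positionOffset),
            avatarWrist.rotation * Quaternion.Euler(rotationOffsetEuler));
    }
}
```

Running in LateUpdate means the alignment happens after both the controller tracking and the avatar's own pose update for the frame.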
Hello, I'm currently trying to use Normcore for a multiplayer game with the Meta Avatar SDK. Is it possible to network my Meta avatars, with eye and face tracking, using Normcore (I'm using a Meta Quest Pro)? Or should I use a different solution like Photon instead?
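In principle, Meta avatars are networked by serializing RecordStreamData() bytes on the owning client and applying them with ApplyStreamData() on remote peers, so any transport that can carry a byte[] should work, Normcore included. Below is a hedged sketch of a Normcore model for this; the property syntax is assumed from Normcore's model codegen and should be verified against Normcore's documentation:

```csharp
using Normal.Realtime;

// Hedged sketch of a Normcore model carrying the avatar stream bytes.
// reliable: false — pose packets are transient, latest-wins.
[RealtimeModel]
public partial class AvatarStreamModel
{
    [RealtimeProperty(1, false)]
    private byte[] _streamData;
}

// On the owning client, each send interval (hypothetical wiring):
//   _model.streamData = avatarEntity.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);
// On remote clients, when the property changes:
//   avatarEntity.ApplyStreamData(_model.streamData);
```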