Meta Avatars 2 Lipsync - PUN 2 & Photon Voice issue!
Dear devs, I've been struggling with a problem for a week. I'm using PUN 2 and Photon Voice to bring Meta Avatars 2 into a multiplayer environment. Here are my issues:

1. When there is no Photon Voice setup in the scene, the Meta Avatars lipsync works perfectly in the Photon multiplayer room.
2. When I add Photon Voice to the prefab and set up the scene with a Photon Voice Network, only the voice comes through; the Meta Avatars lipsync does not work.

I suspect there is a race condition between these two plugins. If anyone has already resolved such a problem, please kindly help. This thread can help other devs in the future as well. Thanks!

Network face tracking Meta avatars
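One workaround pattern (a sketch, not an official fix) is to let only one plugin own the microphone and forward the captured audio frames to the avatar's lipsync context, so Photon Voice and the Avatar SDK don't compete for the device. `ProcessAudioSamples` is the method the Avatar2 sample lipsync context exposes as far as I know, but verify it against your SDK version; the place you call `OnAudioFrame` from (e.g. a custom processor in the Photon Voice `Recorder` pipeline) is an assumption:

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Forwards captured microphone frames to the Meta Avatars lipsync context so
// Photon Voice and the Avatar SDK do not both try to open the microphone.
public class VoiceToLipSyncBridge : MonoBehaviour
{
    [SerializeField] private OvrAvatarLipSyncContext lipSyncContext;

    // Call this from wherever Photon Voice hands you the outgoing audio frame
    // (for example, a custom processor inserted into the Recorder's pipeline).
    public void OnAudioFrame(float[] samples, int channels)
    {
        if (lipSyncContext != null)
        {
            // Assumption: ProcessAudioSamples accepts interleaved float PCM,
            // as in the SDK's OnAudioFilterRead-based sample. Check your
            // SDK version before relying on this.
            lipSyncContext.ProcessAudioSamples(samples, channels);
        }
    }
}
```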
I have a working Unity app using Meta Avatars. Lipsync works well: other networked players see my face moving, so the data is being received correctly. But when I introduced face tracking, it only works locally. Nobody sees my face tracking except me. I'm using the pair of sample avatar entities (local and remote) on my network player prefab. Everything works except sending face tracking. Can you help me?

Failed to retrieve Avatar Specification for user id (http response code: 400)
Hi, I know this problem has been discussed in other posts, but none of them had a solution. I'm trying to create a simple multiplayer app to test some interactions. I have the needed Rift and App Lab apps, and there's a build on the alpha channel with some users added for access. When people enter the app, the local avatar spawns correctly, but the remote one doesn't. The errors that appear in the console are the following:

[ovrAvatar2 native] stats::url file transfer failed: http response code: 400 uri: https://graph.oculus.com/7413158262106467?fields=id,avatar_v3%7Bmodel.profile(rift_v04b).platform(pc).sdk_version(Avatar2%20runtime%20SDK%2020.3.0.17.0%20client%20SDK%2020.3.0.17.0).client_version(0.1.0%2B2022.3.4f1).client_name(<project_name>){url,id,creation_time,animation_set}%7D error: No error error details: trace-id: CTJBp1WJMFP
[ovrAvatar2 native] specification::Failed to retrieve Avatar Specification for user id <user_id>
[ovrAvatar2 manager] UserHasAvatarAsync completed the request but the result was NotFound

I suppose the last two errors are consequences of the first one, but I can't make it work. The related posts that talk about this problem are the following:

https://communityforums.atmeta.com/t5/General-Development/Meta-Avatar-load-fails-with-Failed-to-retrieve-Avatar/td-p/1029689
https://communityforums.atmeta.com/t5/Unity-VR-Development/UserHasAvatarAsync-completed-the-request-but-the-result-was/td-p/1111501

Tracking position of meta avatar Right hand wrist manually
I need to update the transform of the Meta Avatar right-hand wrist joint based on another game object, instead of letting it update automatically. I played with the wrist offset to make both transforms identical, but it's not working as expected. How can I make the wrist transform match a particular object?

Add animation to Avatar in Unity
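One approach (a sketch under assumptions, not a confirmed solution) is to let the avatar update normally and then overwrite the wrist joint in `LateUpdate`, so your write lands after the tracking system's write each frame. This assumes `RightHandWrist` is registered as a critical joint on the entity and that `GetSkeletonTransform` returns a live Unity transform for it; both are from the Avatar2 SDK as I understand it, but verify against your SDK version. `targetObject` is your own object:

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Overwrites the avatar's right-hand wrist joint with another object's pose.
// Runs in LateUpdate so it executes after the avatar's own tracking update.
public class WristFollower : MonoBehaviour
{
    [SerializeField] private OvrAvatarEntity avatarEntity;
    [SerializeField] private Transform targetObject; // the object to follow

    private void LateUpdate()
    {
        // Assumption: RightHandWrist is a critical joint on this entity, so
        // GetSkeletonTransform returns a transform the SDK keeps updated.
        Transform wrist = avatarEntity.GetSkeletonTransform(
            CAPI.ovrAvatar2JointType.RightHandWrist);
        if (wrist != null)
        {
            wrist.SetPositionAndRotation(targetObject.position,
                                         targetObject.rotation);
        }
    }
}
```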
I want to use avatars from the Meta SDK as NPCs in an experience I'm developing, using some Mixamo animations to make them look more alive. However, I can't find a way to add an Animator controller to the OvrAvatarEntities. Has anyone done this?

Trouble with Oculus Entitlement Check in Unity: Error on Device, Fine in Editor
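Meta Avatar entities are not driven by a Unity Animator; they are driven by tracking inputs. So one route for NPCs is to feed the entity synthetic headset and controller poses sampled from the bones of a Mixamo-animated rig. This is a heavily hedged sketch: `IOvrAvatarInputTrackingDelegate` and `OvrAvatarInputTrackingState` are from the SDK samples as I recall them, but the interface, method, and field names may differ across SDK versions, and the `ovrAvatar2Transform` construction is an assumption:

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Feeds an avatar entity synthetic tracking data sampled from the bones of a
// Mixamo-animated rig, instead of a real headset and controllers.
public class NpcInputDelegate : IOvrAvatarInputTrackingDelegate
{
    private readonly Transform _head, _leftHand, _rightHand;

    public NpcInputDelegate(Transform head, Transform leftHand, Transform rightHand)
    {
        _head = head;
        _leftHand = leftHand;
        _rightHand = rightHand;
    }

    public bool GetRawInputTrackingState(out OvrAvatarInputTrackingState state)
    {
        state = default;
        state.headsetActive = true;
        state.leftControllerActive = true;
        state.rightControllerActive = true;
        // Assumption: ovrAvatar2Transform can be built from a Unity position
        // and rotation via the SDK's conversion helpers; check your version.
        state.headset = new CAPI.ovrAvatar2Transform(_head.position, _head.rotation);
        state.leftController = new CAPI.ovrAvatar2Transform(_leftHand.position, _leftHand.rotation);
        state.rightController = new CAPI.ovrAvatar2Transform(_rightHand.position, _rightHand.rotation);
        return true;
    }
}
```

The delegate would then be assigned wherever your input manager exposes it, in place of the headset-backed delegate the samples install.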
Hey everyone, I've encountered an issue with my project and was hoping someone could assist me. I'm working on a project in Unity where I'm trying to load a personal avatar and pass the entitlement check. Initially everything ran smoothly, but all of a sudden I started encountering the 'oculus.platform.models.error' error. I updated the Oculus app and the issue seemed to resolve itself, as it started working again in the Unity editor. However, when I created the APK and tried it on my Oculus device, the same error popped up again. Strangely, the error now only occurs on my Oculus device, while it still works fine in the editor. And it doesn't happen at all for my friend. Here's the code snippet I'm using for the entitlement check. Could someone kindly assist me in debugging this? Thanks!

```csharp
using System;
using Oculus.Avatar2;
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

public class UserEntitlement : MonoBehaviour
{
    [SerializeField] private ulong OculusID;
    private Action OnEntitlementGranted;

    private void Awake() => EntitlementCheck();

    private void EntitlementCheck()
    {
        try
        {
            Core.AsyncInitialize();
            Entitlements.IsUserEntitledToApplication().OnComplete(IsUserEntitledToApplicationComplete);
        }
        catch (UnityException e)
        {
            Debug.LogError("Platform failed to initialize due to exception.");
            Debug.LogException(e);
        }
    }

    private void IsUserEntitledToApplicationComplete(Message message)
    {
        if (message.IsError)
        {
            Debug.LogError(message.GetError());
            return;
        }
        Debug.Log("You are entitled to use this app.");
        Users.GetAccessToken().OnComplete(GetAccessTokenComplete);
    }

    private void GetAccessTokenComplete(Message<string> message)
    {
        if (message.IsError)
        {
            Debug.LogError(message.GetError());
            return;
        }
        OvrAvatarEntitlement.SetAccessToken(message.Data);
        Users.GetLoggedInUser().OnComplete(GetLoggedInUserComplete);
    }

    private void GetLoggedInUserComplete(Message<User> message)
    {
        if (message.IsError)
        {
            Debug.LogError(message.GetError());
            return;
        }
        OculusID = message.Data.ID;
        OnEntitlementGranted?.Invoke();
    }
}
```

Meta Avatars Loading Black and White without any textures
I am trying to load Meta Avatars as the local and remote players in a multiplayer environment established with Photon. I am able to load the avatars as local and remote players in one room, but my avatars are displayed in black and white, i.e. without any shaders or textures. I instantiate the Meta Avatar as the player when a player joins the room, using the user ID. Below is my code:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Oculus.Avatar2;
using Oculus.Platform;
using Photon.Pun;

public class RemotePlayer : OvrAvatarEntity
{
    [SerializeField] int m_avatarToUseInZipFolder = 2;
    PhotonView m_photonView;
    List<byte[]> m_streamedDataList = new List<byte[]>();
    int m_maxBytesToLog = 15;
    [SerializeField] ulong m_instantiationData;
    float m_cycleStartTime = 0;
    float m_intervalToSendData = 0.08f;

    protected override void Awake()
    {
        ConfigureAvatarEntity();
        base.Awake();
    }

    private void Start()
    {
        m_instantiationData = GetUserIdFromPhotonInstantiationData();
        _userId = m_instantiationData;
        StartCoroutine(TryToLoadUser());
    }

    void ConfigureAvatarEntity()
    {
        m_photonView = GetComponent<PhotonView>();
        bool isMine = m_photonView.IsMine;

        SetIsLocal(isMine);
        _creationInfo.features = isMine
            ? CAPI.ovrAvatar2EntityFeatures.Preset_Default
            : CAPI.ovrAvatar2EntityFeatures.Preset_Remote;
        gameObject.name = isMine ? "MyAvatar" : "OtherAvatar";

        // Body tracking, lipsync, face pose, and eye pose are wired up the
        // same way for both the local and the remote entity.
        var sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
        SetBodyTracking(sampleInputManager);

        var lipSyncInput = FindObjectOfType<OvrAvatarLipSyncContext>();
        SetLipSync(lipSyncInput);

        var facePose = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
        SetFacePoseProvider(facePose);

        var eyePose = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
        SetEyePoseProvider(eyePose);
    }

    IEnumerator TryToLoadUser()
    {
        var hasAvatarRequest = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
        while (hasAvatarRequest.IsCompleted == false)
        {
            yield return null;
        }
        LoadUser();
    }

    private void LateUpdate()
    {
        float elapsedTime = Time.time - m_cycleStartTime;
        if (elapsedTime > m_intervalToSendData)
        {
            RecordAndSendStreamDataIfMine();
            m_cycleStartTime = Time.time;
        }
    }

    void RecordAndSendStreamDataIfMine()
    {
        if (m_photonView.IsMine)
        {
            byte[] bytes = RecordStreamData(activeStreamLod);
            m_photonView.RPC("RecieveStreamData", RpcTarget.Others, bytes);
        }
    }

    [PunRPC]
    public void RecieveStreamData(byte[] bytes)
    {
        m_streamedDataList.Add(bytes);
    }

    void LogFirstFewBytesOf(byte[] bytes)
    {
        for (int i = 0; i < m_maxBytesToLog; i++)
        {
            string bytesString = Convert.ToString(bytes[i], 2).PadLeft(8, '0');
            Debug.Log(bytesString);
        }
    }

    private void Update()
    {
        if (m_streamedDataList.Count > 0 && IsLocal == false)
        {
            byte[] firstBytesInList = m_streamedDataList[0];
            if (firstBytesInList != null)
            {
                ApplyStreamData(firstBytesInList);
            }
            m_streamedDataList.RemoveAt(0);
        }
    }

    ulong GetUserIdFromPhotonInstantiationData()
    {
        object[] instantiationData = GetComponent<PhotonView>().InstantiationData;
        return Convert.ToUInt64((Int64)instantiationData[0]);
    }
}
```

Meta Avatar and Interaction SDK Hand Tracking Positioning Issue
Issue: while using both Meta Avatars and the Hand Interaction SDK, my fingers don't align in certain positions. Pressing a button on a canvas with only the Meta Avatar hands visible is inaccurate and leads to misses or accidental button clicks. I understand I can pass custom data to the Interaction SDK hands, but that seems like a lot of trouble. Here are examples: certain angles are worse than others, while other angles seem to be aligned well.

Using avatar SDK without oculus desktop app
We are working on a mixed reality application where we want to use Meta Avatars. Alongside that, we are also working on a small desktop companion app from which we would like to access those avatars. Our question: is it possible to access a user's (and other users') avatars on desktop without having the Oculus desktop app installed?

Meta Avatar Critical Joint Transforms go to zero position after calling LoadUser()
Our application depends on attaching objects to each avatar's hands and head. Unfortunately, this means we can't use LoadUser() to update an avatar's appearance when it changes, because there seems to be a bug where the critical joint transforms fall to the local zero position when LoadUser() is called, so we no longer know where the user's wrists and head are. Here are the steps to reproduce: