Meta Avatars 2 Lipsync - PUN 2 & Photon Voice issue!
Dear Devs, I've been struggling with a problem for a week. I use PUN 2 and Photon Voice to bring Meta Avatars 2 into a multiplayer environment. My issues are:

1. When there is no Photon Voice setup in the scene, the Meta Avatars lipsync works perfectly in the Photon multiplayer room.
2. When I add Photon Voice to the prefab and set up the scene with a Photon Voice Network, only the voice comes through; the Meta Avatars lipsync does not work.

I suspect there is a race condition between these two plugins. If anyone has already resolved this problem, please help me do the same; this thread can help other devs in the future as well. Thanks!

Meta Avatars SDK v29.0 - Avatar 2.0 Half-body manifestation for 3rd person
Hello, We would like to upgrade our project to the latest Avatar SDK and use Avatars 2.0. Our current app design requires a half-body avatar in third-person view. However, this is not currently possible, as the manifestation parameter no longer affects the appearance of the avatars. Switching to first-person view shows an avatar without legs and without a head, which is not suitable for third-person view. Is there a workaround to display the Avatar 2.0 with both a half body and a head at the same time? Is there a plan to support this in Avatars 2.0? Thank you!

Integrating Meta Avatars with Application Spacewarp
I am working on an application that requires the use of Application Spacewarp and the Meta Avatars 2 SDK. In testing, I have determined that the Meta Avatar shaders are not generating the motion vectors Application Spacewarp needs to correctly render its synthesized frames. The rendering artifact that clued me into the issue was seeing the opaque Meta Avatars stuttering; according to Meta's Application Spacewarp sample git repo (https://github.com/oculus-samples/unity-appspacewarp), this is a sign that the Meta Avatar shader is not generating motion vectors. Has anyone worked with Application Spacewarp and the Meta Avatars SDK 2 who could advise me on how to address this issue? Any and all help would be greatly appreciated! I do have a possible initial lead: the Meta Avatars SDK 2 has a Recommended folder with an "app_specific" subfolder containing two files, app-declarations and app_functions. These seem to exist for the purpose of adding app-specific functionality to the Meta Avatar shader code. I am going to experiment with them, but I don't have much experience with this kind of thing, so if the right people find this, any and all guidance on this point would also be appreciated!

Meta Avatar SDK Unity Mac Support
Hello, I am building a multiplayer app on Meta Quest 3 and using Meta Avatars 2 to show the avatars. Can synced avatars also be displayed in a Mac build? I want to make two different apps, one for Quest and the other running on Mac. Is avatar syncing possible on both platforms? Thanks!

Add animation to Avatar in Unity
I want to use avatars from the Meta SDK as NPCs in an experience that I'm developing, using some Mixamo animations to make them look more alive. However, I can't find a way to add an Animator controller to the OVRAvatarEntities. Has anyone done this?

Remote Meta Avatar Loading with Invalid Access Token Locally
Hi, when a local user has an invalid access token and wants to load a remote avatar, that avatar is broken (the head and arms are stuck). This should not happen, because the remote user has an avatar and a valid token as well, so the network info should be fine. However, I found that if the local user has an invalid access token, the remote avatar is not loaded; it is replaced by the default (broken) one. I found a workaround, sending the remote user's access token through the network properties, but it's not a correct solution. Has anyone faced this issue? Thanks in advance for any help.

Local Meta Avatars T-posing in front of Camera
Hi, I've tried implementing networked avatars (using PUN) in a Unity VR app, but no matter what I try, my local avatar always ends up T-posing sideways, without any animations, in front of the main camera rig. Everything is as it's supposed to be: entitlement checks working, deploying on the alpha channel, adding test users. I ended up getting this Unity asset to cross-reference and see what I was doing wrong, but even in their implementation the local avatar is T-posing outside first-person view, the same problem as mine: https://assetstore.unity.com/packages/tools/network/meta-avatars-pun2-vr-multiplayer-template-211918 Any insight would be appreciated!

Meta Avatars Loading Black and White without any textures
I am trying to load Meta Avatars as local and remote players in a multiplayer environment established by Photon. I am able to load the avatars as local and remote players in one room, but they are displayed black and white (i.e., without any shaders or textures). I am instantiating the Meta Avatar as the player when a player joins the room, using the user ID. Below is my code:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Oculus.Avatar2;
using Oculus.Platform;
using Photon.Pun;

public class RemotePlayer : OvrAvatarEntity
{
    [SerializeField] int m_avatarToUseInZipFolder = 2;
    [SerializeField] ulong m_instantiationData;
    PhotonView m_photonView;
    List<byte[]> m_streamedDataList = new List<byte[]>();
    int m_maxBytesToLog = 15;
    float m_cycleStartTime = 0;
    float m_intervalToSendData = 0.08f;

    protected override void Awake()
    {
        ConfigureAvatarEntity();
        base.Awake();
    }

    private void Start()
    {
        m_instantiationData = GetUserIdFromPhotonInstantiationData();
        _userId = m_instantiationData;
        StartCoroutine(TryToLoadUser());
    }

    void ConfigureAvatarEntity()
    {
        m_photonView = GetComponent<PhotonView>();
        if (m_photonView.IsMine)
        {
            SetIsLocal(true);
            _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Default;
            gameObject.name = "MyAvatar";
        }
        else
        {
            SetIsLocal(false);
            _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Remote;
            gameObject.name = "OtherAvatar";
        }

        // Body tracking, lip sync, face pose, and eye pose inputs are wired up
        // the same way for both local and remote avatars.
        SampleInputManager sampleInputManager =
            OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
        SetBodyTracking(sampleInputManager);

        OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
        SetLipSync(lipSyncInput);

        SampleFacePoseBehavior sampleFacePoseBehavior =
            OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
        SetFacePoseProvider(sampleFacePoseBehavior);

        SampleEyePoseBehavior sampleEyePoseBehavior =
            OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
        SetEyePoseProvider(sampleEyePoseBehavior);
    }

    IEnumerator TryToLoadUser()
    {
        var hasAvatarRequest = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
        while (hasAvatarRequest.IsCompleted == false)
        {
            yield return null;
        }
        LoadUser();
    }

    private void LateUpdate()
    {
        // Throttle streaming so pose data is only sent every m_intervalToSendData seconds.
        float elapsedTime = Time.time - m_cycleStartTime;
        if (elapsedTime > m_intervalToSendData)
        {
            RecordAndSendStreamDataIfMine();
            m_cycleStartTime = Time.time;
        }
    }

    void RecordAndSendStreamDataIfMine()
    {
        if (m_photonView.IsMine)
        {
            byte[] bytes = RecordStreamData(activeStreamLod);
            m_photonView.RPC("ReceiveStreamData", RpcTarget.Others, bytes);
        }
    }

    [PunRPC]
    public void ReceiveStreamData(byte[] bytes)
    {
        m_streamedDataList.Add(bytes);
    }

    void LogFirstFewBytesOf(byte[] bytes)
    {
        for (int i = 0; i < m_maxBytesToLog; i++)
        {
            string bytesString = Convert.ToString(bytes[i], 2).PadLeft(8, '0');
            Debug.Log(bytesString);
        }
    }

    private void Update()
    {
        // Apply the oldest buffered pose packet to remote avatars.
        if (m_streamedDataList.Count > 0 && IsLocal == false)
        {
            byte[] firstBytesInList = m_streamedDataList[0];
            if (firstBytesInList != null)
            {
                ApplyStreamData(firstBytesInList);
            }
            m_streamedDataList.RemoveAt(0);
        }
    }

    ulong GetUserIdFromPhotonInstantiationData()
    {
        PhotonView photonView = GetComponent<PhotonView>();
        object[] instantiationData = photonView.InstantiationData;
        Int64 data_as_int = (Int64)instantiationData[0];
        return Convert.ToUInt64(data_as_int);
    }
}
```

How to connect a Wrist UI to a Meta Avatar?
I have a wrist UI currently attached to LeftHandAnchor. Unfortunately, with this I have to use fixed offsets and sizes. How can I connect it to the Meta Avatar so that I can place it just above the skin of the wrist, like a smartwatch, and even size it according to the wrist diameter?
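One approach that has worked in similar setups is to drive the UI from the avatar's wrist joint rather than from LeftHandAnchor, so the UI follows the rendered skeleton instead of the tracking rig. The sketch below is a hypothetical helper, not a verified implementation: the class and field names are invented, and it assumes your Avatars SDK version exposes OvrAvatarEntity.GetSkeletonTransform with the LeftHandWrist critical joint enabled in the entity's Critical Joint Types list.

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Hypothetical helper: keeps a UI object attached to the avatar's left wrist
// joint. Assumes LeftHandWrist is among the entity's critical joints and that
// this SDK version provides OvrAvatarEntity.GetSkeletonTransform.
public class WristUiAttacher : MonoBehaviour
{
    [SerializeField] private OvrAvatarEntity m_avatarEntity;
    [SerializeField] private Transform m_wristUi;
    // Offset in the wrist joint's local space, lifting the UI just above the skin.
    [SerializeField] private Vector3 m_localOffset = new Vector3(0f, 0.03f, 0f);

    private Transform m_wristJoint;

    private void LateUpdate()
    {
        // The joint transform only exists after the avatar skeleton has loaded,
        // so look it up lazily rather than in Start().
        if (m_wristJoint == null)
        {
            m_wristJoint = m_avatarEntity.GetSkeletonTransform(
                CAPI.ovrAvatar2JointType.LeftHandWrist);
            if (m_wristJoint == null) return;
        }

        // Follow the wrist each frame, in LateUpdate so the avatar pose for
        // this frame has already been applied.
        m_wristUi.position = m_wristJoint.TransformPoint(m_localOffset);
        m_wristUi.rotation = m_wristJoint.rotation;
    }
}
```

As far as I know the SDK does not expose a wrist-diameter measurement, so sizing the UI to the wrist would still need a manually tuned scale per avatar rather than something derived from the mesh.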