Errors After Adding the Meta Avatars SDK Sample Assets (v29.7.0) to Clean Project
I am getting the following errors when adding the new Meta Avatars SDK Sample Assets to a clean project using Unity 2022.3.22f1:

Assets\Samples\Meta Avatars SDK\29.7.0\Sample Scenes\Scripts\UI\UILogger.cs(367,17): error CS0246: The type or namespace name 'UIInputControllerButton' could not be found (are you missing a using directive or an assembly reference?)

Failed to find entry-points: Assets\Samples\Meta Avatars SDK\29.7.0\Sample Scenes\Scripts\UI\UILogger.cs(367,17): error CS0246: The type or namespace name 'UIInputControllerButton' could not be found (are you missing a using directive or an assembly reference?)

What steps can I take to resolve these errors?
Meta Avatars SDK v29.0 - Avatar 2.0 Half-Body Manifestation for Third Person
Hello, we would like to upgrade our project to the latest Avatar SDK and use Avatars 2.0. Our current app design requires a half-body avatar in third-person view. However, this is not currently possible because the manifestation parameter no longer affects the appearance of the avatars. Switching to first-person view shows an avatar without legs and without a head, which is not suitable for third-person view. Is there a workaround to display an Avatar 2.0 with both a half body and a head at the same time? Is there a plan to support this in Avatars 2.0? Thank you!
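In case it helps anyone investigating a workaround: the half-body request is normally expressed through the entity's creation-info render filters. The sketch below is only a minimal illustration, assuming the ovrAvatar2EntityManifestationFlags.Half and ovrAvatar2EntityViewFlags.ThirdPerson enum members from earlier SDK versions are still exposed; as the post above notes, Avatars 2.0 assets may simply ignore the manifestation flag, which is exactly the behaviour in question.

using Oculus.Avatar2;

// Minimal sketch: request a half-body, third-person avatar on an OvrAvatarEntity subclass.
// The enum members used here come from earlier SDK versions and may be ignored by
// Avatars 2.0 assets.
public class ThirdPersonHalfBodyAvatar : OvrAvatarEntity
{
    protected override void Awake()
    {
        _creationInfo.renderFilters.manifestationFlags = CAPI.ovrAvatar2EntityManifestationFlags.Half;
        _creationInfo.renderFilters.viewFlags = CAPI.ovrAvatar2EntityViewFlags.ThirdPerson;
        base.Awake();
    }
}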
Issues Importing Meta Avatars SDK Samples v24.1.1 to Unity
Hi, how do I import the samples for Meta Avatars SDK v24.1.1? I'm trying to import the samples for Meta Avatars SDK v24.1 and All-in-One v62. However, the Meta Avatars SDK Sample Assets folder in Packages and the Oculus folder in Assets contain only some empty folders and zip files: no prefabs, no scenes, nothing. The MetaAvatarsSDK menu also does not provide anything useful. I have followed the instructions on the website and I'm completely stuck on what the issue could be. I have tried v24 as well; the Meta Avatars SDK folder contains scripts, but the core assets are inside a zip file. I have tried multiple fresh Unity versions on multiple separate machines. Anyone with the same issue? Any help would be appreciated.
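If the importer really is leaving only zip archives behind, one manual workaround is a small editor utility that unzips everything under the sample folder and refreshes the asset database. This is a sketch only: it assumes the archives just need extracting in place, the folder path is illustrative, and the SDK's own extraction menu item (if your version has one) should be preferred.

#if UNITY_EDITOR
using System.IO;
using System.IO.Compression;
using UnityEditor;
using UnityEngine;

public static class SampleZipExtractor
{
    // Illustrative root; adjust to wherever the zipped sample assets actually landed.
    const string k_SampleRoot = "Assets/Samples/Meta Avatars SDK";

    [MenuItem("Tools/Extract Meta Avatar Sample Zips")]
    static void ExtractSampleZips()
    {
        foreach (string zipPath in Directory.GetFiles(k_SampleRoot, "*.zip", SearchOption.AllDirectories))
        {
            string targetDir = Path.GetDirectoryName(zipPath);
            Debug.Log($"Extracting {zipPath} into {targetDir}");
            // Throws if files already exist at the destination, so clear partial extractions first.
            ZipFile.ExtractToDirectory(zipPath, targetDir);
        }
        AssetDatabase.Refresh();
    }
}
#endif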
Integrating Meta Avatars with Application Spacewarp
I am working on an application that requires both Application Spacewarp and the Meta Avatars 2 SDK. In testing, I have determined that the Meta Avatar shaders are not writing motion vectors for Application Spacewarp to use when rendering its synthesized frames. The rendering artifact that clued me in was the opaque Meta Avatars stuttering; according to Meta's Application Spacewarp sample repo (https://github.com/oculus-samples/unity-appspacewarp), this is a sign that the Meta Avatar shader is not generating motion vectors. Has anyone worked with Application Spacewarp and the Meta Avatars SDK 2 who could advise me on how to address this? Any and all help would be greatly appreciated! I do have a possible initial lead: the Meta Avatars SDK 2 has a Recommended folder with an "app_specific" subfolder containing two files, app_declarations and app_functions, which appear to exist so that app-specific functionality can be added to the Meta Avatar shader code. I am going to experiment with them, but I don't have much experience with this kind of thing, so any guidance on this point would also be appreciated!
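Before editing the shader includes, it may be worth ruling out the renderer-level setting: Unity only writes per-object motion vectors for renderers whose motionVectorGenerationMode is set to Object. A minimal sketch, assuming the avatar's meshes end up as renderers under the entity's hierarchy at runtime:

using UnityEngine;

// Sketch: force per-object motion vectors on every renderer the avatar spawns at runtime.
// This does not touch the avatar shader's motion-vector pass; it only rules out the
// renderer-level MotionVectorGenerationMode setting as the cause of the stutter.
public class EnableAvatarMotionVectors : MonoBehaviour
{
    void LateUpdate()
    {
        // Avatar renderers are created and swapped at runtime, so re-apply each frame.
        foreach (var r in GetComponentsInChildren<Renderer>(includeInactive: true))
        {
            r.motionVectorGenerationMode = MotionVectorGenerationMode.Object;
        }
    }
}

If this changes nothing, the problem most likely sits in the shader itself, which is where the app_declarations/app_functions hooks mentioned above would come in.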
Failed to retrieve Avatar Specification for user id (http response code: 400)
Hi, I know this problem has been discussed in other posts, but none of them had a solution. I'm trying to create a simple multiplayer app to test some interactions. I have the required Rift and App Lab apps, and there's a build on the alpha channel with some users added for access. When people enter the app, the local avatar spawns correctly, but the remote one doesn't. The errors that appear in the console are the following:

[ovrAvatar2 native] stats::url file transfer failed: http response code: 400 uri: https://graph.oculus.com/7413158262106467?fields=id,avatar_v3%7Bmodel.profile(rift_v04b).platform(pc).sdk_version(Avatar2%20runtime%20SDK%2020.3.0.17.0%20client%20SDK%2020.3.0.17.0).client_version(0.1.0%2B2022.3.4f1).client_name(<project_name>){url,id,creation_time,animation_set}%7D error: No error error details: trace-id: CTJBp1WJMFP

[ovrAvatar2 native] specification::Failed to retrieve Avatar Specification for user id <user_id>

[ovrAvatar2 manager] UserHasAvatarAsync completed the request but the result was NotFound

I suppose the last two errors are consequences of the first one, but I can't make it work. The related posts that talk about this problem are the following:

https://communityforums.atmeta.com/t5/General-Development/Meta-Avatar-load-fails-with-Failed-to-retrieve-Avatar/td-p/1029689
https://communityforums.atmeta.com/t5/Unity-VR-Development/UserHasAvatarAsync-completed-the-request-but-the-result-was/td-p/1111501
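One frequent cause of this 400 (hedged, since the post doesn't show how the user id is obtained) is passing something other than the entitled Oculus user id of the avatar's owner, for example a networking-layer player id or a hard-coded value. Below is a minimal sketch of fetching the local user's id from the Platform SDK before loading, under the assumption that platform initialization and entitlement succeed; the remote client must receive this exact id through your networking layer.

using System.Collections;
using Oculus.Avatar2;
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

// Sketch: resolve the local user's Oculus id via the Platform SDK, then load the avatar.
// Loading a remote avatar with any value other than the owner's real Oculus id tends to
// produce exactly the 400/NotFound responses quoted above.
public class LocalAvatarLoader : OvrAvatarEntity
{
    private void Start()
    {
        Core.AsyncInitialize();
        Entitlements.IsUserEntitledToApplication().OnComplete(entitlement =>
        {
            if (entitlement.IsError) { Debug.LogError("User is not entitled to this app."); return; }
            Users.GetLoggedInUser().OnComplete(msg =>
            {
                if (msg.IsError) { Debug.LogError(msg.GetError().Message); return; }
                _userId = msg.Data.ID;            // the id to replicate to remote clients
                StartCoroutine(LoadWhenReady());
            });
        });
    }

    private IEnumerator LoadWhenReady()
    {
        var request = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
        while (!request.IsCompleted) yield return null;
        LoadUser();
    }
}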
Meta Avatars Loading Black and White without any textures
I am trying to load Meta Avatars as local and remote players in a multiplayer environment established with Photon. I am able to load the Meta Avatars as local and remote players in one room, but my avatars are displayed black and white (i.e. without any shaders or textures). I am trying to instantiate the Meta Avatar as the player when a player joins the room, using the user id. Below is my code:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Oculus.Avatar2;
using Oculus.Platform;
using Photon.Pun;
using System;

public class RemotePlayer : OvrAvatarEntity
{
    [SerializeField] int m_avatarToUseInZipFolder = 2;
    PhotonView m_photonView;
    List<byte[]> m_streamedDataList = new List<byte[]>();
    int m_maxBytesToLog = 15;
    [SerializeField] ulong m_instantiationData;
    float m_cycleStartTime = 0;
    float m_intervalToSendData = 0.08f;

    protected override void Awake()
    {
        ConfigureAvatarEntity();
        base.Awake();
    }

    private void Start()
    {
        m_instantiationData = GetUserIdFromPhotonInstantiationData();
        _userId = m_instantiationData;
        StartCoroutine(TryToLoadUser());
    }

    void ConfigureAvatarEntity()
    {
        m_photonView = GetComponent<PhotonView>();
        if (m_photonView.IsMine)
        {
            SetIsLocal(true);
            _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Default;

            //setting body tracking input
            SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
            SetBodyTracking(sampleInputManager);

            //setting lip sync input
            OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
            SetLipSync(lipSyncInput);

            //setting face pose driver
            SampleFacePoseBehavior sampleFacePoseBehaviour = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
            SetFacePoseProvider(sampleFacePoseBehaviour);

            //setting eye pose driver
            SampleEyePoseBehavior sampleEyePoseBehavior = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
            SetEyePoseProvider(sampleEyePoseBehavior);

            gameObject.name = "MyAvatar";
        }
        else
        {
            SetIsLocal(false);
            _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Remote;

            //setting body tracking input
            SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
            SetBodyTracking(sampleInputManager);

            //setting lip sync input
            OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
            SetLipSync(lipSyncInput);

            //setting face pose driver
            SampleFacePoseBehavior sampleFacePoseBehaviour = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
            SetFacePoseProvider(sampleFacePoseBehaviour);

            //setting eye pose driver
            SampleEyePoseBehavior sampleEyePoseBehavior = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
            SetEyePoseProvider(sampleEyePoseBehavior);

            gameObject.name = "OtherAvatar";
        }
    }

    IEnumerator TryToLoadUser()
    {
        var hasAvatarRequest = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
        while (hasAvatarRequest.IsCompleted == false)
        {
            yield return null;
        }
        LoadUser();
    }

    private void LateUpdate()
    {
        float elapsedTime = Time.time - m_cycleStartTime;
        if (elapsedTime > m_intervalToSendData)
        {
            RecordAndSendStreamDataIfMine();
            m_cycleStartTime = Time.time;
        }
    }

    void RecordAndSendStreamDataIfMine()
    {
        if (m_photonView.IsMine)
        {
            byte[] bytes = RecordStreamData(activeStreamLod);
            m_photonView.RPC("RecieveStreamData", RpcTarget.Others, bytes);
        }
    }

    [PunRPC]
    public void RecieveStreamData(byte[] bytes)
    {
        m_streamedDataList.Add(bytes);
    }

    void LogFirstFewBytesOf(byte[] bytes)
    {
        for (int i = 0; i < m_maxBytesToLog; i++)
        {
            string bytesString = Convert.ToString(bytes[i], 2).PadLeft(8, '0');
        }
    }

    private void Update()
    {
        if (m_streamedDataList.Count > 0)
        {
            if (IsLocal == false)
            {
                byte[] firstBytesInList = m_streamedDataList[0];
                if (firstBytesInList != null)
                {
                    ApplyStreamData(firstBytesInList);
                }
                m_streamedDataList.RemoveAt(0);
            }
        }
    }

    ulong GetUserIdFromPhotonInstantiationData()
    {
        PhotonView photonView = GetComponent<PhotonView>();
        object[] instantiationData = photonView.InstantiationData;
        Int64 data_as_int = (Int64)instantiationData[0];
        return Convert.ToUInt64(data_as_int);
    }
}
Meta Avatars are nearly useless, and there is no support or documentation.
My company has been trying to implement Meta Avatars in a game we are developing. As many VR games do, this involves holding many different items in VR with different hand poses. The only example of a custom hand pose is primitive at best: the custom pose is 24 transforms with a Skeleton Renderer script. There is no way to model poses accurately, and you can't even see what a pose looks like with a skin until you run it; move a transform the wrong way and your fingers are a tangled-up mess. Meta has spent literally billions of dollars on software and game development for the Quest, including the Avatar SDK, so it absolutely baffles me that there is no available support for it. No good examples, and the documentation is extremely lacking. At least give me a hand model, or something that can be used to model a hand other than a pile of bones (vectors). The Oculus Integration hands look similar, but the models are not compatible. Please, someone tell me I am wrong and I have just been looking in all the wrong places. Tell me there is some actual documentation besides https://developer.oculus.com/documentation/unity/meta-avatars-custom-hand-poses/ - those docs make it sound like you can just add some joint types to any old hand model and it will work fine. Please show me one that will, because none have worked for us. Every time I search for solutions, all I can find is other people having the same issues. I am about to give up on using Meta Avatars, because if we can't make the game feel good with them, I'd rather use another avatar solution, or even no avatar at all.
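For what it's worth, one possible authoring aid is to pose a hand you can actually see (any skinned hand model with sensibly named finger bones) and then copy its local rotations onto the SDK's pose transforms by name. The sketch below only illustrates that idea and assumes the two hierarchies share bone names; in practice a name-mapping table between the rigs would probably be needed.

using System.Collections.Generic;
using UnityEngine;

// Sketch of an authoring aid: copy local rotations from a posed, visible reference hand
// onto the custom hand-pose transform hierarchy by matching transform names. Assumes the
// two hierarchies use the same bone names, which is not guaranteed across rigs.
public class CopyHandPose : MonoBehaviour
{
    [SerializeField] Transform referenceWrist; // the hand you can actually see while posing
    [SerializeField] Transform targetWrist;    // the SDK's custom hand-pose root

    [ContextMenu("Copy Pose")]
    void CopyPose()
    {
        var referenceBones = new Dictionary<string, Transform>();
        foreach (var bone in referenceWrist.GetComponentsInChildren<Transform>())
            referenceBones[bone.name] = bone;

        foreach (var bone in targetWrist.GetComponentsInChildren<Transform>())
        {
            if (referenceBones.TryGetValue(bone.name, out var source))
                bone.localRotation = source.localRotation;
        }
    }
}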
How to connect a Wrist UI to a Meta Avatar?
I have a wrist UI that is currently attached to LeftHandAnchor. Unfortunately, with this approach I have to use fixed offsets and sizes. How can I connect it to the Meta Avatar so that I can place it just above the skin of the wrist, like a smart watch, and even size it according to the wrist diameter?
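A sketch of one approach, assuming your SDK version exposes OvrAvatarEntity.GetSkeletonTransform and the CAPI.ovrAvatar2JointType.LeftHandWrist joint type (the wrist joint also has to be listed in the entity's critical joints for the transform to exist at runtime): follow the avatar's wrist joint instead of LeftHandAnchor, with a small local offset to sit just above the skin.

using Oculus.Avatar2;
using UnityEngine;

// Sketch: keep a wrist UI glued to the avatar's left wrist joint like a smart watch.
// GetSkeletonTransform and the LeftHandWrist joint type are assumptions about the SDK
// version in use; the wrist joint must be in the entity's critical joints list.
public class WristUIAttacher : MonoBehaviour
{
    [SerializeField] OvrAvatarEntity avatar;
    [SerializeField] Transform wristUI;
    [SerializeField] Vector3 localOffset = new Vector3(0f, 0.02f, 0f); // just above the skin

    void LateUpdate()
    {
        Transform wrist = avatar.GetSkeletonTransform(CAPI.ovrAvatar2JointType.LeftHandWrist);
        if (wrist == null) return; // joint not available yet

        wristUI.SetPositionAndRotation(wrist.TransformPoint(localOffset), wrist.rotation);
    }
}

As far as I can tell there is no per-joint thickness API, so sizing to the wrist diameter would have to be approximated, e.g. with a per-avatar scale factor exposed in the inspector.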
"Unknown error" - Meta Avatars not loading in Unity editor
Hey, suddenly, after we already had our Meta Avatars working properly, we get a weird unknown error related to displaying our Meta Avatar. In the function CoreAsyncInitialize() we don't get any response, and when the onComplete callback fires, the data is also unknown. Has anybody had problems displaying the Meta Avatars lately? It doesn't work in our Unity editor. Any answer would be appreciated. Thank you, Mes