Meta Avatar SDK Unity Mac Support
Hello, I am building a multiplayer app on Meta Quest 3 and using Meta Avatars 2 to show the avatars. Can synced avatars also be displayed in a Mac build? I want to make two different apps, one for Quest and the other running on Mac. Is avatar syncing possible on both platforms? Thanks!

Passthrough doesn't work on client in multiplayer (made with Mirror in Unity)
Passthrough works fine for the host player and shows them their room, but when connecting as a client, the client's room is black. I am using Mirror in Unity 2021.2.7. What might be the source of the problem? Is it possible to connect to someone in multiplayer and use passthrough at all?

Meta Avatars Loading Black and White without any textures
I am trying to load Meta Avatars as local and remote players in a multiplayer environment established with Photon. I am able to load the Meta Avatars as local and remote players in one room, but my Meta Avatars are displayed in black and white (i.e. without any shaders or textures). I am trying to instantiate the Meta Avatar as the player when a player joins the room, using the user ID. Below is my code:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Oculus.Avatar2;
using Oculus.Platform;
using Photon.Pun;
using System;

public class RemotePlayer : OvrAvatarEntity
{
    [SerializeField] int m_avatarToUseInZipFolder = 2;
    PhotonView m_photonView;
    List<byte[]> m_streamedDataList = new List<byte[]>();
    int m_maxBytesToLog = 15;
    [SerializeField] ulong m_instantiationData;
    float m_cycleStartTime = 0;
    float m_intervalToSendData = 0.08f;

    protected override void Awake()
    {
        ConfigureAvatarEntity();
        base.Awake();
    }

    private void Start()
    {
        m_instantiationData = GetUserIdFromPhotonInstantiationData();
        _userId = m_instantiationData;
        StartCoroutine(TryToLoadUser());
    }

    void ConfigureAvatarEntity()
    {
        m_photonView = GetComponent<PhotonView>();
        if (m_photonView.IsMine)
        {
            SetIsLocal(true);
            _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Default;

            // setting body tracking input
            SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
            SetBodyTracking(sampleInputManager);

            // setting lip sync input
            OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
            SetLipSync(lipSyncInput);

            // setting face pose driver
            SampleFacePoseBehavior sampleFacePoseBehaviour = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
            SetFacePoseProvider(sampleFacePoseBehaviour);

            // setting eye pose driver
            SampleEyePoseBehavior sampleEyePoseBehavior = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
            SetEyePoseProvider(sampleEyePoseBehavior);

            gameObject.name = "MyAvatar";
        }
        else
        {
            SetIsLocal(false);
            _creationInfo.features = Oculus.Avatar2.CAPI.ovrAvatar2EntityFeatures.Preset_Remote;

            // setting body tracking input
            SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
            SetBodyTracking(sampleInputManager);

            // setting lip sync input
            OvrAvatarLipSyncContext lipSyncInput = GameObject.FindObjectOfType<OvrAvatarLipSyncContext>();
            SetLipSync(lipSyncInput);

            // setting face pose driver
            SampleFacePoseBehavior sampleFacePoseBehaviour = OvrAvatarManager.Instance.gameObject.GetComponent<SampleFacePoseBehavior>();
            SetFacePoseProvider(sampleFacePoseBehaviour);

            // setting eye pose driver
            SampleEyePoseBehavior sampleEyePoseBehavior = OvrAvatarManager.Instance.gameObject.GetComponent<SampleEyePoseBehavior>();
            SetEyePoseProvider(sampleEyePoseBehavior);

            gameObject.name = "OtherAvatar";
        }
    }

    IEnumerator TryToLoadUser()
    {
        var hasAvatarRequest = OvrAvatarManager.Instance.UserHasAvatarAsync(_userId);
        while (hasAvatarRequest.IsCompleted == false)
        {
            yield return null;
        }
        LoadUser();
    }

    private void LateUpdate()
    {
        float elapsedTime = Time.time - m_cycleStartTime;
        if (elapsedTime > m_intervalToSendData)
        {
            RecordAndSendStreamDataIfMine();
            m_cycleStartTime = Time.time;
        }
    }

    void RecordAndSendStreamDataIfMine()
    {
        if (m_photonView.IsMine)
        {
            byte[] bytes = RecordStreamData(activeStreamLod);
            m_photonView.RPC("ReceiveStreamData", RpcTarget.Others, bytes);
        }
    }

    [PunRPC]
    public void ReceiveStreamData(byte[] bytes)
    {
        m_streamedDataList.Add(bytes);
    }

    void LogFirstFewBytesOf(byte[] bytes)
    {
        for (int i = 0; i < m_maxBytesToLog; i++)
        {
            // computed but never logged anywhere
            string bytesString = Convert.ToString(bytes[i], 2).PadLeft(8, '0');
        }
    }

    private void Update()
    {
        if (m_streamedDataList.Count > 0)
        {
            if (IsLocal == false)
            {
                byte[] firstBytesInList = m_streamedDataList[0];
                if (firstBytesInList != null)
                {
                    ApplyStreamData(firstBytesInList);
                }
                m_streamedDataList.RemoveAt(0);
            }
        }
    }

    ulong GetUserIdFromPhotonInstantiationData()
    {
        PhotonView photonView = GetComponent<PhotonView>();
        object[] instantiationData = photonView.InstantiationData;
        Int64 data_as_int = (Int64)instantiationData[0];
        return Convert.ToUInt64(data_as_int);
    }
}
```

Volume bar is shown each time you connect to an audio channel
Hi! Our team is working on a multiplayer app for Oculus Quest 2. The application has several rooms in which users communicate with each other (we use Agora Voice to connect users). Each room has its own audio channel. Connection to the audio channel occurs at the moment when the user enters the scene. After the latest Oculus Quest 2 update, a volume bar began to appear every time the user changed the audio channel. This greatly degrades the user experience. Tell me, please, is there any way to remove the volume bar when changing the audio channel? Will this problem be solved with the next update of the headset? I would be very grateful for your help! Unity version: 2021.3.16f1

Using Meta Avatar with Normcore
Hello, I'm currently trying to use Normcore for a multiplayer game with the Meta Avatar SDK. Is it possible to network my Meta Avatars with eye and face tracking using Normcore (I'm using a Meta Quest Pro)? Or should I use a different solution like Photon instead?

Invite to app not working for some users & potentially breaking Oculus social platform
I am currently trying to implement the invite-to-app / destinations / group presence features in a small test app for R&D purposes. At the moment, the app is able to set a user's group presence and display the group presence to players launching the app through an invite. For a couple of test users, this app has worked correctly, allowing them to set a presence and invite each other without issue. However, for me and one other user, since using the app we have experienced various issues with the Oculus social platform. For me, my Oculus account is no longer receiving invite notifications from any app that implements the Oculus invite system, not just my own test app. I have been unable to rectify this issue even with a full factory reset of the device. For the other user, their online status on the friends list seemed to be displayed as offline to other users despite them being online, with any invites or messages sent by them not appearing to other users. This was rectified by them starting a party within the social window in the headset, which set their status back to online and fixed the issue. I am not sure whether a potentially incorrect implementation of the invite system in my test app has caused these issues, or if it's a coincidence and the issue lies with the Oculus API as a whole, but in either case this seems like something fairly urgent that needs to be raised and addressed. If anyone has any knowledge or advice on these issues it would be greatly appreciated; I'm currently at a loss as to what's going on here.

Help with integrating party travel to app
Hi! I'm developing an app that needs to have the party travel feature integrated. I know that Oculus Party works within Horizon, but there is another party feature once you log in to the Oculus, and I need to know how to make that work with my app, which is being developed in Unity. Does anyone have any experience?

Data Streaming from Unity VR app to mobile (iOS/android/web) client
Hi there, I'm currently working on a VR app for a client, which will run on the Oculus Quest/Quest 2. One of the base requirements is that the VR app can be interacted with from an external client application through two-way communication. The mobile client should be able to trigger certain events within the Quest app - for example, pause/continue/next scene/trigger sound, etc. The mobile client should also be able to receive data from the Quest app - things such as the user's performance metrics in a certain part of the experience. In my initial research, I've stumbled upon several options: UDP/threading/OSC/PUN/Mirror/Normcore/WebRTC, and I'm feeling slightly overwhelmed (I have no solid networking experience). If possible I would like to avoid some of the heavier networking frameworks, as I want to avoid having to refactor all my code to accommodate them. There isn't necessarily a need for full synchronization of game state like there would be in a real-time multiplayer game, only small custom data packets. Any guidance would be massively appreciated! I'm way in over my head at the moment. Thanks in advance. Pat
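For the "small custom data packets, no heavy framework" case described above, plain UDP via .NET's `UdpClient` is one lightweight option that works inside Unity without refactoring game code around a networking framework. Below is a minimal sketch, not from this thread: all class and method names (`EventPacket`, `QuestEventSender`) are illustrative, the 1-byte-id-plus-UTF-8-payload format is an assumption, and UDP delivery is fire-and-forget, so critical events would need acks or a TCP/WebSocket channel instead.

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Hypothetical minimal packet format: [1 byte event id][UTF-8 payload].
public static class EventPacket
{
    public static byte[] Encode(byte eventId, string payload)
    {
        byte[] body = Encoding.UTF8.GetBytes(payload ?? "");
        byte[] packet = new byte[1 + body.Length];
        packet[0] = eventId;
        Buffer.BlockCopy(body, 0, packet, 1, body.Length);
        return packet;
    }

    public static (byte eventId, string payload) Decode(byte[] packet)
    {
        return (packet[0], Encoding.UTF8.GetString(packet, 1, packet.Length - 1));
    }
}

// Illustrative sender living on the Quest side; the mobile client would run
// the mirror image: a UdpClient bound to the port, calling Receive + Decode.
public class QuestEventSender
{
    private readonly UdpClient _udp = new UdpClient();
    private readonly IPEndPoint _target;

    public QuestEventSender(string clientIp, int port)
    {
        _target = new IPEndPoint(IPAddress.Parse(clientIp), port);
    }

    // Fire-and-forget: fine for frequent, non-critical events (e.g. metrics);
    // add an ack/retry layer on top if a packet must not be lost.
    public void Send(byte eventId, string payload)
    {
        byte[] packet = EventPacket.Encode(eventId, payload);
        _udp.Send(packet, packet.Length, _target);
    }
}
```

Calling `new QuestEventSender("192.168.1.42", 9000).Send(1, "pause")` from the Quest app would push a 6-byte datagram to the mobile client; the same `EventPacket` class can be reused on both ends, which keeps the protocol in one place.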