01-06-2022 01:07 AM - edited 01-06-2022 01:08 AM
I've done some rough setup to get a Meta Avatars 2 avatar working with Photon in a multiplayer game.
In my demo, I can see the remote player updating and hear the audio from their mic. However, the remote player's lips move noticeably before their audio comes through the speakers.
My fix idea was to have the remote avatar lip sync to the audio as it arrives (from Photon Voice) rather than having the avatar send the lip sync data along with the rest of the movement. However, the remote avatar seems to ignore any data from its OVRAvatarLipSyncContext. It's possible the lip sync data generated from the received audio conflicts with the remote tracking data.
This also seems to go explicitly against the docs:
https://developer.oculus.com/documentation/unity/meta-avatars-networking/
"The entity should not have any component references for Body Tracking or Lip Sync. This data will come from network packets."
Does anyone have any more insight on what's happening, a fix, or an alternative solution?
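For reference, here's roughly how I'm feeding the received audio into the lip sync context (just a sketch; OvrAvatarLipSyncContext, ProcessAudioSamples, and SetLipSync are from the Avatar2 SDK as I understand it, and the component/field names are my own):

using Oculus.Avatar2;
using UnityEngine;

// Lives on the same GameObject as the Photon Voice Speaker's AudioSource,
// so OnAudioFilterRead sees the exact samples that are about to be played.
[RequireComponent(typeof(AudioSource))]
public class RemoteVoiceLipSync : MonoBehaviour
{
    [SerializeField] OvrAvatarLipSyncContext m_lipSyncContext;

    // Unity calls this on the audio thread with the buffer being played back,
    // so the generated visemes should line up with what is actually audible.
    void OnAudioFilterRead(float[] data, int channels)
    {
        if (m_lipSyncContext != null)
        {
            m_lipSyncContext.ProcessAudioSamples(data, channels);
        }
    }
}

Then on the remote entity I call SetLipSync(m_lipSyncContext) once it's set up, but as described above, the remote avatar ignores it.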
01-31-2022 12:16 PM
At least it seems you were able to network the data from one avatar to another in Photon. Could you share how you managed it? I've had no luck. Here's my code:
public class GGMetaAvatarEntity : OvrAvatarEntity
{
    [SerializeField] int m_avatarToUseInZipFolder = 2;
    PhotonView m_photonView;
    byte[] m_streamedData;

    private void Start()
    {
        m_photonView = GetComponent<PhotonView>();
        if (m_photonView.IsMine)
        {
            // Local avatar: drive it from headset/controller input.
            SetIsLocal(true);
            _creationInfo.features = CAPI.ovrAvatar2EntityFeatures.Preset_Default;
            SampleInputManager sampleInputManager = OvrAvatarManager.Instance.gameObject.GetComponent<SampleInputManager>();
            SetBodyTracking(sampleInputManager);
            gameObject.name = "MyAvatar";
        }
        else
        {
            // Remote avatar: tracking data should arrive over the network.
            SetIsLocal(false);
            _creationInfo.features = CAPI.ovrAvatar2EntityFeatures.Preset_Remote;
            gameObject.name = "OtherAvatar";
        }

        if (IsLocal)
        {
            string[] zipPaths = new string[] { m_avatarToUseInZipFolder + "_rift.glb" };
            LoadAssetsFromZipSource(zipPaths);
        }
        else
        {
            // Remote loads preset (index + 1), so local and remote avatars look different.
            string[] zipPaths = new string[] { (m_avatarToUseInZipFolder + 1) + "_rift.glb" };
            LoadAssetsFromZipSource(zipPaths);
        }
    }

    private void LateUpdate()
    {
        if (m_photonView.IsMine)
        {
            // Capture this frame's avatar state and push it to all other clients.
            byte[] bytes = RecordStreamData(activeStreamLod);
            RTDebug.Log(gameObject.name + " sending streamed data, bytes: " + bytes.Length);
            m_photonView.RPC("SetStreamData", RpcTarget.Others, bytes);
        }
    }

    [PunRPC]
    public void SetStreamData(byte[] bytes)
    {
        RTDebug.Log(gameObject.name + " receiving streamed data, bytes: " + bytes.Length);
        m_streamedData = bytes;
        ApplyStreamData(m_streamedData);
    }
}
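Two things I'm unsure about with the above: whether firing an RPC with the full byte[] every LateUpdate (instead of throttling to a lower send rate) is a problem, and whether ApplyStreamData does anything useful if it arrives before the remote avatar has finished loading. Any pointers appreciated.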
10-03-2022 12:02 PM
I found a solution and posted it here: https://forums.oculusvr.com/t5/Unity-VR-Development/Meta-Avatars-2-Lipsync-PUN-2-amp-Photon-Voice-is...
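The gist, as a rough outline (the linked post has the working details): keep body tracking coming from the network stream as before, but give the remote avatar its own OvrAvatarLipSyncContext and feed it the audio Photon Voice is actually playing (e.g. via OnAudioFilterRead on the Speaker's AudioSource), so the mouth animates against what you hear rather than against the tracking packets.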
04-02-2024 09:42 PM
I was trying to do the same thing and found your post. Did you ever find a way to animate the remote avatar's lip sync locally?
I'm about ready to resign myself to just sending the lip sync animation across the network, but I fear it will be out of sync with the audio.