Forum Discussion
Fladeboe
7 years ago · Honored Guest
OVR Lip Sync Replication
I'm really enjoying using the OVR lip sync system. It's amazing. But I'm confused: it seems to be made for social presence, yet I cannot get it to work correctly in multiplayer. I have scoured the internet for a template or tutorial on getting it working in multiplayer, but I can't find anything. That leads me to believe it must work in multiplayer out of the box, but it isn't for me.
My specific problem: an online game with a server and one client, both using VOIP and the OVR lip sync component. I'm using the default head that comes with the download, as well as their setup BPs copied and pasted into my VR pawn. VOIP works fine, but when one player talks, the other character's lips move as well, and the facial morphs don't seem to be replicated. So the client never sees the server's lips move when the server talks, and vice versa.
I'm using UE 4.22. I've tried a whole bunch of things with no results. The above is what happens when I plug in their model and BPs with no changes. Has anyone gotten it to work in multiplayer yet? Do I need to create my own morph replication system?
2 Replies
Replies have been turned off for this discussion
Fladeboe · Honored Guest
I got it working by passing the morph targets to the server, then on to the clients. I'm still not sure this is the correct way to do it. It seems expensive, but replication is not set to reliable. It also seems unstable on the server side, as I've had a couple of crashes (though that could be something else entirely). I will update this if I discover anything further. The first BP is the setup; only start the lip sync from a locally controlled pawn. The second BP is how I handled passing the morph values to everyone else. Cheers
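The two Blueprint screenshots aren't reproduced here, but a rough C++ sketch of the approach described above might look like the following. It assumes a hypothetical `ALipSyncPawn` with a skeletal head mesh; `GetCurrentMorphValues()` and `LipSyncMorphNames` stand in for however you read the viseme weights out of your lip sync source and map them onto the head's morph targets. They are not the plugin's actual API.

```cpp
// LipSyncPawn.h -- sketch only: ALipSyncPawn, GetCurrentMorphValues() and
// LipSyncMorphNames are placeholders, not OVRLipSync plugin API.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/SkeletalMeshComponent.h"
#include "LipSyncPawn.generated.h"

UCLASS()
class ALipSyncPawn : public APawn
{
    GENERATED_BODY()

public:
    ALipSyncPawn() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override;

protected:
    // Owning client -> server, every tick, not marked reliable (dropped packets are fine).
    UFUNCTION(Server, Unreliable, WithValidation)
    void ServerSendMorphValues(const TArray<float>& Values);

    // Server -> everyone, also unreliable.
    UFUNCTION(NetMulticast, Unreliable)
    void MulticastApplyMorphValues(const TArray<float>& Values);

    // Placeholder: read the current viseme/morph weights from whatever is
    // driving lip sync on the locally controlled pawn.
    TArray<float> GetCurrentMorphValues() const { return TArray<float>(); }

    UPROPERTY(VisibleAnywhere)
    USkeletalMeshComponent* HeadMesh = nullptr;

    // Morph target names on the head mesh, in the same order as the values.
    TArray<FName> LipSyncMorphNames;
};

// ---- LipSyncPawn.cpp ----

void ALipSyncPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Only the locally controlled pawn runs lip sync and pushes its values;
    // remote copies are driven purely by the multicast below.
    if (IsLocallyControlled())
    {
        ServerSendMorphValues(GetCurrentMorphValues());
    }
}

bool ALipSyncPawn::ServerSendMorphValues_Validate(const TArray<float>& Values)
{
    return true;
}

void ALipSyncPawn::ServerSendMorphValues_Implementation(const TArray<float>& Values)
{
    MulticastApplyMorphValues(Values);
}

void ALipSyncPawn::MulticastApplyMorphValues_Implementation(const TArray<float>& Values)
{
    // Skip the sender: its mesh is already animated locally.
    if (IsLocallyControlled() || !HeadMesh)
    {
        return;
    }

    const int32 Num = FMath::Min(Values.Num(), LipSyncMorphNames.Num());
    for (int32 i = 0; i < Num; ++i)
    {
        HeadMesh->SetMorphTarget(LipSyncMorphNames[i], Values[i]);
    }
}
```

Sending a full float array every tick is the "expensive" part mentioned above; keeping both RPCs unreliable, and only sending when the values actually change, would cut that down.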
Edit: The crash is definitely caused by this method. Working on a fix...
Edit: The crash is caused by using the OVR component in my pawn. When it's placed in the world separate from the pawn there doesn't seem to be a problem, but when it's attached to the pawn there is (memory allocation errors, something about an Opus repacketizer?). I am using the VR Expansion Plugin pawn. I will continue to work on it and update this thread in case anyone has a similar issue.
Fladeboe · Honored Guest
Update 3: I can get it to replicate from server to client, but whenever I send the values from client to server, it crashes. I'm still confused why there is no multiplayer setup information for a plugin that is so obviously made for multiplayer. Or is it only supposed to be used with the Oculus Avatars?
I'm giving up on getting it to work in multiplayer, but it is a fine solution for NPCs; pretty amazing, actually. Below is a quick solution for some kind of mouth movement in multiplayer. It uses the head mesh included in the plugin example and only the laughter morph target. There is no OVR lip sync component, just an audio capture component attached to the pawn, which is probably much better for performance anyway. Hope it helps someone.
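The Blueprint itself isn't shown either, so here is a hedged sketch of the same idea on the hypothetical `ALipSyncPawn` from the earlier snippet: one "Laughter" morph target driven by microphone loudness and replicated with the same unreliable Server RPC / multicast pattern. `GetMicEnvelope()`, `LaughterWeight`, and the RPC names are placeholders; only the Laughter morph target name comes from the plugin's example head.

```cpp
// "Laughter only" fallback: no OVRLipSync component, just mic loudness driving
// a single morph target. GetMicEnvelope() is a placeholder for however you read
// the audio capture component's current amplitude (roughly 0..1).

void ALipSyncPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (!IsLocallyControlled())
    {
        return; // remote copies are driven by the multicast below
    }

    // Map loudness to a 0..1 morph weight and smooth it so the mouth
    // doesn't flicker; the scale and interp speed are just tuning guesses.
    const float Target = FMath::Clamp(GetMicEnvelope() * 4.0f, 0.0f, 1.0f);
    LaughterWeight = FMath::FInterpTo(LaughterWeight, Target, DeltaSeconds, 12.0f);

    HeadMesh->SetMorphTarget(TEXT("Laughter"), LaughterWeight);
    ServerSendLaughter(LaughterWeight);  // declared UFUNCTION(Server, Unreliable, WithValidation)
}

bool ALipSyncPawn::ServerSendLaughter_Validate(float Weight)
{
    return true;
}

void ALipSyncPawn::ServerSendLaughter_Implementation(float Weight)
{
    MulticastApplyLaughter(Weight);      // declared UFUNCTION(NetMulticast, Unreliable)
}

void ALipSyncPawn::MulticastApplyLaughter_Implementation(float Weight)
{
    if (!IsLocallyControlled() && HeadMesh)
    {
        HeadMesh->SetMorphTarget(TEXT("Laughter"), Weight);
    }
}
```

Replicating a single smoothed float per pawn is far cheaper than a full viseme array, which is presumably why this route performs better.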