Meta Avatars 2 Lipsync - PUN 2 & Photon Voice issue!
Dear Devs, I've been struggling with a problem for a week. I use PUN 2 and Photon Voice to bring Meta Avatars 2 into a multiplayer environment. My issues:

1. When there is no Photon Voice setup in the scene, the Meta Avatars lipsync works perfectly in the Photon multiplayer room.
2. When I add Photon Voice to the prefab and set up the scene with a Photon Voice Network, only the voice comes through; the Meta Avatars lipsync does not work.

I suspect a race condition is happening between these two plugins. Please help me resolve this if anyone has already solved such a problem. This thread can help other devs in the future as well. Thanks!

(Solved)

Using Oculus Lipsync in live mode (OVRLipSync Actor Component)
Hey there! I recently started looking at Oculus Lipsync for Unreal Engine. I downloaded the demo project with the plugin and got it working with 4.27. I've had a look around the demo project and tried to understand how it works. However, I've noticed I can only get the "playback" demo to work, not the "live" demo. I haven't touched anything in the demo project, and the plugin works well with playback audio; I even managed to hook up my own character with no issues at all. I just can't get the live mode to work, which is ultimately what I'd like to be using this for. I wonder whether I need to specify a microphone or something in the project config, but there's absolutely nothing in the documentation, and I assumed it should work out of the box just like the playback demo. One last bit of information for anyone who may be able to help: the documentation states, "When a prediction is ready, the OVRLipSync Actor component will trigger the On Visemes Ready event." I've hooked up a print string and noticed this event never gets called, no matter what I do in live mode. I'd love some insight if anyone can help. Cheers!

OVR Lip Sync with multiple audio sources in Unity
I am trying to set up a Ready Player Me avatar in Unity with Oculus lipsync to animate mouth movement. I have multiple audio sources for multiple audio clips, because I want to be able to manually cycle through the audio clips one after another using the keyboard. However, OVRLipSyncContext seems to only work on one audio source, whether I put the audio sources on multiple game objects or on the same one. I tried adding an OVRLipSyncContext script for each audio source, but again only the first one works (any audio sources after the first play their audio with no animation). Does anybody know a way around this?

What could be causing my canned lipsync to fail?
I'm struggling to get canned lipsync to work in Unity and would welcome any thoughts. I have a character model with an animator, audio source, etc. Its audio source has an OVRLipsyncContextMorphTarget, and the scene has an OVRLipSync helper object. If I add an OVRLipSyncContext component, it works fine, so the morph target etc. is set up correctly. The problem is that I need to use canned visemes for performance reasons. So, I:

1. create a viseme file from an audio clip
2. replace the OVRLipSyncContext component with OVRLipSyncContextCanned, and assign the viseme file from step 1 to its currentSequence field.

But when I play the audio clip, I get no lip movement at all. A couple of things I've tried:

- I've checked the viseme file: it's the right length and has viseme values large enough to produce visible movement.
- I've added logging to OVRLipSyncContextCanned to verify that it is indeed running and copying the frame info in the Update loop (it is).

This all appears to be set up the same as the similar object in the Oculus LipSync demo scene. Can anyone think of any gotchas that I might have missed?

Precomputed/Canned Lip Sync: "With Offline Model" option
When generating a precomputed viseme asset in Unity from an audio clip, there are two menu items under "Oculus/Lip Sync":

- "Generate Lip Sync Assets"
- "Generate Lip Sync Assets With Offline Model"

The documentation only mentions the first of these: https://developer.oculus.com/documentation/unity/audio-ovrlipsync-precomputed-unity/

So, what does the "With Offline Model" part mean? When might (or should) one use it?

Oculus Lipsync on Hololens
Is there a way to make Oculus Lipsync work on HoloLens? When I build the demo project for HoloLens, I get the error "DllNotFoundException: Unable to load DLL 'OVRLipSync': The specified module could not be found." Is there any way to get around this?

Unity Editor Crash Using OVRLipSync On MacOS
I'm hitting a strange issue that I can't explain. We're developing a VR app in Unity, and I recently added OVRLipSync for mouth movements, but I keep getting Unity Editor crashes every time I run our application, even in the sample scene included in the package. It's 100% reproducible. I have noticed, however, that if we create a whole new Unity project (same Unity Editor version), import the OVRLipSync package, and run the sample scene, it works fine (obviously after allowing permissions to the bundle file via System Preferences). But the crashes are consistent in our own project that we're trying to add OVRLipSync to, so something is definitely amiss in my project specifically. It's also worth noting that it only crashes on macOS, not Windows. I only have one macOS device, so it could be specific to my MacBook, but I'm unable to confirm that. At first I thought macOS might not have permission to load the OVRLipSync.bundle file, but I've ruled this out: I re-imported the bundle file and allowed it via the System Preferences -> Allow Anyway option. If anyone is able to help, it'd be a huge relief, as this is causing us a lot of issues at the moment: we're having to add defines in our code to ensure nothing involving the lipsync is used when we're developing on macOS.
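The define-based workaround mentioned above could look roughly like the following. This is a minimal sketch, not a confirmed fix: OVRLipSyncContext is the component name from the Oculus package, while the wrapper class name here is a hypothetical one of our own.

```csharp
using UnityEngine;

// Minimal sketch (hypothetical wrapper): disable OVRLipSync components
// when running in the macOS editor, where the native OVRLipSync.bundle
// fails to load and crashes the Editor. Mouth animation is simply lost
// in that configuration, but the project no longer touches the plugin.
public class SafeLipSync : MonoBehaviour
{
    void Awake()
    {
#if UNITY_EDITOR_OSX
        // UNITY_EDITOR_OSX is defined only in the Unity Editor on macOS,
        // so builds and the Windows editor keep lipsync enabled.
        foreach (var context in GetComponentsInChildren<OVRLipSyncContext>())
        {
            context.enabled = false;
        }
#endif
    }
}
```

Note that, judging by the log below, it is the OVRLipSync helper object's Awake that actually loads the native library, so that object would presumably need the same guard as well.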
Here's my Editor.log file:

OvrLipSync Awake: Queried SampleRate: 44100 BufferSize: 512
UnityEngine.StackTraceUtility:ExtractStackTrace () (at /Users/bokken/buildslave/unity/build/Runtime/Export/Scripting/StackTrace.cs:37)
UnityEngine.DebugLogHandler:LogFormat (UnityEngine.LogType,UnityEngine.Object,string,object[])
UnityEngine.Logger:Log (UnityEngine.LogType,object)
UnityEngine.Debug:LogWarning (object)
OVRLipSync:Initialize () (at Assets/Oculus/LipSync/Scripts/OVRLipSync.cs:265)
OVRLipSync:Awake () (at Assets/Oculus/LipSync/Scripts/OVRLipSync.cs:217)
(Filename: Assets/Oculus/LipSync/Scripts/OVRLipSync.cs Line: 265)
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle.dylib
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle.so
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle
E1115 10:21:07.861620 399812096 init_intrinsics_check.cc:44] CPU feature avx2 is present on your machine, but the Caffe2 binary is not compiled with it. It means you may not get the full speed of your CPU.
E1115 10:21:07.862260 399812096 init_intrinsics_check.cc:44] CPU feature fma is present on your machine, but the Caffe2 binary is not compiled with it. It means you may not get the full speed of your CPU.
Loaded scene 'Temp/__Backupscenes/0.backup'
Deserialize: 64.066 ms
Integration: 493.917 ms
Integration of assets: 0.970 ms
Thread Wait Time: -0.236 ms
Total Operation Time: 558.717 ms
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle.dylib
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle.so
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle.dylib
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle.so
Fallback handler could not load library /Applications/Unity/Hub/Editor/2020.3.16f1/Unity.app/Contents/Frameworks/Mono/lib/libAssets/Oculus/LipSync/Plugins/MacOSX/OVRLipSync.bundle
*** Aborted at 1668507668 (unix time) try "date -d @1668507668" if you are using GNU date ***
PC: @ 0x193b3b4f3 (unknown)
*** SIGSEGV (@0x0) received by PID 2995 (TID 0x117d4a600) stack trace: ***
@ 0x7ff8074c2dfd _sigtramp

Does licence allow use on other platforms?
When using the Oculus developer tools, what is the licensing model when it comes to using the tools for parts of a game on other platforms (in a multi-platform title)? For example, the lipsync tools are not VR-specific, and while they will be used on an Oculus title, they could also be very useful for the same title on other platforms. Is the lipsync data generated by the Oculus tool allowed to be used on other platforms?