Using Oculus Lipsync in live mode (OVRLipSync Actor Component)
Hey there! I recently started looking at Oculus Lipsync for Unreal Engine. I downloaded the demo project with the plugin and got it working with 4.27. I have looked around the demo project and tried to understand how it works, but I have noticed I can only get the "playback" demo to work, not the "live" demo.

I have not touched anything in the demo project, and the plugin works well with playback audio; I even hooked up my own character with no issues at all. I just can't get the live mode to work, which is ultimately what I would like to use this for. I wonder if perhaps I need to specify a microphone somewhere in the project config, but there's absolutely nothing about it in the documentation, and I assumed live mode would work out of the box just like the playback demo.

One last bit of information for anyone who may be able to help: the documentation states, "When a prediction is ready, the OVRLipSync Actor component will trigger the On Visemes Ready event." I hooked a print string up to that event and noticed it never gets called in live mode, no matter what I do. I'd love some insight if anyone can help. Cheers!

UOVRLipSyncActorComponent::FeedAudio
Does anyone in the world understand how this function works? I am currently working on a virtual assistant which is (eventually) going to synchronise the audio output from Google's Dialogflow with the mouth movements of a hyper-realistic model. I have successfully sent a byte buffer containing packaged mono 16-bit signed integer PCM values from a Python script to UE4 via a TCP socket, and now I am trying to use the FeedAudio function built into the plugin to feed this audio response from Dialogflow through to my model.

There is zero documentation regarding this function, and I am completely unable to get it working myself. Am I supposed to start the OVRLipSync component beforehand? When I do, the lip sync still just seems to work via my microphone rather than from the byte buffer I am feeding it. Can anyone give me any advice or direction regarding this issue? I have been trying to contact Oculus Support, but I have been met with no help, and I even struggle to get a response from them.

Please, someone help me out here, as this is the final piece of the puzzle that is this project. It is the dissertation for the final year of my Computer Science degree, so I obviously want it to function as well as possible.

OVRLipSync Playback Sequence only works once.
It may sound strange, but I can only make the Canned Playback example work by generating a new sequence. If I close the project and open it again, it no longer works, and I must generate a new sequence again to make it work. Could this be something on my side, or is it a bug anyone else has noticed?
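For reference, the FeedAudio post above describes packaging mono 16-bit signed integer PCM values in a Python script and sending them to UE4 over a TCP socket. Below is a minimal sketch of what that sending side might look like. The function names (`pack_pcm16`, `send_pcm`), the chunk size, and the 4-byte length-prefix framing are all assumptions for illustration; the original post does not specify how the bytes are framed on the wire.

```python
import socket
import struct

def pack_pcm16(samples):
    """Pack mono 16-bit signed integer PCM samples into little-endian
    bytes, the same sample layout a 16-bit WAV data chunk uses."""
    return struct.pack("<%dh" % len(samples), *samples)

def send_pcm(sock, samples, chunk_size=1024):
    """Send PCM samples over an already-connected TCP socket.

    Each chunk is prefixed with its payload length as a 4-byte
    little-endian unsigned int so the receiving side can reassemble
    frames. This framing is an assumption, not something the plugin
    or the post defines.
    """
    for start in range(0, len(samples), chunk_size):
        payload = pack_pcm16(samples[start:start + chunk_size])
        sock.sendall(struct.pack("<I", len(payload)) + payload)
```

On the UE4 side, the receiver would read the 4-byte length, read that many bytes, and convert them into whatever sample format `FeedAudio` expects before calling it. Since `FeedAudio` is undocumented, the plugin's component header source is the only reliable place to confirm the expected sample type and whether microphone capture must be stopped first.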