Using Oculus Lipsync in live mode (OVRLipSync Actor Component)
Hey there! I recently started looking at Oculus Lipsync for Unreal Engine. I downloaded the demo project with the plugin and got it working in 4.27, and I've had a look around the project to understand how it works. However, I can only get the "playback" demo to work, not the "live" demo. I haven't touched anything in the demo project, and the plugin works well with playback audio; I even hooked up my own character with no issues at all. I just can't get live mode to work, which is ultimately what I'd like to use this for. I'm wondering if perhaps I need to specify a microphone in the project config, but there's absolutely nothing about it in the documentation, and I assumed it would work out of the box just like the playback demo.

One last bit of information for anyone who may be able to help: the documentation states, "When a prediction is ready, the OVRLipSync Actor component will trigger the On Visemes Ready event." I've hooked a print string up to that event, and it never fires in live mode no matter what I do. I'd love some insight if anyone can help. Cheers!
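Not an official fix, but one common cause of dead live capture in UE projects is the engine's voice capture module being disabled, since a live lipsync component typically reads from UE's voice capture rather than opening the microphone directly. Whether this is actually the root cause in the demo project is an assumption on my part, but it costs nothing to try enabling voice capture in your project's config (the `[Voice]` section is standard UE):

```ini
; Config/DefaultEngine.ini
; Enable UE's voice capture module so live microphone input is available.
[Voice]
bEnabled=true
```

If the live demo starts firing On Visemes Ready after this, the plugin was simply never receiving any microphone samples.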
Export Lipsync Blendshape Animation out of UE4?
Hi, I tried to export an Oculus Lipsync animation from UE4.24. With the Sequence Recorder, it seems that blendshapes (morph target curves) are not supported. I want to use the animation later in Houdini, so any data format would be fine. How can I export lipsync data from UE4? Any help would be much appreciated!
Lipsync From Other Means
Hello there, I am new to the forum and currently working on my Final Year Project in college. I am building a virtual assistant that maps the output of a dialog system to the mouth movements of a model I have designed. Currently, the model performs lipsync from my microphone input in UE4. My question is whether it is possible to drive the lipsync from a different input. For example, can I stream a .wav file into Unreal Engine and use it as the voice input in real time? Or is it possible to read an audio file as it is written to disk and use that as the input? I am trying to figure out some way of passing in audio data, potentially from a network source, but definitely from a source other than UE4. Is what I am trying to do possible? Please let me know what you think, as this is the final piece of the puzzle that is this project. Thank you in advance!
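Not a definitive answer, but if you end up feeding file audio in yourself, the first step is getting the raw PCM samples out of the .wav container. Below is a minimal, standalone sketch (my own code, not part of the plugin) that scans RIFF chunks for the "data" chunk, assuming canonical little-endian 16-bit PCM; it scans rather than assuming a fixed 44-byte header because real files often carry extra chunks:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <vector>

// Extract 16-bit PCM samples from a WAV buffer by walking RIFF chunks
// until the "data" chunk is found. Assumes little-endian PCM audio.
std::vector<int16_t> ExtractPcm16(const std::vector<uint8_t>& wav) {
    std::vector<int16_t> samples;
    if (wav.size() < 12 || std::memcmp(wav.data(), "RIFF", 4) != 0 ||
        std::memcmp(wav.data() + 8, "WAVE", 4) != 0)
        return samples;  // not a RIFF/WAVE file

    size_t pos = 12;  // first sub-chunk starts after the RIFF header
    while (pos + 8 <= wav.size()) {
        uint32_t chunkSize;
        std::memcpy(&chunkSize, wav.data() + pos + 4, 4);
        if (std::memcmp(wav.data() + pos, "data", 4) == 0) {
            size_t end =
                std::min(wav.size(), pos + 8 + (size_t)chunkSize);
            for (size_t i = pos + 8; i + 1 < end; i += 2) {
                int16_t s;
                std::memcpy(&s, wav.data() + i, 2);
                samples.push_back(s);
            }
            break;
        }
        pos += 8 + chunkSize + (chunkSize & 1);  // chunks are word-aligned
    }
    return samples;
}
```

From there you would hand the samples to whatever input the lipsync component exposes, in small chunks, to keep it near real-time rather than dumping the whole file at once.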
UOVRLipSyncActorComponent::FeedAudio
Does anyone in the world understand how this function works? I am currently working on a virtual assistant which will (eventually) synchronise the audio output from Google's Dialogflow with the mouth movements of a hyper-realistic model. I have successfully sent a byte buffer containing mono 16-bit signed integer PCM values from a Python script to UE4 via a TCP socket, and now I am trying to use the plugin's built-in FeedAudio function to feed this audio response from Dialogflow through to my model. There is zero documentation for this function, and I am completely unable to get it working myself. Am I supposed to start the OVRLipSync component beforehand? When I do, the lipsync still seems to work via my microphone rather than from the byte buffer I am feeding it. Can anyone give me any advice or direction on this? I have been trying to contact Oculus Support, but have had no help and struggle to even get a response from them. Please, someone help me out here, as this is the final piece of the puzzle that is this project. It is for my dissertation in the final year of my Computer Science degree, so I obviously want it to function as well as possible.
OVRLipSync Playback Sequence only works once.
It may sound strange, but I can only make the Canned Playback example work by generating a new Sequence. If I close the project and open it again, it no longer works, and I must generate a new Sequence to make it work again. Could this be something on my side, or is it a bug that anyone else has noticed?