Enabling Facial Expressions on Networked Avatars
Hi all! I'm looking to enable face tracking for a multiplayer game I'm currently developing, where each avatar is network-instantiated. The current documentation explains how to set it up for local avatars. Is there any documentation available for network-instantiated avatars?

Is there an update on Quest Pro Tongue Tracking?
This question was asked some time ago, and a Community Manager answered with "We are looking into it". I wanted to ask whether there is any update on tongue tracking for the Quest Pro. So far I'm rather happy with the face tracking capabilities, but one thing that's definitely missing is tongue tracking. It's one of the first things people ask about when they notice someone is using face tracking with a Quest Pro. If it were added, this headset would basically be perfect in my eyes. Does anyone else share this sentiment?

Additional Tongue Tracking BlendShapes
There are currently seven blendshapes for tongue tracking, which primarily allow only inward/outward positioning. https://developer.oculus.com/documentation/unity/move-face-tracking/#face-blendshapes A further implementation allowing simple lateral left/right positioning would greatly expand the feature's potential applications. Additional development could add both vertical and lateral manipulation, as well as morphing, bringing it in line with competing HMD face tracking implementations.

Network face tracking Meta avatars
I have a working Unity app using Meta Avatars. Lipsync works well: other networked players see my face moving, so the data is being transmitted correctly. But when I introduced face tracking, it only works locally; nobody sees my face tracking except me. I'm using the pair of sample avatar entities (local and remote) on my network player prefab. Everything works except sending the face tracking data. Can you help me?
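The usual pattern for the question above: the owning client samples the blendshape weights from the Movement SDK's `OVRFaceExpressions` each frame and replicates them to remote peers, which apply them to the remote avatar's mesh (remote instances have no tracker of their own, which is why face tracking appears to work "only locally"). A minimal sketch, assuming Unity Netcode for GameObjects; `FaceSync`, the plain `float[]` payload, and the 1:1 blendshape index mapping are illustrative assumptions, not part of the Meta SDK, and SDK member names should be checked against your SDK version:

```csharp
using Unity.Netcode;
using UnityEngine;

// Illustrative sketch: replicate face-tracking weights from the owning
// client to everyone else. OVRFaceExpressions is the Movement SDK tracker;
// everything else here is hypothetical glue code.
public class FaceSync : NetworkBehaviour
{
    [SerializeField] private OVRFaceExpressions faceExpressions; // valid on the owner only
    [SerializeField] private SkinnedMeshRenderer faceMesh;       // avatar face mesh

    private float[] _weights;

    private void Update()
    {
        // Only the owning client has tracking data to send.
        if (!IsOwner || faceExpressions == null || !faceExpressions.FaceTrackingEnabled)
            return;

        // OVRFaceExpressions is an IReadOnlyList<float>; copy via the indexer
        // into a reused buffer instead of allocating with ToArray() each frame.
        int count = faceExpressions.Count;
        if (_weights == null || _weights.Length != count)
            _weights = new float[count];
        for (int i = 0; i < count; i++)
            _weights[i] = faceExpressions[i];

        SubmitWeightsServerRpc(_weights);
    }

    [ServerRpc]
    private void SubmitWeightsServerRpc(float[] weights)
    {
        ApplyWeightsClientRpc(weights); // fan out to all clients
    }

    [ClientRpc]
    private void ApplyWeightsClientRpc(float[] weights)
    {
        if (IsOwner) return; // the owner already drives its own mesh locally
        // Map tracked weights onto the remote avatar's blendshapes.
        // A 1:1 index mapping is assumed here; real rigs need a lookup table.
        int n = Mathf.Min(weights.Length, faceMesh.sharedMesh.blendShapeCount);
        for (int i = 0; i < n; i++)
            faceMesh.SetBlendShapeWeight(i, weights[i] * 100f); // SDK weights are 0..1
    }
}
```

Sending ~70 floats every frame via RPC is fine for a sketch but wasteful in production; quantizing the weights and sending at a lower tick rate with interpolation is the usual refinement.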
Failed to Start Face Tracking Error Message

Hello! I am developing a project in Unity that uses the face tracking, hand tracking, and eye tracking features of the Meta Quest Pro. I followed the tutorial on Meta's developer website, imported the Movement SDK, and enabled the necessary features under the OVRCameraRig (Quest Features and Permission Requests on Startup). However, no matter what I do, the only thing working in the Aura sample project is hand tracking. The avatar's face and eyes do not move at all, and the Unity console shows errors saying "failed to start face tracking" and "failed to start hand tracking." I am using Unity 2022.3.13f1. I also pulled another sample project from GitHub that someone developed to test face and eye tracking, and I am seeing the same issues there. I researched prior instances of this happening to other developers and followed the steps that resolved it for them, such as enabling the public beta channel of the Oculus app and making sure the Quest Pro was fully up to date, but nothing has resolved the issue. I have also enabled the face tracking and eye tracking checkboxes under the beta features section of the Oculus PC app. Furthermore, face and eye tracking appear to work in the home environment of my Quest Pro, as the avatar there reflects my face and eye movements. I would appreciate any and all help in resolving this issue. Thank you!
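For "failed to start face tracking" errors like the one above, one thing worth checking beyond the OVRCameraRig checkboxes is whether the tracking permissions and features actually made it into the built APK's merged AndroidManifest.xml (Unity's manifest merging can silently drop them). A snippet to look for, assuming the standard Meta permission and feature names; verify the exact strings against your SDK version's generated manifest:

```xml
<!-- Runtime permissions (granted via OVRCameraRig's
     "Permission Requests On Startup" or requested manually) -->
<uses-permission android:name="com.oculus.permission.FACE_TRACKING" />
<uses-permission android:name="com.oculus.permission.EYE_TRACKING" />

<!-- Feature declarations so the OS exposes the trackers to the app -->
<uses-feature android:name="oculus.software.face_tracking" android:required="false" />
<uses-feature android:name="oculus.software.eye_tracking" android:required="false" />
```

If the permissions are present but tracking still fails, the remaining usual suspects are the on-headset face/eye tracking toggles and, for Link, the beta checkboxes in the Oculus PC app, both of which the post already covers.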
Face Tracking - FaceExpressions Data

I use Unity 2021.3.20f1 and Oculus Integration v54. I can run facial tracking, and Aura mirrors my facial expressions. For research purposes, I need to store the facial expression weights, but when I store them via OVRFaceExpressions.ToArray(), the game freezes. Is there any way to store this information without the freeze?
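Two likely causes of the stall described above: `ToArray()` allocates a fresh array every frame (GC pressure), and writing to disk on the main thread every frame blocks rendering. A sketch that avoids both by indexing the tracker directly into a `StringBuilder` and flushing batched CSV rows only occasionally; `FaceWeightLogger` is a hypothetical helper, not an SDK class, and SDK member names should be checked against your Oculus Integration version:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;
using UnityEngine;

// Illustrative sketch: record per-frame expression weights without
// per-frame allocation or per-frame disk I/O.
public class FaceWeightLogger : MonoBehaviour
{
    [SerializeField] private OVRFaceExpressions faceExpressions;

    private readonly List<string> _rows = new List<string>(1024);
    private readonly StringBuilder _sb = new StringBuilder(2048);
    private string _path;

    private void Start() =>
        _path = Path.Combine(Application.persistentDataPath, "face_weights.csv");

    private void LateUpdate()
    {
        if (faceExpressions == null || !faceExpressions.FaceTrackingEnabled)
            return;

        // Build one CSV row: timestamp, then every expression weight.
        // Indexing OVRFaceExpressions (an IReadOnlyList<float>) avoids the
        // per-frame array allocation that ToArray() incurs.
        _sb.Clear().Append(Time.time.ToString("F4"));
        for (int i = 0; i < faceExpressions.Count; i++)
            _sb.Append(',').Append(faceExpressions[i].ToString("F4"));
        _rows.Add(_sb.ToString());

        // Flush in batches (~10 s at 60 fps), not every frame.
        if (_rows.Count >= 600)
            Flush();
    }

    private void Flush()
    {
        File.AppendAllLines(_path, _rows);
        _rows.Clear();
    }

    private void OnDestroy() => Flush(); // don't lose the tail of the recording
}
```

For fully hitch-free capture, the `Flush()` call could additionally be moved onto a background thread or `Task`, since `File.AppendAllLines` still blocks for the duration of the write.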
About Quest Pro: how to use face capture data in Unreal Engine 5
The Quest Pro can capture both eye and facial expressions, but I couldn't find in the documentation how to access this data in Unreal Engine. How can I get this data and the corresponding expressions for development in Unreal Engine? I could use some help.

Passthrough Shadows Discussion
I'm currently looking into the best way to implement shadows for virtual characters. I've seen a few promising approaches using simple drop-shadow cards, but what if I want true dynamic shadows? How do I get them to cast with no floor mesh, since I want Passthrough? Or even dynamic soft shadows? I came across this approach for dynamic soft shadows based on collisions, which looks pretty good: https://www.youtube.com/watch?v=_oed7fqqzaY But how would I implement that with Passthrough? Here is a sneak peek at the current project; the characters still don't feel grounded yet without shadows: https://twitter.com/colin_a_brady/status/1627061040939941888?s=20 I'm just trying to find the best and most realistic approach. Please share your thoughts on how shadows for this many characters should best work in Passthrough!
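For many characters in Passthrough, the cheapest grounding trick mentioned above (drop-shadow cards) can be made dynamic: raycast down from each character to a floor collider (e.g. one reconstructed from Scene API anchors or placed manually) and keep a semi-transparent dark quad glued to the hit point, scaled and faded by height. A sketch using only standard Unity APIs; `BlobShadow` and its parameters are illustrative, and it assumes a quad with a soft radial-gradient transparent material:

```csharp
using UnityEngine;

// Illustrative blob-shadow sketch: keeps a dark gradient quad on whatever
// collider lies beneath the character, fading it out with height so the
// character reads as grounded against Passthrough.
public class BlobShadow : MonoBehaviour
{
    [SerializeField] private Transform shadowQuad;   // soft radial-gradient quad
    [SerializeField] private float maxHeight = 2f;   // fade to zero above this
    [SerializeField] private LayerMask floorMask;    // layer of the floor collider

    private void LateUpdate()
    {
        if (Physics.Raycast(transform.position, Vector3.down,
                            out RaycastHit hit, maxHeight, floorMask))
        {
            shadowQuad.gameObject.SetActive(true);
            // Hover just above the surface to avoid z-fighting.
            shadowQuad.position = hit.point + hit.normal * 0.01f;
            // Orient the quad to lie on the surface (assumes its visible
            // face is its -Z side, as with Unity's built-in Quad).
            shadowQuad.rotation = Quaternion.LookRotation(-hit.normal);

            // Smaller and fainter the higher the character is.
            float t = 1f - hit.distance / maxHeight;
            shadowQuad.localScale = Vector3.one * Mathf.Lerp(0.2f, 0.5f, t);
            var mat = shadowQuad.GetComponent<Renderer>().material;
            Color c = mat.color;
            c.a = t * 0.6f;
            mat.color = c;
        }
        else
        {
            shadowQuad.gameObject.SetActive(false); // nothing below to shadow
        }
    }
}
```

One raycast per character per frame scales to dozens of characters easily, which is the main advantage over true dynamic shadows, at the cost of losing shape and directionality.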
So I'm using the official MetaXR plugin that just released, with UE 5.1. I've enabled all the facial and eye tracking options in the Project/Plugin settings, but when I put the face component in my Blueprint I'm not getting any expression values at all. Are there specific steps I need to take to get the engine to recognize the headset? Playing in Editor works in VR; I'm just not getting any facial data coming through. And yes, I have eye and facial tracking enabled on the headset as well. Please help!