Meta Quest Pro: How to use Eye Tracking with Oculus Link in Unity
Hello! I have a Meta Quest Pro device and am interested in using the eye tracking feature over Oculus Link on a Windows host machine while developing in Unity. Hand tracking works over Oculus Link, but eye tracking fails to initialize even with permissions enabled. The same scene works as expected when I deploy an Android build directly to the device. When the app starts over Oculus Link, it logs:

"[OVREyeGaze] Failed to start eye tracking."

After adding log statements to the project, I can see that "OVRP_1_78_0.ovrp_StartEyeTracking()" (and therefore "OVRPlugin.StartEyeTracking()") is returning a failure, but I'm not sure how to see the underlying error:

```csharp
if (!OVRPlugin.StartEyeTracking())
{
    Debug.LogWarning($"[{nameof(OVREyeGaze)}] Failed to start eye tracking.");
    return false;
}
```

I've seen other posts showing beta eye tracking support in the Oculus app, but I no longer see those options in the latest version of the app. Is eye tracking over Oculus Link still possible? Are there extra steps required to enable it?

Here are the versions of my software and packages:

- Unity Version: 2021.3.16f1
- Oculus App Version: 53.0.0.98.132
- Windows Version: Windows 11
- Unity Oculus XR Package: 3.2.2
- Unity OpenXR Package: 1.5.3

Thank you!
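One way to narrow down where the failure happens is to ask the runtime whether it considers eye tracking supported at all before calling "StartEyeTracking()". This is a rough diagnostic sketch, assuming the "eyeTrackingSupported" and "eyeTrackingEnabled" properties present in recent versions of the Oculus Integration's OVRPlugin wrapper; verify the names against your SDK version:

```csharp
using UnityEngine;

// Diagnostic sketch (assumed OVRPlugin property names, recent Oculus
// Integration versions): over Link, "supported == false" would point at
// the PC runtime refusing the feature rather than at the scene setup.
public class EyeTrackingDiagnostics : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"eyeTrackingSupported: {OVRPlugin.eyeTrackingSupported}");
        Debug.Log($"eyeTrackingEnabled:   {OVRPlugin.eyeTrackingEnabled}");

        if (!OVRPlugin.eyeTrackingSupported)
        {
            Debug.LogWarning("Runtime reports eye tracking as unsupported; " +
                             "over Link this suggests a PC runtime issue, not a scene issue.");
            return;
        }

        if (!OVRPlugin.StartEyeTracking())
            Debug.LogWarning("StartEyeTracking failed despite support being reported.");
    }
}
```

If "eyeTrackingSupported" is false only when running over Link but true on-device, that would confirm the PC runtime is the component rejecting the feature.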
Meta Quest Pro Eye tracking - reading eye openness value

Hello, I'm interested to know whether there is a way to retrieve an eye openness value, or at least to detect whether the eyes are open or closed. Is there SDK documentation where I can look for such functionality?

Thanks, nir
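The eye gaze API does not expose openness directly, but the face tracking blendshapes include eyes-closed weights. Below is a minimal sketch assuming the Movement SDK's "OVRFaceExpressions" component and its "EyesClosedL"/"EyesClosedR" expression weights (0 = open, 1 = fully closed); confirm the names against your SDK version:

```csharp
using UnityEngine;

// Sketch: read an "eyes closed" signal from the face tracking blendshapes
// rather than the eye gaze API. Component and enum names are assumptions
// based on the Movement SDK's OVRFaceExpressions; verify per SDK version.
public class EyeOpennessReader : MonoBehaviour
{
    [SerializeField] private OVRFaceExpressions faceExpressions;
    [SerializeField] private float closedThreshold = 0.75f; // tune per user

    void Update()
    {
        if (faceExpressions == null || !faceExpressions.FaceTrackingEnabled)
            return;

        if (faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.EyesClosedL, out float leftClosed) &&
            faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.EyesClosedR, out float rightClosed))
        {
            bool eyesClosed = leftClosed > closedThreshold
                              && rightClosed > closedThreshold;
            Debug.Log($"L:{leftClosed:F2} R:{rightClosed:F2} closed:{eyesClosed}");
        }
    }
}
```

This requires face tracking (not just eye tracking) to be enabled and permitted, since the openness signal comes from the face expression pipeline.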
Regarding the Confidence property of OVREyeGaze

Hello, I am currently trying out eye tracking on the Meta Quest Pro. I want to exclude data captured while my eyes are closed, so I thought of using the Confidence property of OVREyeGaze, but the value doesn't change at all. I expected it to vary with tracking quality; is that not the case? I added Debug.Log to OVREyeGaze, and the value in the log remained constant:

```csharp
private void Update()
{
    if (!OVRPlugin.GetEyeGazesState(OVRPlugin.Step.Render, -1, ref _currentEyeGazesState))
        return;

    var eyeGaze = _currentEyeGazesState.EyeGazes[(int)Eye];
    if (!eyeGaze.IsValid)
        return;

    Confidence = eyeGaze.Confidence;

    // Confidence log
    Debug.Log($"Eye Tracking Confidence: {Confidence}");

    if (Confidence < ConfidenceThreshold)
        return;

    //~~~~~
}
```
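Rather than editing the SDK script, the same check can be done from a separate component that observes the component's public Confidence property and tracks the range of values seen. This is a sketch assuming OVREyeGaze's public "Confidence" property and an "EyeTrackingEnabled" flag as found in recent Oculus Integration versions:

```csharp
using UnityEngine;

// Sketch: probe OVREyeGaze.Confidence from outside the SDK script and
// record the observed min/max. Property names are assumptions from
// recent Oculus Integration versions; verify against your SDK.
public class GazeConfidenceProbe : MonoBehaviour
{
    [SerializeField] private OVREyeGaze eyeGaze;
    [SerializeField] private float threshold = 0.5f;

    private float _min = float.MaxValue;
    private float _max = float.MinValue;

    void Update()
    {
        if (eyeGaze == null || !eyeGaze.EyeTrackingEnabled)
            return;

        float c = eyeGaze.Confidence;
        _min = Mathf.Min(_min, c);
        _max = Mathf.Max(_max, c);

        // If min == max even after blinking and looking away, the runtime
        // is reporting a constant confidence, and confidence alone cannot
        // be used to detect closed eyes.
        Debug.Log($"confidence={c:F3} range=[{_min:F3}, {_max:F3}] " +
                  $"aboveThreshold={c >= threshold}");
    }
}
```

If the observed range never widens, a practical alternative for excluding closed-eye data is gating on the face tracking eyes-closed blendshapes instead of gaze confidence.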
Failed to Start Face Tracking Error Message

Hello! I am developing a project in Unity that utilizes the face tracking, hand tracking, and eye tracking features of the Meta Quest Pro. I have followed the tutorial on Meta's developer website. I imported the Movement SDK and enabled the necessary features on the OVRCameraRig (Quest Features and Permission Requests on Startup). However, no matter what I do, the only thing working in the Aura sample project is hand tracking. The avatar's face and eyes do not move at all. In the Unity console I get errors that say "failed to start face tracking" and "failed to start hand tracking." I am using Unity version 2022.3.13f1.

I also pulled another sample project from GitHub that someone developed to test face and eye tracking, and I am seeing the same issues there. I researched prior instances of this happening to other developers and followed the steps that resolved it for them, such as enabling the public beta channel of the Oculus app and ensuring the Quest Pro was fully up to date, but nothing resolves the issue. I have also enabled the face tracking and eye tracking checkboxes under the beta features section of the Oculus PC app. Furthermore, face tracking and eye tracking appear to be working in the home environment of my Quest Pro, as the avatar there reflects my face and eye movements.

I would appreciate any and all help in resolving this issue. Thank you!
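Since tracking works in the headset's home environment but not in the app, one thing worth ruling out is whether the app was actually granted the tracking permissions at runtime. A sketch of that check, using Unity's Android permission API and the "com.oculus.permission.*" permission strings Meta documents for Quest Pro (confirm them against your SDK's AndroidManifest entries):

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Sketch: verify on-device that the face/eye tracking permissions were
// granted. "Failed to start ..." errors are a common symptom of a
// missing runtime permission even when the manifest declares it.
public class TrackingPermissionCheck : MonoBehaviour
{
    private const string FacePermission = "com.oculus.permission.FACE_TRACKING";
    private const string EyePermission = "com.oculus.permission.EYE_TRACKING";

    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        if (!Permission.HasUserAuthorizedPermission(FacePermission) ||
            !Permission.HasUserAuthorizedPermission(EyePermission))
        {
            Debug.LogWarning("Face/eye tracking permission not granted; requesting.");
            Permission.RequestUserPermissions(new[] { FacePermission, EyePermission });
        }
        else
        {
            Debug.Log("Face and eye tracking permissions are granted.");
        }
#else
        Debug.Log("Permission check skipped: not running on-device.");
#endif
    }
}
```

If the permissions report as granted and the errors persist, the failure is more likely in the PC runtime or SDK feature flags than in the Unity project itself.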
Eye Tracked Foveated Rendering doesn't work with URP

I followed the prerequisites mentioned here: https://developer.oculus.com/documentation/unity/unity-eye-tracked-foveated-rendering/ but I couldn't make ETFR work with URP (it seems to work fine with the legacy Built-in Render Pipeline). When building the app I only get a black screen: no Unity logo, no app launch. Is URP not supported yet? I don't see any mention of this anywhere. Has anyone else had this issue?
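One way to distinguish "ETFR unsupported in this configuration" from "URP broken" is to enable ETFR from script with a fallback, so an unsupported configuration degrades instead of producing an unusable frame. This sketch assumes the "eyeTrackedFoveatedRendering*" and "foveatedRenderingLevel" properties added to OVRManager in recent Oculus Integration versions; check the names against your SDK, and note the documented ETFR prerequisites (Vulkan, ARM64, Multiview) before suspecting URP itself:

```csharp
using UnityEngine;

// Sketch (assumed OVRManager property names, recent Oculus Integration
// versions): enable ETFR only when the runtime reports support, and fall
// back to fixed foveation otherwise.
public class FoveationSetup : MonoBehaviour
{
    void Start()
    {
        if (OVRManager.eyeTrackedFoveatedRenderingSupported)
        {
            OVRManager.eyeTrackedFoveatedRenderingEnabled = true;
            Debug.Log($"ETFR enabled: {OVRManager.eyeTrackedFoveatedRenderingEnabled}");
        }
        else
        {
            Debug.LogWarning("ETFR not supported in this configuration; " +
                             "falling back to fixed foveation.");
        }

        // Fixed foveation still applies as the fallback path.
        OVRManager.foveatedRenderingLevel = OVRManager.FoveatedRenderingLevel.High;
    }
}
```

If the log shows ETFR as supported and enabled yet the screen is still black only under URP, that would isolate the problem to the render pipeline rather than the foveation setup.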