Quest Pro EyeTracker sampling at a higher rate
Hi, is there a way to use the eye tracker at its maximum sample rate? Our application runs at around 30 fps, but we want to use the maximum eye-tracker frequency for our research analytics. Is there a way to do that? Currently we use OVRPlugin.GetEyeGazesState(), but that only gives us the last value for the frame. We would like to get the list of values sampled since the last frame, or to sample on a separate thread at a desired frequency. Any help is appreciated.

Launching Eye-tracking Calibration remotely
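Following up on the sampling-rate question above: a common workaround is to poll from a dedicated thread into a timestamped buffer that the frame loop drains once per frame. OVRPlugin.GetEyeGazesState() is a Unity/C# call, so the sketch below is only a language-agnostic illustration of the pattern in C++, with a hypothetical pollGaze function standing in for the real plugin query. Note that polling faster than the tracker's native update rate just yields duplicate samples, so you would still deduplicate by timestamp afterwards.

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>
#include <vector>

// One gaze reading; in practice this would carry whatever fields the
// plugin exposes (confidence, per-eye poses, ...).
struct GazeSample {
    double timestampSec;
    float dirX, dirY, dirZ;
};

class GazeSampler {
public:
    // poll: stand-in for the engine's eye-gaze query (hypothetical).
    // hz:   desired polling frequency, independent of the render loop.
    GazeSampler(GazeSample (*poll)(), double hz)
        : poll_(poll), periodUs_(static_cast<long>(1e6 / hz)) {}

    ~GazeSampler() { stop(); }

    void start() {
        running_ = true;
        worker_ = std::thread([this] {
            while (running_) {
                GazeSample s = poll_();
                {
                    std::lock_guard<std::mutex> lock(mu_);
                    buffer_.push_back(s);
                }
                std::this_thread::sleep_for(
                    std::chrono::microseconds(periodUs_));
            }
        });
    }

    void stop() {
        running_ = false;
        if (worker_.joinable()) worker_.join();
    }

    // Called once per frame: returns and clears everything sampled
    // since the previous call.
    std::vector<GazeSample> drain() {
        std::lock_guard<std::mutex> lock(mu_);
        std::vector<GazeSample> out;
        out.swap(buffer_);
        return out;
    }

private:
    GazeSample (*poll_)();
    long periodUs_;
    std::atomic<bool> running_{false};
    std::thread worker_;
    std::mutex mu_;
    std::vector<GazeSample> buffer_;
};
```

In Unity the same idea maps to a background System.Threading thread (or a high-rate coroutine) filling a queue that Update() empties.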
Hi! I work for a research institute where we're planning to use Meta Quest Pro headsets to run a study involving eye tracking. To get reliable eye-tracking data, we need to calibrate the device for each participant individually. Is there a way to do so seamlessly, without needing to take the headset off and put it back on (to launch the calibration/study app)? Specifically, is there a way to do either of these: a) calibrate the eye tracking within a (Unity) application itself, or b) launch the calibration app remotely (e.g. through an ADB command) and then launch the study app, while someone else (the participant) is wearing the headset? Thanks!

Quest Pro eye tracking vergence
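On the ADB route asked about in the calibration thread above: `adb shell am start -n <package>/<activity>` is the standard way to start an Android activity from the host machine without touching the headset. I don't know the actual package or activity name of the calibration app, so the identifiers below are placeholders only; you would need to discover the real ones on-device (e.g. via `adb shell pm list packages`). A minimal sketch of assembling such a command from C++:

```cpp
#include <string>

// Build an adb command that starts an Android activity on the headset.
// `am start -n <package>/<activity>` is standard adb syntax; the names
// passed in are expected to be discovered on-device -- any identifiers
// shown in examples here are placeholders, not real app identifiers.
std::string buildAdbLaunchCommand(const std::string& package,
                                  const std::string& activity) {
    return "adb shell am start -n " + package + "/" + activity;
}
```

The resulting string could then be run with std::system() (or a process API) first for the calibration app and then for the study app, all while the participant keeps the headset on.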
Hi all. I finally got eye tracking to work in my Unity project (after updating my developer account in the phone app). Now I'm rendering the gaze rays from my eyes and testing eye independence and vergence. In all of my tests the vergence is nearly constant, regardless of whether I'm focusing on an object one inch away or many feet away. However, the gaze rays should be nearly parallel when focusing on very distant objects, and almost 90 degrees apart when focusing on very close ones. That doesn't happen at all. The two rays are correct in that they both point in the direction my eyes were looking, but it seems Meta is not using each eye's data independently to determine each eye's rotation, even though that's what the gaze script implies it is doing. My hypothesis is that the algorithm uses something like a cyclopean-eye gaze calculation and then points both eyes toward a pre-established focal point on the cyclopean ray. If that's the case, it's a problem, because I'm using eye tracking precisely to discriminate between the user focusing on nearby or faraway objects that may be aligned with the cyclopean reference. I need the vergence angle for that, but it appears to be constant.

fb_foveation extension
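One way to quantify the problem described in the vergence thread above is to compare the measured angle between the two gaze rays against the geometrically expected vergence for a known fixation distance: if the measured angle stays flat while the expected one swings from a fraction of a degree to tens of degrees, the runtime is indeed not reporting independent per-eye data. A minimal sketch, plain geometry only (no Meta API involved):

```cpp
#include <cmath>

// Vergence angle (radians) between the two eyes' gaze directions.
// leftDir / rightDir are unit-length forward vectors of each gaze ray.
double vergenceAngle(const double leftDir[3], const double rightDir[3]) {
    double dot = leftDir[0] * rightDir[0] +
                 leftDir[1] * rightDir[1] +
                 leftDir[2] * rightDir[2];
    // Clamp to guard against floating-point drift outside [-1, 1].
    if (dot > 1.0) dot = 1.0;
    if (dot < -1.0) dot = -1.0;
    return std::acos(dot);
}

// Expected vergence for a fixation point straight ahead at distM metres,
// given an inter-pupillary distance ipdM: each eye rotates inward by
// atan((ipd/2) / dist), so the total angle is twice that.
double expectedVergence(double ipdM, double distM) {
    return 2.0 * std::atan2(ipdM / 2.0, distM);
}
```

Logging both values per frame while fixating targets at known depths makes the constant-vergence behaviour (or its absence) immediately visible.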
May I ask if anyone has been able to make fb_foveation work? We are implementing OpenXR and want to use the eye-gaze features, which seem to require fb_foveation. How should I access this, or otherwise get the eye-gaze location, on the Quest Pro?

Getting Foveation center from OpenXR
May I ask if anyone has tried to get the foveated center from a Meta Quest Pro? I am working on this and have tried XR_FB_eye_tracking_social, but that seems to return only the pose. We would like to try XR_META_eye_tracked, which seems to provide the processed data. We want to implement it in the ALVR application; however, it seems to return the default value (0, 0). Is there any way to get the correct value, such as enabling a developer option? Thank you.

OpenXR + Oculus SDK (Windows) ?
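Regarding the foveation-center question above: if the runtime only hands back a gaze pose (as XR_FB_eye_tracking_social does), one fallback is to derive a normalized foveation center yourself by projecting the gaze direction into the eye's view frustum. This is a sketch under assumed conventions (symmetric FOV, -Z-forward view space), not the output of any Meta extension:

```cpp
#include <cmath>

// Map a view-space gaze direction (looking down -Z) to a normalized
// foveation center in [-1, 1] x [-1, 1], given the symmetric half-angles
// of the eye's field of view. Returns false if the direction points
// behind (or parallel to) the image plane.
bool gazeToFoveationCenter(double dx, double dy, double dz,
                           double halfFovXRad, double halfFovYRad,
                           double* outX, double* outY) {
    if (dz >= 0.0) return false;
    double tanX = -dx / dz;                 // projection onto the image plane
    double tanY = -dy / dz;
    *outX = tanX / std::tan(halfFovXRad);   // normalize by the FOV extent
    *outY = tanY / std::tan(halfFovYRad);
    return true;
}
```

With an asymmetric per-eye frustum (the usual case on headsets) you would normalize against the left/right and up/down tangents separately, but the idea is the same.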
Hi, I have already developed a C++ app using the Oculus SDK and OpenGL, and I would like to add eye tracking and hand tracking to it (for the Meta Quest Pro over Quest Link/Air Link). How should I proceed?
1) Can I add the eye-tracking and hand-tracking extensions through OpenXR without doing any rendering in it (keeping rendering in the Oculus SDK)?
2) Or must I convert the rendering process to go through OpenXR?
I hope you can help me.