
Questions regarding the Eye tracking data measured in Quest Pro

Honored Guest

Just started trying out the eye tracking in Quest Pro. As I'm quite new to it, I'm hoping to clear up some doubts about how eye tracking works in this headset.

1. Setting the reference frame in OVREyeGaze. From the manual, it says, "In order to drive the eye GameObject properly, this component requires a reference frame in world space orientation. Typically, this reference frame should be set in the forward direction of the eye. It is there to calculate the initial offset of the eye GameObject."

What does "set in the forward direction" mean? At the moment I've defined it as the left eye anchor, but I've tried a couple of other reference frames (like the camera rig, and world space), and I can't see much difference in the rotation data.

2. I am interested in using the eye tracking to extract gaze locations. I.e., I have a moving ball that is always 1.5 m away from the camera, and I want to know whether the user is tracking the ball as it moves.

So far what I've done is to first define eyeGaze:
OVREyeGaze eyeGaze; 

Then, in Update(), I use an if-statement to extract the eye rotation while the ball is moving.

  if (eyeTrack) // when the ball is moving
  {
      if (eyeGaze.EyeTrackingEnabled)
      {
          rotX = eyeGaze.transform.rotation.x;
          rotY = eyeGaze.transform.rotation.y;
          rotZ = eyeGaze.transform.rotation.z;
          // [code to write to text file so that I can verify]
      }
  }


My problem is that getting the rotation doesn't tell me the x and y position of the eye gaze. And using eyeGaze.transform.position.x gives a very small value that seems to be intended for making realistic avatar eye movements rather than giving the gaze position.

So I tried transform.rotation.eulerAngles to see if I could use trigonometry to estimate the gaze location, and realized that the rotation values have very different patterns between the quaternion and the Euler angles. The first figure shows the quaternion results from eyeGaze.transform.rotation.x, which look correct (judging from where I presented the ball); the second shows eyeGaze.transform.rotation.eulerAngles.x, which looks incorrect.

[Figures: rot results.jpg, rot results (euler angles).jpg]

Very confused by the quaternion-to-Euler-angle conversion. I tried Quaternion.eulerAngles etc., but the results still remain very different from the original quaternion values.
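(For what it's worth, the two traces are in different units: rotation.x is a raw quaternion component in the range [-1, 1], while eulerAngles.x is in degrees wrapped into [0, 360), so a small downward pitch of, say, -5 degrees plots as 355. Remapping the Euler values into a signed range should make the trace comparable in shape to the quaternion one. A plain C# sketch, with no Unity dependency:)

```csharp
// Remap Unity-style Euler angles (0..360) into a signed -180..180 range
// so gaze traces plot continuously around zero instead of jumping to ~360.
public static class AngleUtil
{
    public static float ToSigned(float degrees)
    {
        degrees %= 360f;                       // fold into (-360, 360)
        if (degrees > 180f)  degrees -= 360f;  // e.g. 355 -> -5
        if (degrees < -180f) degrees += 360f;  // e.g. -190 -> 170
        return degrees;
    }
}
```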

Any ideas? Perhaps all this conversion and trigonometry isn't needed, and the gaze location is already provided somewhere?


Please advise.


I've only recently got my Quest Pro, so I'm still new to this, but from what I can see so far the eye tracking API is very much focused on automating an avatar's eye movements rather than gaze detection. Such a shame, as I was hoping for gaze UI events and object detection (as you are doing) to be available. That's not to say it can't be done.

If you look at Common Issues #4 (in the documentation you're probably looking at), you'll see that you can do a raycast from the eye transform:

"Can I use eye tracking to highlight certain areas of interest in my scene? Yes, you may do so via raycasting driven by the eye transform’s forward direction. Since the eye might change direction rapidly, it might be useful to apply filtering to the forward vector used for the raycast"
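Roughly, that suggests something like the sketch below (assuming the OVREyeGaze component is driving the transform; the class and field names here are illustrative, not from the official samples). It also shows the simpler alternative for your case: since the ball is always 1.5 m away, you can project the gaze forward vector out by that distance to get a gaze point directly, no trigonometry needed:

```csharp
using UnityEngine;

// Sketch: check whether the user's gaze hits a known target, assuming
// `eyeGaze` is an OVREyeGaze component and the ball has a collider.
public class GazeHitCheck : MonoBehaviour
{
    public OVREyeGaze eyeGaze; // assigned in the Inspector
    public Transform ball;     // the moving target

    void Update()
    {
        if (eyeGaze == null || !eyeGaze.EyeTrackingEnabled) return;

        Vector3 origin  = eyeGaze.transform.position;
        Vector3 forward = eyeGaze.transform.forward; // rotation applied to Vector3.forward

        // Option A: gaze point at a known distance (your ball is 1.5 m away).
        Vector3 gazePoint = origin + forward * 1.5f;

        // Option B: raycast along the gaze direction and test what it hits.
        if (Physics.Raycast(origin, forward, out RaycastHit hit, 10f) &&
            hit.transform == ball)
        {
            Debug.Log($"Looking at the ball, gaze point ~ {gazePoint}");
        }
    }
}
```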

The last point is important: depending on how far your ball is from the player, it might be hard to hit it with a single raycast. (From memory, when I last did this on a Quest 2 using just a perpendicular raycast from the HMD, I used a boxcast to make hits more predictable/reliable.)
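A minimal sketch of that filtering idea, assuming the same OVREyeGaze setup as above (the smoothing factor and cast radius are guesses to tune, not recommended values):

```csharp
using UnityEngine;

// Sketch: low-pass filter the gaze direction and use a SphereCast so a
// rapidly moving eye still registers hits on a small target.
public class SmoothedGazeCast : MonoBehaviour
{
    public OVREyeGaze eyeGaze;       // assigned in the Inspector
    public float smoothing = 0.3f;   // 0 = frozen, 1 = no filtering
    public float castRadius = 0.05f; // fattens the ray, like a thin boxcast

    private Vector3 smoothedForward = Vector3.forward;

    void Update()
    {
        if (eyeGaze == null || !eyeGaze.EyeTrackingEnabled) return;

        // Exponential smoothing; Slerp keeps the direction unit-length.
        smoothedForward = Vector3.Slerp(
            smoothedForward, eyeGaze.transform.forward, smoothing);

        if (Physics.SphereCast(eyeGaze.transform.position, castRadius,
                               smoothedForward, out RaycastHit hit, 10f))
        {
            Debug.Log($"Gaze hit: {hit.transform.name}");
        }
    }
}
```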