08-24-2023 02:06 AM
Hi,
I've just started trying out the eye tracking in the Quest Pro. As I'm quite new to it, I'm hoping to clear up some doubts about how eye tracking works in this headset.
1. Setting the reference frame in OVREyeGaze. From the manual, it says, "In order to drive the eye GameObject properly, this component requires a reference frame in world space orientation. Typically, this reference frame should be set in the forward direction of the eye. It is there to calculate the initial offset of the eye GameObject."
What does "set in the forward direction" mean? At the moment I've defined it as the left eye anchor, but I've tried a couple of other reference frames (like the camera rig and world space), and I can't see much difference in the rotation data.
2. I am interested in using eye tracking to extract gaze locations, i.e. I have a moving ball that is always 1.5 m away from the camera and I want to know if the user is tracking the ball as it moves.
So far, what I've done is first define a reference to the OVREyeGaze component:
OVREyeGaze eyeGaze;
Then, under Update(), I use an if-statement to extract the eye rotation while the ball is moving.
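In code it looks roughly like this (a simplified sketch; ballIsMoving stands in for my actual movement check):

void Update()
{
    if (ballIsMoving) // placeholder for my actual check
    {
        // Orientation of the tracked eye this frame
        Quaternion eyeRotation = eyeGaze.transform.rotation;
    }
}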
My problem is that the rotation alone doesn't tell me the x and y position of the eye gaze. And eyeGaze.transform.position.x gives a very small value that seems to be intended for driving realistic avatar eye movements rather than indicating where the gaze lands.
So I tried transform.rotation.eulerAngles to see if I could use trigonometry to estimate the gaze location, and realized that the rotation values have very different patterns as quaternion components versus Euler angles. The first figure shows the quaternion results from eyeGaze.transform.rotation.x, which look correct (judging from where I presented the ball); the second shows eyeGaze.transform.rotation.eulerAngles.x, which looks incorrect.
I'm very confused by the quaternion-to-Euler-angle conversion. I've tried Quaternion.eulerAngles and so on, but the results still look very different from the original quaternion components.
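For reference, the trigonometry I'm attempting looks roughly like this (a sketch; 1.5f is the ball's distance from the camera, and I'm not sure the angle handling is correct):

// Unity returns Euler angles in [0, 360), so convert to signed angles first
Vector3 euler = eyeGaze.transform.rotation.eulerAngles;
float pitch = euler.x > 180f ? euler.x - 360f : euler.x;
float yaw = euler.y > 180f ? euler.y - 360f : euler.y;

// Project the gaze onto a plane 1.5 m in front of the camera
float gazeX = 1.5f * Mathf.Tan(yaw * Mathf.Deg2Rad);
float gazeY = -1.5f * Mathf.Tan(pitch * Mathf.Deg2Rad);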
Any ideas? Perhaps all this conversion and trigonometry is not needed, and gaze location is already provided somewhere?
Please advise.
08-24-2023 02:56 AM
I've only recently got my Quest Pro, so I'm still new to this, but from what I can see so far, the eye tracking API is very much focused on automating an avatar's eye movements rather than on gaze detection. That's a shame, as I was hoping for gaze UI events and object detection (as you are doing) to be available out of the box. That's not to say it can't be done, though.
If you look at Common Issues #4 (in the documentation you're probably looking at), you'll see that you can do a raycast from the eye transform:
"Can I use eye tracking to highlight certain areas of interest in my scene? Yes, you may do so via raycasting driven by the eye transform’s forward direction. Since the eye might change direction rapidly, it might be useful to apply filtering to the forward vector used for the raycast"
The last point is important: depending on how far your ball is from the player, it might be hard to hit it with a single raycast. (From memory, when I last did this on Quest 2 using just a forward raycast from the HMD, I used a boxcast to make hits more predictable/reliable.)
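As a rough sketch (untested; eyeTransform would be the transform driven by OVREyeGaze, and the smoothing factor and 10 m range are arbitrary), the filtered raycast could look like:

Vector3 smoothedForward = Vector3.forward; // low-pass filtered gaze direction (field)

void Update()
{
    // Exponentially smooth the eye's forward vector to tame rapid gaze jumps
    smoothedForward = Vector3.Slerp(smoothedForward, eyeTransform.forward, 10f * Time.deltaTime).normalized;

    if (Physics.Raycast(eyeTransform.position, smoothedForward, out RaycastHit hit, 10f))
    {
        // hit.collider is the object currently being looked at
    }
}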
07-22-2024 03:04 AM
Hello,
I don't know if it still helps after a year, but here is how I get eye tracking information from the Quest Pro.
First, in Start() or somewhere else called at startup, you need to enable eye tracking. I keep the result in a field so Update() can check it (gazeHalfExtents, gazeDistance and gazeLayers used further down are set in the Inspector):
bool m_OculusEyeTrackingStarted;
OVRPlugin.EyeGazesState m_OculusEyeGazesState;
readonly RaycastHit[] _hits = new RaycastHit[16]; // preallocated buffer for BoxCastNonAlloc

void Start()
{
    m_OculusEyeTrackingStarted = OVRPlugin.StartEyeTracking();
}
Then, in Update():
if (!m_OculusEyeTrackingStarted || !OVRPlugin.GetEyeGazesState(OVRPlugin.Step.Render, -1, ref m_OculusEyeGazesState))
{
    // No eye tracking data this frame: reset the outputs and bail out
    leftEyePosition = Vector3.zero;
    leftEyeRotation = Quaternion.identity;
    rightEyePosition = Vector3.zero;
    rightEyeRotation = Quaternion.identity;
    return;
}
// Left eye
var eyeGazeStateLeft = m_OculusEyeGazesState.EyeGazes[0];
// Here you can check confidence if needed:
// if (m_CheckConfidence && eyeGazeStateLeft.Confidence < m_ConfidenceThreshold)
//     return;
var isValidLeft = eyeGazeStateLeft.IsValid; // you may want to bail out when this is false
var poseLeft = eyeGazeStateLeft.Pose.ToOVRPose();

// Right eye
var eyeGazeStateRight = m_OculusEyeGazesState.EyeGazes[1];
var isValidRight = eyeGazeStateRight.IsValid;
var poseRight = eyeGazeStateRight.Pose.ToOVRPose();

leftEyePosition = poseLeft.position;
leftEyeRotation = poseLeft.orientation;
rightEyePosition = poseRight.position;
rightEyeRotation = poseRight.orientation;
// Convert from tracking space to world space via the camera rig's offset transform
var cameraOffsetTransform = headCamera.transform.parent;
var cameraOffsetRotation = cameraOffsetTransform.rotation;
var leftGazeOrigin = cameraOffsetTransform.TransformPoint(leftEyePosition);
var rightGazeOrigin = cameraOffsetTransform.TransformPoint(rightEyePosition);
// These are world-space gaze direction vectors, not rotations
var leftGazeDirection = cameraOffsetRotation * leftEyeRotation * Vector3.forward;
var rightGazeDirection = cameraOffsetRotation * rightEyeRotation * Vector3.forward;
// Average both eyes into a single combined gaze ray
var gazeOrigin = Vector3.Lerp(leftGazeOrigin, rightGazeOrigin, 0.5f);
var gazeDirection = Vector3.Lerp(leftGazeDirection, rightGazeDirection, 0.5f).normalized;
int hitCount = Physics.BoxCastNonAlloc(gazeOrigin, gazeHalfExtents, gazeDirection, _hits, Quaternion.identity, gazeDistance, gazeLayers, QueryTriggerInteraction.Ignore);
In the last line I perform a boxcast (you can use a raycast if you prefer) to get the object that I am looking at.
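For completeness, consuming the hits could look something like this (a sketch; "GazeBall" is a hypothetical tag you would put on the moving ball):

for (int i = 0; i < hitCount; i++)
{
    if (_hits[i].collider.CompareTag("GazeBall")) // hypothetical tag on the ball
    {
        // The user's gaze is on the ball this frame
        Debug.Log($"Looking at {_hits[i].collider.name} at {_hits[i].point}");
    }
}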
Cheers