02-27-2025 12:09 AM
Hey all!
Problem:
I am inexperienced in Unity and XR development, but I have been tasked with creating a study that involves eye and head tracking. I want to build a very simple circular UI with 4-8 buttons. Using the Meta Quest Pro, I want to move a cursor towards these buttons either by looking at them (gaze) or by tilting the head in that direction.
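My rough plan for the cursor, assuming I can get a world-space forward vector from either the head or the eyes, is to raycast along it against colliders on the buttons and park a cursor object at the hit point. A sketch of what I mean (plain Unity APIs, all field names are placeholders):

```csharp
using UnityEngine;

// Sketch: cast a ray along a tracked forward vector (head or gaze) and
// move a cursor object to wherever it hits the UI. Assumes each button
// carries a collider; pointerSource and cursor are assigned in the Inspector.
public class RayCursor : MonoBehaviour
{
    public Transform pointerSource;   // e.g. the head anchor or a gaze-driven object
    public Transform cursor;          // small sphere/quad rendered as the cursor
    public float maxDistance = 10f;

    void Update()
    {
        var ray = new Ray(pointerSource.position, pointerSource.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            cursor.position = hit.point;
            // hit.collider.gameObject is the button currently hovered;
            // notify its selection logic (e.g. a dwell timer) here.
        }
    }
}
```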
Once the cursor hovers over a button, I want to trigger selection with a variety of methods (dwell, nodding, blinking, etc.) and measure the time between the start and end of the interaction. I consider the task fairly simple once the basics are sorted out (mainly, how to get the gaze and head coordinates into Unity).
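For dwell at least, the timing part seems manageable with plain Unity APIs; a sketch of what I have in mind (names purely illustrative):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of dwell selection: when the cursor starts hovering this button,
// start a timer; if it stays for dwellSeconds, fire the selection and log
// the elapsed interaction time for the study.
public class DwellSelector : MonoBehaviour
{
    public float dwellSeconds = 1.0f;
    public UnityEvent onSelected;

    private float hoverStart = -1f;

    // Called by the hover-detection code (e.g. the cursor raycast above).
    public void OnHoverEnter() { hoverStart = Time.time; }
    public void OnHoverExit()  { hoverStart = -1f; }

    void Update()
    {
        if (hoverStart < 0f) return;
        float elapsed = Time.time - hoverStart;
        if (elapsed >= dwellSeconds)
        {
            Debug.Log($"Dwell select on {name} after {elapsed:F2}s");
            onSelected.Invoke();
            hoverStart = -1f;
        }
    }
}
```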
Approach / Current Status:
I am aware that there are packages that assist in fetching gaze and head-rotation data from the headset, namely the Meta Core SDK and the Meta Interaction SDK. However, I am very confused by these building blocks and would very much appreciate someone with more experience pointing me in the right direction. Is the OVRCameraRig the source from which to extract all the necessary information? I kept getting stuck on a variety of errors, not even sure I was going the right way, so I cannot specify my problems/errors any further; I was just searching in the dark.
At this point I am just trying to get head and gaze position into the debug log before going any deeper.
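From what I pieced together from the samples, something like the following is roughly what I am after (untested sketch; I am assuming the OVRCameraRig's centerEyeAnchor tracks the head pose and that a GameObject with an OVREyeGaze component gets its rotation driven by the eye tracker, so the property names may be off):

```csharp
using UnityEngine;

// Sketch: log head pose and eye-gaze direction once per second.
// Assumes an OVRCameraRig in the scene and a GameObject with an OVREyeGaze
// component (Meta XR Core SDK); both references are wired up in the Inspector.
public class GazeHeadLogger : MonoBehaviour
{
    public OVRCameraRig cameraRig;
    public OVREyeGaze eyeGaze;

    private float nextLogTime;

    void Update()
    {
        if (Time.time < nextLogTime) return;
        nextLogTime = Time.time + 1f;

        // Head pose: the center eye anchor follows the HMD.
        Transform head = cameraRig.centerEyeAnchor;
        Debug.Log($"Head pos {head.position}, head fwd {head.forward}");

        // Eye gaze: OVREyeGaze rotates its own transform to match the eye,
        // so its forward vector should be usable as the gaze ray.
        if (eyeGaze != null && eyeGaze.EyeTrackingEnabled)
        {
            Debug.Log($"Gaze dir {eyeGaze.transform.forward}, confidence {eyeGaze.Confidence}");
        }
    }
}
```

If I understand the docs correctly, eye tracking also has to be enabled under the OVRManager's Quest features and the user has to grant the eye-tracking permission at runtime, otherwise OVREyeGaze stays disabled; happy to be corrected on any of this.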
Thanks
Jonald