Meta Quest Pro Eye Tracking Enabling Issue
I am working on an application that retrieves a user's eye position by casting the OVRGazeRay against a surface and reading the collision point. This worked, and I could retrieve the data points over Meta Link on my computer. At some point I updated my Quest Pro, and from then on it stopped working. The Movement SDK tells me that eye tracking is enabled and working, and the Meta XR project setup tool also reports no issues in my project. I have enabled eye tracking everywhere it needs to be. I first sent a build of the project to my Quest so I could enable eye tracking for the application on the Quest Pro, and I have verified that it has remained enabled in the settings when using Meta Link.

Besides the eye tracking logic in FixedUpdate (I doubled the update frequency), I also have a separate C# script that takes input from the controllers and interacts with a TextMeshProUGUI; it is not used for collision detection, so the two are independent. Everything seems to work just fine, but since the update I cannot retrieve the data anymore. So I asked the API directly whether eye tracking is enabled, using OVRPlugin.eyeTrackingEnabled, and it tells me that it is not enabled.
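One thing that might be worth ruling out (my own suggestion, not something from the post above): on Quest Pro, eye tracking is gated behind the Android runtime permission `com.oculus.permission.EYE_TRACKING`, and an OS update or factory reset can leave that permission revoked even while the Settings toggle looks enabled. A minimal sketch for checking and re-requesting it from a Unity script, assuming an Android (headset) build:

```csharp
using UnityEngine;
using UnityEngine.Android;

public class EyeTrackingPermissionCheck : MonoBehaviour
{
    // Runtime permission string used by the Meta Quest system for eye tracking.
    const string EyeTrackingPermission = "com.oculus.permission.EYE_TRACKING";

    void Start()
    {
        // On-device Android check; in the editor this always reports granted.
        if (!Permission.HasUserAuthorizedPermission(EyeTrackingPermission))
        {
            Debug.LogWarning("Eye tracking permission not granted; requesting it.");
            // Pops the system permission dialog on the headset.
            Permission.RequestUserPermission(EyeTrackingPermission);
        }
        else
        {
            Debug.Log("Eye tracking permission already granted.");
        }
    }
}
```

If the Meta SDK's own permission requester is configured in the project this may be redundant, but logging the result of `HasUserAuthorizedPermission` would at least confirm whether the update silently revoked the grant.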
That result comes from the following code:

```csharp
if (OVRManager.isHmdPresent)
{
    OVRPlugin.SystemHeadset headset = OVRPlugin.GetSystemHeadsetType();
    UnityEngine.Debug.Log("Headset type: " + headset);

    bool eyeTrackingSupported = OVRPlugin.eyeTrackingSupported;
    bool eyeTrackingEnabled = OVRPlugin.eyeTrackingEnabled;
    UnityEngine.Debug.Log("Eye Tracking Supported: " + eyeTrackingSupported);
    UnityEngine.Debug.Log("Eye Tracking Enabled: " + eyeTrackingEnabled);

    if (eyeTrackingSupported && eyeTrackingEnabled)
    {
        UnityEngine.Debug.Log("Eye tracking is supported and enabled.");
    }
    else if (eyeTrackingSupported)
    {
        UnityEngine.Debug.LogWarning("Eye tracking is not enabled.");
    }
    else if (eyeTrackingEnabled)
    {
        UnityEngine.Debug.LogWarning("Eye tracking is not supported.");
    }
    else
    {
        UnityEngine.Debug.LogWarning("Eye tracking is not supported and not enabled.");
    }
}
```

Things I have already checked multiple times:
- Developer Mode is enabled in the Meta app on my phone for the Meta Quest Pro.
- I have factory reset the Meta Quest Pro multiple times.
- I am using Unity 2022.3.35f1 LTS, which, last time I checked, is compatible with the Movement SDK and the Meta SDK.
- All my packages are up to date.
- Before the update I was able to retrieve the eye tracking data, and I verified that it matched the actual eye movement pattern. I also know that no collision is occurring, as the eye gaze ray would change from white to red if it collided with something.

I have been trying to figure this out for weeks now and I really do not know what is causing the problem. I hope someone can help me. Thanks in advance.

Passthrough camera data is not available on Quest 3 developing with Unity
Basically, it is not possible to access the real-time passthrough video from the Quest 3 in Unity. Many developers across the internet consider this a crucial feature: with access to the feed, they could build incredible mixed reality apps with object recognition using machine learning. Nevertheless, the real-time video captured by the Quest 3 cannot be accessed in Unity. I made a Reddit post where people are requesting this feature. Please, we need access to the real-time passthrough video in Unity or other engines.

Quest Pro Durability
Next time a top-tier headset is made, I don't want to regret not having swapped out my current one for another headset a little over a year and three quarters later. The side USB-C port is sunken in because there was nothing bracing it to the housing; no other device I've used since 2020 has had that issue. There were no spills on the headset, just gentle plugging and unplugging, and accidentally falling asleep with it on here and there. I've used my headset almost every day since I bought it in 2023, and I'm sad that my most recent purchase of an everyday-use item is the first thing to fail significantly.

I'm not sure what to suggest about the controllers, other than: please make it harder for things like dead skin to get inside them, sell replacement magnets for the sticks on the site, and make the controllers modular. Please account for this in the next installment of a top-tier headset package.

API call to delete room set-up
We have had a lot of feedback from users who do not enjoy how passthrough room set-up impacts their other app experiences. Is it possible to give users, inside our app, the ability to delete their room set-up without needing to go into the Oculus settings?

Developer Mode Quest Pro
Hey 🙂 I bought a Meta Quest Pro because I would like to develop some small projects for the headset. I'm very excited about the device, but I've quickly run into a more or (I hope) less big problem and hope you can help me here.

The issue is that I can't get the headset properly into Developer Mode and thus connected to my PC or the app. I have tried many different approaches: downloaded software updates, used different cables, worked with the app and the Developer Hub, run power cycles, and rebooted all devices, but I have not been able to solve the problem. I always get to the step where I have to allow USB debugging on the headset. I remember this from the Quest, where it worked fine, but on the Pro this prompt simply never appears, so I cannot give my permission.

After some searching, I came across a thread here that describes exactly this problem, and further down it says, quite recently, that there is a solution for it. Can you help me find that solution? I have already looked around the forums, as I said, and ended up here: https://communityforums.atmeta.com/t5/Get-Help/Quest-Pro-Developer-Mode/m-p/994506

I then wrote to the support chat, as well as to the mod from that forum thread. Neither could help me directly, but they forwarded me to the Meta support PM, who forwarded me to consumer support and this page. I would really appreciate it if someone could help me. Everything else really does work great, except the last step, "allow access": when I connect the headset to the PC via cable, there is a nice sound, but no confirmation window appears.

Thank you very much and kind regards, Karo

Camera motion in passthrough
Hi! I am working on developing an app that rotates or translates the field of view during passthrough at different rates. Is there a way to have the cameras on the Meta Quest 2, 3, or Pro track a virtual object that moves back and forth? I want the cameras to move back and forth independently of the user's head motion. I can clip some peripheral vision to make this possible as well. Does anybody have a recommendation on how to do this, or think it is even possible with the current plugins? Thanks!

Hand grab interaction fails if OVRCameraRig is child of high speed moving object
Hi all, I'm facing an unexpected scenario while using the grab feature provided by the Meta Interaction SDK. If the OVRCameraRig object (with OVRHands and/or OVRControllerHands properly configured with HandGrabInteractors) is set as a child of a moving object, together with the objects designed to be grabbed, then depending on the speed of that object the grab actions fail.

The scenario I'm working on is a train cabin moving along rails defined by a spline path, with the VR user/player (represented by the OVRCameraRig object) parented to the cabin, since the player must be inside the moving train. Inside the cabin there is a lever that must be grabbed to change the train's speed. At slow speeds the lever can be grabbed without problems, but after increasing the speed a little, the lever cannot be grabbed anymore.

I have tested this multiple times, trying to understand the root cause of the failure. My guess is that it is related to the frequency at which the Meta scripts check for intersections/collisions between the hands and the lever, but I couldn't find a solution. Is there something I'm missing in the documentation? Maybe a script property to set or fine-tune?

NOTE: I don't think it would be acceptable, in a 3D application, to invert the scenario so that the train is still and the whole environment moves towards it. In that case both the objects to grab and the player would be stationary and grabbing would work like a charm, but I still consider that an ugly workaround.

Does anyone have a clue how to fix this misbehaviour? Thanks in advance!

How to start Eye Tracking Calibration from an Application or ADB
Is it possible to start the Quest Pro eye tracking calibration flow (the one in Settings -> Movement -> Eye Tracking) in either of these ways?

- From within an application (whether via an OpenXR call, an Android intent, or a Unity plugin)
- Through ADB (intent or executable; root permissions can be used)

If this is not possible, then I'd like to make this a feature request, as it would greatly improve our user experience 🙂
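For the ADB route, I don't know of a documented component name for the calibration screen, but the generic pattern for launching an Android activity from adb looks like the sketch below. The package/activity name is purely hypothetical, for illustration only; the second command shows one way to discover the real component on the device:

```shell
# HYPOTHETICAL component name -- the real activity behind the eye tracking
# calibration screen is not documented; this only shows the command shape.
adb shell am start -n com.example.vrsettings/.EyeTrackingCalibrationActivity

# To discover the real component: open the calibration screen on the headset
# manually, then dump the currently resumed activity over adb.
adb shell dumpsys activity activities | grep -i mResumedActivity
```

Note that system settings activities are often not exported, in which case launching them via an explicit intent fails without elevated permissions; root (mentioned above as available) may be required.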