Passthrough camera data is not available on Quest 3 when developing with Unity
It is currently not possible to access the real-time passthrough video feed from the Quest 3 in Unity. Many developers consider this a crucial feature: with access to the camera frames, mixed reality apps could run object recognition and other machine-learning workloads against the real environment. Nevertheless, the live video captured by the Quest 3 cannot be accessed in Unity. Here is a Reddit post I made where people are requesting this feature. Please, we need access to the real-time passthrough video in Unity and other engines.

With Quest Pro Dropping in Price, Can Meta Prioritize Supporting the D-Link Air Bridge?
Meta promotes the D-Link Air Bridge as a perfect PC-to-Quest solution but fails to support the newer Quest Pro headset. The community would appreciate some guidance on when Meta anticipates adding support, as neither D-Link nor Meta Customer Support provides any transparency.

[solved] AR Passthrough - Bouncing Ball Template - Dropped Balls Not Reacting to my Scene Setup
Expected: squeeze the right-hand trigger to create and release a ball, and the ball bounces against your defined surfaces. Problem: I've completed my room setup on a Quest Pro and followed the instructions for the Bouncing Ball scene sample (linked below). Everything looks visually correct, but the balls fall through my floor and the scene-recognized objects; I expected the 2D planes and 3D volumes created during room setup to stop them. Can anyone help me understand where I went wrong? Thank you!

Solution: when using Scene in your project, you must enable its capability in the project config. Ref: https://developer.oculus.com/documentation/unity/unity-scene-bouncing-ball-sample/

Huge issue MQDH no longer casting with Quest Pro!
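A note on the Scene-capability fix from the bouncing-ball thread above: in the Meta XR SDK this is typically done in the Unity Inspector by setting Scene Support (under OVRManager > Quest Features) to Required, which makes the SDK inject the matching entries into the Android manifest. As a hedged sketch only, the generated entries may look like the following; the exact permission names vary by SDK version (`USE_ANCHOR_API` is the older anchor/scene capability, `USE_SCENE` the newer runtime permission), so verify against the manifest your SDK actually produces:

```xml
<!-- Sketch only: entries normally generated by the Meta XR SDK when
     Scene Support is enabled; names differ between SDK versions. -->
<uses-permission android:name="com.oculus.permission.USE_ANCHOR_API" />
<uses-permission android:name="com.oculus.permission.USE_SCENE" />
```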
There is a huge issue going on with MQDH (Meta Quest Developer Hub) and I can't seem to find a solution beyond a couple of other unsolved threads. I need the casting function of MQDH for mixed reality capture, and it is no longer working. It used to work fine on both my laptop and my PC; I did a whole review with it. I didn't touch the headset for a couple of months and now it does not work on either machine. I've tried updating everything, rolling back MQDH, uninstalling and reinstalling, re-adding my device, and factory resetting the headset, and still nothing has resolved the issue. I'm stressed because I'm on a deadline and this is stopping all progress. Note that casting works fine through the Oculus website, but you can't do passthrough recordings there. It also casts without problems to SideQuest; it is literally just MQDH. I would use the other options if their quality were not so poor. I got the recording function to work, but only with the "Both Eyes" recording option selected; otherwise it gives me the error "device display is off", similar to what I've read in other unsolved threads. The full casting error reads:

"An Error Occurred... Check to see if your headset and computer are using the same network. VPNs have been known to cause casting errors. Try switching to the same network to allow casting. Firmware build ID / build flavor in the headset can cause instability in casting. This information can be found in the tooltip if you hover over the device image on the device page. Check to see if the account on the headset is the same as the logged-in account in MQDH. There is a known issue with secondary accounts logged into the headset conflicting with the account that is logged into MQDH. This is a rare occurrence. Please try logging out and back in from the Settings Page to refresh your session."

When I try to record I get: "error: device display is off". I have changed nothing since I used it last.
Everything appears to be connected fine; the casting simply does not work anymore, and I'm out of ideas, so any suggestions would be appreciated.

Meta Quest Pro Eye Tracking Enabling Issue
I am working on an application that retrieves a user's eye position by colliding the OVRGazeRay with a surface. I was able to make it work and could retrieve the data points over Meta Link on my computer. At some point I updated my Quest Pro, and from then on it stopped working. The Movement SDK tells me that eye tracking is enabled and working, and the Meta XR project setup tool also reports no issues in my project. I have enabled eye tracking everywhere it needs to be enabled: I first sent a build of the project to my Quest so I could enable eye tracking for the application on the headset, and I verified that it remains enabled in the settings when using Meta Link. Besides the eye tracking in FixedUpdate (I doubled the update frequency), I also have an additional C# script that takes input from the controllers and interacts with a TextMeshProUGUI; it is not used for collision detection, so the two are separate. Everything seems to work just fine, but since the update I cannot retrieve the data anymore. So I asked the API directly whether eye tracking is enabled (OVRPlugin.eyeTrackingEnabled), and it tells me that it is not.
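One thing that may be worth checking in this situation (a hedged suggestion, not a confirmed fix): on recent headset OS versions, eye tracking is gated behind an Android runtime permission, so after an OS update the app may need to request it explicitly before OVRPlugin.eyeTrackingEnabled reports true. A minimal sketch, assuming the `com.oculus.permission.EYE_TRACKING` permission string from Meta's documentation and Unity's standard `UnityEngine.Android.Permission` API:

```csharp
using UnityEngine;
using UnityEngine.Android;

public class EyeTrackingPermissionRequester : MonoBehaviour
{
    // Permission string assumed from Meta's docs; verify it matches
    // the entry in your project's generated AndroidManifest.xml.
    private const string EyeTrackingPermissionName = "com.oculus.permission.EYE_TRACKING";

    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(EyeTrackingPermissionName))
        {
            // Shows the system permission dialog on-device; once granted,
            // the eye tracking service may become available to the app.
            Permission.RequestUserPermission(EyeTrackingPermissionName);
        }
    }
}
```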
This results from the following code:

```csharp
if (OVRManager.isHmdPresent)
{
    OVRPlugin.SystemHeadset headset = OVRPlugin.GetSystemHeadsetType();
    UnityEngine.Debug.Log("Headset type: " + headset);

    bool eyeTrackingSupported = OVRPlugin.eyeTrackingSupported;
    bool eyeTrackingEnabled = OVRPlugin.eyeTrackingEnabled;
    UnityEngine.Debug.Log("Eye Tracking Supported: " + eyeTrackingSupported);
    UnityEngine.Debug.Log("Eye Tracking Enabled: " + eyeTrackingEnabled);

    if (eyeTrackingSupported && eyeTrackingEnabled)
    {
        UnityEngine.Debug.Log("Eye tracking is supported and enabled.");
    }
    else if (eyeTrackingSupported)
    {
        UnityEngine.Debug.LogWarning("Eye tracking is supported but not enabled.");
    }
    else if (eyeTrackingEnabled)
    {
        UnityEngine.Debug.LogWarning("Eye tracking is enabled but not supported.");
    }
    else
    {
        UnityEngine.Debug.LogWarning("Eye tracking is not supported and not enabled.");
    }
}
```

Things I have already checked multiple times:
- Developer Mode is enabled for the Quest Pro in the Meta app on my phone.
- I have factory reset the Quest Pro multiple times.
- I am using Unity 2022.3.35f1 LTS, which, last time I checked, is compatible with the Movement SDK and the Meta SDK.
- All my packages are up to date.
- Before the update I was able to retrieve the eye tracking data, and I verified that it matched the exact eye pattern. I also know that no collision is occurring, as the eye gaze ray would change from white to red if it collided with something.

I have been trying to figure this out for weeks now and I really do not know what is causing the problem. I hope someone can help me. Thanks in advance.

How to start Eye Tracking Calibration from an Application or ADB
Is it possible to start the Quest Pro eye tracking calibration flow (the one in Settings -> Movement -> Eye Tracking) in either of these ways?

- Within an application (whether via an OpenXR call, an Android intent, or a Unity plugin)
- Through ADB (intent or executable; root permissions can be used)

If this is not possible, then I'd like to make this a feature request, as it would greatly improve our user experience 🙂

Hand grab interaction fails if OVRCameraRig is child of high speed moving object
Hi all, I'm facing an unexpected scenario while using the grab feature provided by the Meta Interaction SDK. It appears that if the OVRCameraRig object (with OVRHands and/or OVRControllerHands properly configured with HandGrabInteractors) is set as a child of a moving object (together with the objects designed to be grabbed), then, depending on the speed of that object, the grab actions fail. The scenario I'm working on is a train cabin moving along rails defined as a spline path, with the VR user/player (represented by the OVRCameraRig object) parented to the cabin, since the player must be inside the moving train. Inside the cabin there is a lever that must be grabbed to change the train's speed. At slow speeds the lever can be grabbed without problems, but after increasing the speed a little, the lever can't be grabbed anymore! I tested it multiple times, trying to understand the root cause of the failure. I suspect it's related to the frequency of the hand-versus-lever intersection/collision checks made by the Meta scripts, but I couldn't find any solution. Is there something I'm missing in the documentation? Maybe some script property to be set or fine-tuned? NOTE: I don't think it would be acceptable, in a 3D application, to invert the scenario so that the train stands still and the whole environment moves towards it. In that case both the objects to grab and the player would be stationary and grabbing would work like a charm, but I still consider it an ugly workaround. Does anyone have a clue on how to fix this misbehaviour? Thanks in advance!

Developer Mode Quest Pro
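Regarding the moving-cabin grab thread above, one generic Unity technique that sometimes helps when colliders ride a fast-moving parent (a hedged, untested sketch, not a confirmed Interaction SDK fix): drive the parent with a kinematic Rigidbody via `MovePosition` in `FixedUpdate`, rather than writing `transform.position` in `Update`, so that the physics scene the SDK's hand/lever overlap checks run against stays in step with the cabin's motion. The straight-line motion below is a stand-in for the project's spline evaluation:

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class CabinMover : MonoBehaviour
{
    public float speed = 5f;      // metres per second along the path
    private Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        body.isKinematic = true;  // moved by script, not by forces
        body.interpolation = RigidbodyInterpolation.Interpolate;
    }

    void FixedUpdate()
    {
        // Straight-line stand-in for the real spline evaluation; the point
        // is that the move happens in FixedUpdate through the Rigidbody.
        Vector3 next = body.position + transform.forward * (speed * Time.fixedDeltaTime);
        body.MovePosition(next);
    }
}
```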
Hey 🙂 I bought a Meta Quest Pro because I would like to develop some small projects for the headset. I'm very excited about the device, but I've quickly run into a more or (I hope) less big problem and hope you can help me. The issue is that I can't get the headset properly into Developer Mode and thus connected to my PC or the app. I have tried many different approaches: downloaded software updates, used different cables, worked with the app and the Developer Hub, run power cycles, and rebooted all devices, but I have not been able to solve the problem. I always get to the step where I have to allow USB debugging on the headset. I remember this from the Quest, where it worked fine, but on the Pro the prompt simply never appears, so I cannot give my permission. After some searching I came across a thread here that describes exactly this problem, and later in that thread it is stated, fairly recently, that there is a solution; can you help me find it? I have already looked around the forums and ended up here: https://communityforums.atmeta.com/t5/Get-Help/Quest-Pro-Developer-Mode/m-p/994506 I then wrote to the support chat as well as the mod from that thread; unfortunately neither could help me directly, but they forwarded me to the PM of Meta support, who forwarded me to consumer support and this page... I would really appreciate it if someone could help me. Everything else really works great, except the last step, "allow access". When I connect the headset to the PC via cable there is a nice sound, but no confirmation window appears. Thank you very much and kind regards, Karo
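For the missing "Allow USB debugging" prompt described above, one generic Android-side check may help narrow things down (a hedged suggestion; this is standard adb behaviour, and Quest headsets speak plain adb). `adb devices` reports a device as `unauthorized` while this computer's debugging key has not been accepted, and restarting the adb server sometimes re-triggers the on-headset prompt:

```shell
# List attached devices; an entry marked "unauthorized" means the headset
# has not (yet) accepted this computer's debugging key.
adb devices

# Restart the adb server; on reconnect the headset may re-show the
# "Allow USB debugging" dialog. Keep the headset awake and worn while
# reconnecting, since the prompt only renders while the display is on.
adb kill-server
adb start-server
adb devices
```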