How to disable controller's auto-sleep?
Hello, I'm working on a PCVR project that continually reads coordinates from the Quest Pro controllers (which have integrated cameras), and everything works fine on my side. My issue is that a controller automatically turns off (auto-sleep) after a few minutes if no movement is detected, which breaks the coordinate readout. How can I disable the controllers' auto-sleep? Thank you.
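I am not aware of a public setting that disables controller auto-sleep. One hypothetical workaround (an untested assumption, not a confirmed behavior) is to fire a periodic, barely perceptible haptic pulse, on the theory that haptic activity resets the controller's inactivity timer. A minimal Unity sketch using the Meta Core SDK's `OVRInput.SetControllerVibration`:

```csharp
using UnityEngine;

// Hypothetical workaround: fire a short, low-amplitude haptic pulse every
// minute. Whether this actually resets the controller's inactivity timer
// is an assumption -- verify on your firmware before relying on it.
public class ControllerKeepAlive : MonoBehaviour
{
    [SerializeField] private float intervalSeconds = 60f;
    private float _timer;

    private void Update()
    {
        _timer += Time.deltaTime;
        if (_timer < intervalSeconds) return;
        _timer = 0f;

        // Low frequency/amplitude so users barely feel it.
        OVRInput.SetControllerVibration(0.1f, 0.05f, OVRInput.Controller.LTouch);
        OVRInput.SetControllerVibration(0.1f, 0.05f, OVRInput.Controller.RTouch);
        Invoke(nameof(StopVibration), 0.05f);
    }

    private void StopVibration()
    {
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.LTouch);
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

Attach this to any always-active GameObject; if the pulse does not keep the controller awake on your firmware, the sleep behavior is likely enforced by the controller itself and not controllable from the app.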
DistanceGrabUseInteractable?

Hi, this is a pretty basic question about the Meta Interaction SDK for Unity. I've managed to link a HandGrabUseInteractable to a HandGrabInteractable via a SecondaryInteractionFilter. However, I also have a DistanceHandGrabInteractable on that object, linked to its own HandGrabUseInteractable, which points at the same delegate as the first.

When I grab the object without distance grab, my script's BeginUse, EndUse, and ComputeUseStrength are called properly. When I grab at a distance, they are not, as far as I can tell. I am working on a Mac and the simulator does not work with this scenario at all, so I have to deploy the APK to my Quest each time I want to test, which takes away some of my debugging capabilities.

I thought this might be an issue with having multiple HandGrabUseInteractables, but when I removed the duplicate and gave the object only a DistanceHandGrabInteractable and one HandGrabUseInteractable, it still did not work. I also wondered whether HandGrabUseInteractable supports only HandGrabInteractable and not other grab interactable types, but peeking at the package code and reading the SecondaryInteractionFilter docs suggests that either HandGrabInteractable or DistanceHandGrabInteractable should work, as long as all references are wired correctly.

What am I doing wrong? How can I link my DistanceHandGrabInteractable to a HandGrabUseInteractable? Will I need to write my own DistanceGrabUseInteractable script, perhaps using the existing HandGrabUseInteractable as a base? Thanks for the help!
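Since on-device testing is the bottleneck here, a cheap diagnostic is to log every delegate call and watch it over `adb logcat -s Unity` while grabbing both near and at a distance. This sketch assumes the Interaction SDK's `IHandGrabUseDelegate` interface has the `BeginUse` / `EndUse` / `ComputeUseStrength` members the post names (check the exact signatures in your SDK version):

```csharp
using Oculus.Interaction.HandGrab;
using UnityEngine;

// Diagnostic sketch: a use delegate that logs every call, so you can see
// on-device (via "adb logcat -s Unity") whether the distance-grab path
// ever reaches the HandGrabUseInteractable at all.
public class LoggingUseDelegate : MonoBehaviour, IHandGrabUseDelegate
{
    public void BeginUse()
    {
        Debug.Log("[UseDelegate] BeginUse");
    }

    public void EndUse(float useStrength)
    {
        Debug.Log($"[UseDelegate] EndUse strength={useStrength:F2}");
    }

    public float ComputeUseStrength(float strength)
    {
        Debug.Log($"[UseDelegate] ComputeUseStrength in={strength:F2}");
        return strength;
    }
}
```

If `BeginUse` never fires on distance grabs, the break is upstream in the interactable wiring rather than in your delegate, which narrows down what to inspect.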
Recentering gesture for impaired users

I am building a hand-control-based VR app for users with impaired mobility, and I have two challenges with the pinch-and-hold gesture for recentering: for some users the gesture is exceedingly hard or impossible to perform (false negatives), while for others it is sometimes triggered accidentally (false positives). I understand Meta's desire to keep this gesture universal across all third-party apps; unfortunately, it is not viable for all users. I need a solution to this problem or my app will never ship.

I am prepared to roll my own recentering system that manipulates the in-game view in response to a hardware "easy button" press. However, to implement this I still need to know when an actual pinch-and-hold gesture has been performed, so that I can properly recalibrate my own system. Unfortunately, I have not found any working API or telemetry that signals this. I have tried several OpenXR and Meta Core APIs, but they all seem to be no-ops on the Quest 3.

Can anyone recommend a solution? I'm using Unity 6.3, OpenXR, and the Meta Core SDK. I do not depend on any other Meta SDKs but am willing to add them if they solve this problem.
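For reference, the two recenter notifications I know of in this stack are the Meta Core SDK's `OVRManager.display.RecenteredPose` event and Unity XR's `XRInputSubsystem.trackingOriginUpdated`. Either may well be among the no-ops already tried; this sketch just pins down the exact hooks so others can confirm which ones fire on their setup:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: subscribe to both recenter notifications and log them. On
// Quest 3 with hand tracking either may be a no-op, as the post
// suggests -- treat this as a starting point, not a confirmed fix.
public class RecenterWatcher : MonoBehaviour
{
    private void OnEnable()
    {
        // Meta Core SDK event, raised when the runtime recenters the pose.
        OVRManager.display.RecenteredPose += OnRecentered;

        // Generic Unity XR event on the active input subsystem(s).
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var s in subsystems)
            s.trackingOriginUpdated += OnTrackingOriginUpdated;
    }

    private void OnRecentered() =>
        Debug.Log("[Recenter] OVRDisplay.RecenteredPose fired");

    private void OnTrackingOriginUpdated(XRInputSubsystem s) =>
        Debug.Log("[Recenter] trackingOriginUpdated fired");
}
```

If both stay silent on a system-gesture recenter, falling back to detecting the resulting camera-rig pose jump (comparing head pose between frames) may be the only app-side signal available.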
Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses

Hi everyone! I’m a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I’d love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would love to have it on my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I’ve also submitted this through support channels, but wanted to share it here in case the team is gathering feedback.
Playmode tests in an Azure pipeline crash because of the Meta SDK

Hello,

Adding the Meta SDK makes my pipeline fail because of a crash. If I run the tests by opening Unity on the agent, they run without any issue. If I run the script on my personal PC, it also works.

This is the command line:

"C:/Program Files/Unity/Hub/Editor/2022.3.71f1/Editor/Unity.exe" -batchmode -runTests -testPlatform PlayMode

This is the crash log:
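To get more out of the pipeline run, the same invocation can be extended with Unity's standard `-logFile`, `-testResults`, and `-projectPath` flags so the full editor log and test results land in the build artifacts (the paths and Azure variables below are placeholders for illustration, not taken from the original post):

```shell
rem Sketch: same invocation with logging flags so the crash is captured
rem in the pipeline artifacts. Paths/variables are placeholders.
"C:/Program Files/Unity/Hub/Editor/2022.3.71f1/Editor/Unity.exe" ^
  -batchmode -runTests -testPlatform PlayMode ^
  -projectPath "%BUILD_SOURCESDIRECTORY%" ^
  -logFile "%BUILD_ARTIFACTSTAGINGDIRECTORY%\unity.log" ^
  -testResults "%BUILD_ARTIFACTSTAGINGDIRECTORY%\results.xml"
```

The tail of `unity.log` usually names the crashing native module, which is the key detail for telling a Meta SDK plugin crash apart from a generic batchmode/graphics issue.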
Screen went dark when testing a Unity build

Hi there, I'm testing a build of my Unity project on the Quest 3 (installed through MQDH). After a few minutes, the HMD just goes dark (while the headset is on my head and I am actively testing my app), with what I believe is the "sleep mode" sound effect (the "shooooo" sound, as if you are moving away). If I press the Meta button on my right controller, the screen lights back up and I can resume.

To rule out a proximity sensor problem, I tested Horizon Worlds and stayed in a world for 30 minutes without the screen going dark, so I believe the cause is some specific setting in my Unity project. Is it possible to log what is making my HMD go to sleep (which is what I assume is happening when the screen goes dark) and monitor it through ADB?
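Since Quest runs Android, the standard Android power diagnostics apply. The commands below are stock Android tooling; the exact log tags emitted by Quest firmware are an assumption, so widen or adjust the filter based on what the full logcat shows:

```shell
# Current awake/asleep state of the device (look for mWakefulness):
adb shell dumpsys power | grep -i wakefulness

# Watch power/proximity-related log lines live while reproducing
# the issue (tag names on Quest firmware may differ -- adjust):
adb logcat -v time | grep -iE "power|proximity|sleep"
```

Capturing `dumpsys power` once while the screen is dark and once after waking, then diffing the two, is often enough to show whether the OS considers the device asleep or whether only the app's display output stopped.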
Feature request - Expose Android Surface swapchain frames to GPU (Vulkan) for compute processing

Hello Meta XR / OpenXR Runtime Team,

I’m developing an 8K video player for the Quest 3 using OpenXR and Vulkan. It uses MediaCodec to decode video frames directly into a runtime-owned Android Surface swapchain (XR_KHR_android_surface_swapchain) in a zero-copy pipeline. This works well for playback, but it prevents the application from running any GPU processing on the decoded frames (e.g., sharpening, denoising, chroma upsampling, lightweight super-resolution) before the runtime compositor samples the surface for an XrCompositionLayerEquirect2KHR layer.

Feature request: I’d like to request an official, supported way for applications to access the decoded frame content on the GPU (Vulkan) prior to composition, while retaining the benefits of the Android Surface swapchain decode path. The current copy/blit workaround is not acceptable because it adds significant memory bandwidth overhead and defeats the purpose of the zero-copy pipeline.

Sincerely,
Zurab Kargareteli - Graphics Engineer
Vrex Immersive
zurabkargareteli@vreximmersive.com
OVROverlayCanvas broken across whole project

I decided to try out OVROverlayCanvas with a small test, and sure enough, it turned my fuzzy UI into crisp, perfect-quality UI. So I made the necessary changes across the entirety of my project, and things were looking great in the build. Then, for no apparent reason, every instance of OVROverlayCanvas across multiple scenes broke, going from rendering correctly to rendering incorrectly while running in a build (before/after screenshots were attached to the original post). Between when it was working and when it failed, the scene from which I took those screenshots had no changes at all. Any suggestions? I’m working in Unity 6.3.1f1 with Meta XR SDKs v83.0.1 and OpenXR Plugin v1.16.1.
X-Plane 12 does not natively support hand or finger tracking for the Quest 3 at this time

I would like to be able to see my hands in the X-Plane 12 simulation software; unfortunately, X-Plane 12 does not natively support hand or finger tracking for the Quest 3 at this time. Any update here is welcome.