Getting battery percentage for headset/controllers in Unity over Meta Link (Meta XR SDK)
Hi, I’m working with Unity and the Meta XR SDK, using a Meta Quest 3 via Meta Link (PC VR / OpenXR runtime). I would like to retrieve the battery percentage of the headset and controllers while running in PC VR mode (Link/Air Link).

What I’ve tried:
- SystemInfo.batteryLevel returns the PC/laptop battery, not the headset's
- OVRPlugin.batteryLevel is marked as obsolete and returns 0 when running via Link
- OVRManager.batteryLevel shows the same behavior as OVRPlugin

However, the Meta Quest Link desktop and mobile apps are able to display headset and controller battery information, so the data clearly exists in the system. Is there any supported way in the Meta XR SDK / OpenXR / Oculus PC SDK to access:
- Headset battery percentage
- Controller battery level
…while running in PC VR mode (Link/Air Link)? Or is this data intentionally not exposed to third-party applications on the SDK side? Any clarification from Meta or experienced XR developers would be appreciated. Thanks!
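For anyone reproducing this, the calls above can be exercised with a minimal probe script. This is a sketch, not an answer: it just logs the values described in the post, plus OVRInput.GetControllerBatteryPercentRemaining, which is worth checking for the controllers but may also be unavailable over Link.

```csharp
using UnityEngine;

// Probe for the battery values discussed above. Over Meta Link,
// SystemInfo.batteryLevel reflects the host PC's battery, and the
// OVR-side values have been reported to return 0.
public class BatteryProbe : MonoBehaviour
{
    void Update()
    {
        // Unity's cross-platform value: -1 if unsupported, otherwise 0..1.
        float pcBattery = SystemInfo.batteryLevel;

        // Meta XR SDK headset value (marked obsolete; reportedly 0 over Link).
        float hmdBattery = OVRManager.batteryLevel;

        // Per-controller percentage (0..100); untested over Link, so it may
        // likewise report 0 there.
        byte leftPct = OVRInput.GetControllerBatteryPercentRemaining(OVRInput.Controller.LTouch);
        byte rightPct = OVRInput.GetControllerBatteryPercentRemaining(OVRInput.Controller.RTouch);

        Debug.Log($"PC: {pcBattery:P0}  HMD: {hmdBattery:P0}  L: {leftPct}%  R: {rightPct}%");
    }
}
```

Logging all four values on-device versus over Link makes it easy to document exactly which ones go dead in PC VR mode when reporting this to Meta.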
Accessing controller input in Meta Quest system UI overlays from custom Android-based engine?
We are building a custom engine on top of Android (using the SDK and Gradle, not Unity or Unreal), porting our products and targeting Meta Quest devices (Quest 3 specifically) for the 2D system panels / overlays (the floating windows in the Quest home environment).

What we want: to capture any form of controller input (buttons, joystick, scroll) while the app's system UI panels are visible or focused.

Key constraints:
- We are not trying to modify the system UI, only observe or intercept input
- Even partial input (e.g. scroll events, pointer movement, or indirect signals) would be sufficient

What we've tried:
- Standard Android input APIs (onKeyDown, onGenericMotionEvent)
- Checking for MotionEvent sources from controllers
- Polling input devices directly
- Combing through the SDK without luck

Is there any supported way (SDK, API, or lower-level approach) to access or intercept controller input when system UI overlays are active on Meta Quest? Any guidance on how input routing works between apps and system UI on Quest would be helpful.
MetaXRInteraction missing Ray visuals for ISDKControllers, but ISDKHands working as intended.
I recently upgraded to version 1.81.0 of MetaXRInteraction on the Oculus Unreal Engine fork, version 5.6.1, and the controller components don't have ray visuals when interacting with menus. The functionality works (I can click, and elements change when hovered), but it's hard to aim. What's weird is that the ISDKHand components do have proper ray visuals (except for the right reticle not turning blue when pinching). I am not sure what is broken or how to get this working. All parameters are defaults, and I only turn the Ray and Grab components on/off (for both ISDK Hands and ISDK Controllers) when the user selects a hand preference in settings, using the Set Active function. Video demonstrating the issue: https://drive.google.com/file/d/1GZB-dCXpaxryZP92I9jXAnZO34Lg9WHP/view?usp=sharing Thanks!
How to disable controller's auto-sleep?
Hello, I'm working on a PCVR project that continually reads coordinates from Quest Pro controllers (via their integrated cameras), and everything works fine on my side. My issue is that a controller automatically turns off (auto-sleep) after a few minutes if no movement is detected, so reading that controller's coordinates breaks. How can I disable the controllers' auto-sleep? Thank you.
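One possible workaround, pending an official setting, is to periodically send a very short, low-amplitude haptic pulse to each controller. This Unity C# sketch assumes that haptic activity resets the idle timer, which is an unverified assumption and may not hold on every OS version:

```csharp
using UnityEngine;

// Workaround sketch: ping each controller with a brief, barely perceptible
// haptic pulse at a fixed interval so it is not considered idle.
// ASSUMPTION: haptic output counts as activity for the auto-sleep timer.
public class ControllerKeepAlive : MonoBehaviour
{
    const float IntervalSeconds = 60f; // well under the observed sleep timeout
    float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < IntervalSeconds) return;
        timer = 0f;

        StartCoroutine(Pulse(OVRInput.Controller.LTouch));
        StartCoroutine(Pulse(OVRInput.Controller.RTouch));
    }

    System.Collections.IEnumerator Pulse(OVRInput.Controller controller)
    {
        // Low frequency and amplitude so the user barely feels it.
        OVRInput.SetControllerVibration(0.1f, 0.05f, controller);
        yield return new WaitForSeconds(0.05f);
        OVRInput.SetControllerVibration(0f, 0f, controller); // stop the pulse
    }
}
```

If haptics turn out not to reset the timer, the same structure could drive any other output the runtime treats as controller activity.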
DistanceGrabUseInteractable?
Hi, this is definitely a pretty basic question about the Meta Interaction SDK for Unity. I've managed to get a HandGrabUseInteractable linked to a HandGrabInteractable via a SecondaryInteractionFilter. However, I also have a DistanceHandGrabInteractable on that object, linked to its own HandGrabUseInteractable that points to the same delegate as the first.

When I grab the object without distance grab, my script's BeginUse, EndUse, and ComputeUseStrength are called properly. When I grab at a distance, they are not, as far as I can tell. I am working on a Mac and the simulator was not working with this scenario at all, so I have to deploy the APK to my Quest each time I want to test, which takes away some of my debugging capability.

I thought perhaps this was an issue with having multiple HandGrabUseInteractables, but when I removed the duplicate and gave the object only a DistanceHandGrabInteractable and one HandGrabUseInteractable, it still did not work. I also wondered whether HandGrabUseInteractable only supports HandGrabInteractable and not the other grab interactables, but peeking at the package code and reading the SecondaryInteractionFilter docs suggested either HandGrabInteractable or DistanceHandGrabInteractable should work, so long as all references are wired correctly.

What am I doing wrong? How can I link my DistanceHandGrabInteractable to a HandGrabUseInteractable? Will I need to write my own DistanceGrabUseInteractable script, perhaps using the existing HandGrabUseInteractable as a base? Thanks for the help.
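Since on-device testing limits debugging here, a logging delegate can at least show (via adb logcat) whether the distance-grab path ever reaches the use callbacks. This is a diagnostic sketch only; the IHandGrabUseDelegate interface and namespace names are assumed from the Interaction SDK version described and should be checked against the package code:

```csharp
using UnityEngine;
using Oculus.Interaction.HandGrab;

// Diagnostic delegate: logs every use callback, so filtering adb logcat for
// "[UseDelegate]" shows whether distance grab ever triggers BeginUse.
// ASSUMPTION: the delegate interface is IHandGrabUseDelegate with these
// three members, as named in the post above.
public class LoggingUseDelegate : MonoBehaviour, IHandGrabUseDelegate
{
    public void BeginUse() => Debug.Log("[UseDelegate] BeginUse");

    public void EndUse() => Debug.Log("[UseDelegate] EndUse");

    public float ComputeUseStrength(float strength)
    {
        Debug.Log($"[UseDelegate] ComputeUseStrength({strength:F2})");
        return strength; // pass the value through unchanged
    }
}
```

If BeginUse never fires during a distance grab but does for a normal grab, that would point to the interactable wiring rather than the delegate itself.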
Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses
Hi everyone! I’m a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I’d love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love to have it added to my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I’ve also submitted this through support channels, but wanted to share here in case the team is gathering feedback.
Meta XR Simulator Standalone Help
I'm an educator teaching Unity & XR development using Quest 3 and Meta Building Blocks, but I have been really struggling because of the differences in learning materials online (from Unity's end, and from content creators online from even just 3 months ago, let alone 1 year). The most pressing issue in my class is the lack of updated documentation and examples using the new standalone version of the Meta XR Simulator. Half the documents in the official Meta XR Simulator Overview documentation are from 2024 and use the old interface (which had WAY more features and customization options). I have a bunch of students relying on mouse and keyboard controls trying to test behaviors like the locomotion building block, but they don't work.

Current issues I would love suggestions or hints on how to solve (from just importing Building Blocks into a Unity Core 3D scene, nothing customized yet):
- Duplicate controller models and ghosting (only in the simulator, visible when moving)
- Occasional graphical glitches that look like snow or fuzz (only in the Unity Game view and simulator while the simulator is running)
- Rays and aiming reticles never come from the controllers, no matter where they or my mouse are pointing, even with point-and-click on (they work in the headset)
- Do the movement inputs (default WASD and arrow keys) simulate the left and right joysticks, or do they override/bypass those inputs? Some teleport control options involve aiming and pressing up on the joystick, and I'm not sure how to test that in the simulator
- Is there some way to add simulation input options that actually trigger the controller's inputs, like the Unity package version used to?

I would also appreciate any general advice or resources on new/recent best practices, customization options, and debugging tips using the Building Blocks and Interaction SDK.
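The WASD-versus-joystick question above can be answered empirically. This sketch logs the thumbstick axes each frame while the simulator is running: if pressing WASD/arrow keys produces nonzero values, the keys drive the virtual thumbsticks; if the values stay at zero, the keys are moving the rig without going through controller input at all.

```csharp
using UnityEngine;

// Diagnostic: logs thumbstick axes so students can see whether simulator
// keyboard movement actually feeds the virtual controller inputs.
public class ThumbstickProbe : MonoBehaviour
{
    void Update()
    {
        Vector2 left = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector2 right = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);

        // Only log meaningful deflection to keep the console readable.
        if (left.sqrMagnitude > 0.01f || right.sqrMagnitude > 0.01f)
            Debug.Log($"L: {left}  R: {right}");
    }
}
```

The same probe also helps test "press up on the joystick to teleport" behaviors, since it shows exactly what axis values the teleport block would receive.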
Joystick Deadzone Settings Reset After Controller Reconnection
Device: Meta Quest 3
OS Version: v83 (latest at time of testing)

Issue Description: The system-level joystick deadzone settings do not persist after the controller is reconnected. Any form of controller reconnection causes the deadzone to stop working, including controller disconnect and reconnect, restarting the headset, and powering the headset off and on. After reconnection, the deadzone setting still appears enabled in the system menu, but it no longer has any effect on joystick input. To restore correct behavior, the deadzone must be manually adjusted or re-saved every time, which is inconvenient and negatively impacts usability.

Steps to Reproduce:
1. Set a custom joystick deadzone in the system settings.
2. Disconnect the controller or restart the headset.
3. Reconnect the controller.
4. Observe that the deadzone no longer takes effect.

Expected Behavior: Joystick deadzone settings should persist and remain functional after any controller reconnection.

Actual Behavior: The deadzone becomes ineffective until it is manually reset.

Impact: This issue affects basic input accuracy and requires frequent manual reconfiguration, significantly degrading the user experience. This issue was reported previously over a year ago, and multiple users have encountered the same behavior. However, it still appears to be unresolved in current system versions, so I am reporting it again for visibility and tracking.
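While this remains unfixed at the system level, developers whose apps suffer from stick drift can apply a per-app radial deadzone in software. A minimal Unity C# sketch (the 0.15 threshold is an arbitrary example value, not a recommended setting):

```csharp
using UnityEngine;

public static class Deadzone
{
    // Radial deadzone: magnitudes below 'threshold' map to zero, and the
    // remaining range is rescaled so full deflection still reaches 1.
    public static Vector2 ApplyRadial(Vector2 stick, float threshold = 0.15f)
    {
        float magnitude = stick.magnitude;
        if (magnitude < threshold) return Vector2.zero;

        float rescaled = (magnitude - threshold) / (1f - threshold);
        return stick.normalized * Mathf.Min(rescaled, 1f);
    }
}
```

Rescaling (rather than simply zeroing small values) avoids the visible "jump" in input when the stick crosses the threshold, but it is a per-app workaround and does not restore the broken system setting.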