Accessing controller input in Meta Quest system UI overlays from custom Android-based engine?
We are building a custom engine on top of Android (using the SDK and Gradle, not Unity or Unreal), porting our products to Meta Quest devices (Quest 3 specifically) and targeting the 2D system panels / overlays (the floating windows in the Quest home environment).

What we want: We need to capture *any form of controller input* (buttons, joystick, scroll) while the app's system UI panels are visible or focused.

Key constraints:
- We are not trying to modify system UI, only observe or intercept input.
- Even partial input (e.g. scroll events, pointer movement, or indirect signals) would be sufficient.

What we've tried:
- Standard Android input APIs (onKeyDown, onGenericMotionEvent)
- Checking for MotionEvent sources from controllers
- Polling input devices directly
- Combing through the SDK, without luck

Is there any supported way (SDK, API, or lower-level approach) to access or intercept controller input when system UI overlays are active on Meta Quest? Any guidance on how input routing works between apps and system UI on Quest would be helpful.
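Since MotionEvent source checking came up in the list above, here is the source-mask test pulled out as a plain-Java helper so it can be verified off-device. The constant values mirror `android.view.InputDevice` as published in the public Android API docs (treat them as assumptions copied for illustration, not anything Quest-specific):

```java
// Plain-Java sketch of the controller-source check behind
// onGenericMotionEvent filtering. Constant values mirror
// android.view.InputDevice so the logic can be unit-tested off-device;
// on-device you would use the InputDevice constants directly.
public class ControllerSourceFilter {

    // Copied from android.view.InputDevice (public API values).
    static final int SOURCE_GAMEPAD  = 0x00000401; // buttons
    static final int SOURCE_JOYSTICK = 0x01000010; // axes / thumbsticks
    static final int SOURCE_MOUSE    = 0x00002002; // pointer source, for contrast

    // True if the event's source mask includes gamepad buttons or
    // joystick axes, i.e. something a Touch controller could report.
    public static boolean isController(int sourceMask) {
        return (sourceMask & SOURCE_GAMEPAD) == SOURCE_GAMEPAD
            || (sourceMask & SOURCE_JOYSTICK) == SOURCE_JOYSTICK;
    }
}
```

On-device, a check like this would sit inside `Activity.onGenericMotionEvent`. The catch, as observed above, is that while a system panel has focus the OS may never dispatch the event to the app at all, in which case no amount of source filtering helps.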
----------

UE5.5 VR App on Meta Quest 3 Not Receiving Bluetooth Keyboard Input

Environment:
- Unreal Engine 5.5
- Meta Quest 3
- Using the VR Template (OpenXR)
- The application starts directly in VR

Test Setup:
- Input is handled using either Enhanced Input or Input Key events
- When a specific key is pressed, a corresponding number is displayed using the Text 3D plugin

Issue:
- A Bluetooth keyboard is successfully connected to the Meta Quest 3
- The keyboard works correctly in system UI (e.g., search fields)
- However, no key input is received inside the Unreal Engine APK (in VR mode)

What I Have Tried:

Android Manifest modifications. Added the following via ManifestRequirementsAdditions.txt:

```xml
<uses-feature android:name="android.hardware.keyboard" android:required="true"/>
```

Added the following via ManifestApplicationActivityAdditions.txt:

```xml
android:configChanges="keyboard|keyboardHidden|navigation"
```

Project Settings adjustments:
- Disabled "Enable Improved Virtual Keyboard"

Permissions added:
- android.permission.BLUETOOTH_CONNECT
- android.permission.BLUETOOTH_SCAN
- android.permission.BLUETOOTH_ADMIN

Input settings:
- Set Input Mode: Game and UI
- Widget to Focus: None
- Mouse Lock Mode: Do Not Lock

Current Situation:
- No key events (including basic keys like letters or numbers) are detected in the application
- This applies to both Enhanced Input and traditional Input Key events

Question: Is there any additional setup required for receiving Bluetooth keyboard input on Meta Quest 3 in Unreal Engine (specifically in VR / OpenXR)? Or are there known limitations for keyboard input in VR mode on Quest devices?
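A side note on the "key pressed → number displayed" test rig: one way to isolate whether the problem is UE or the headset's input routing is to check key delivery in a bare Android activity (via `Activity.dispatchKeyEvent`) outside UE entirely. The keycode-to-digit step of that test is trivial to verify off-device; the constants below mirror `android.view.KeyEvent.KEYCODE_0`..`KEYCODE_9` (values 7..16) from the public Android docs and are assumptions for illustration, not UE API:

```java
// Off-device sketch of mapping Android digit keycodes to the number
// shown in the test scene. KEYCODE_0..KEYCODE_9 are 7..16 in
// android.view.KeyEvent; mirrored here so the mapping is testable
// without a headset.
public class DigitKeyMapper {
    static final int KEYCODE_0 = 7;  // android.view.KeyEvent.KEYCODE_0
    static final int KEYCODE_9 = 16; // android.view.KeyEvent.KEYCODE_9

    // Returns 0-9 for a digit keycode, -1 for any other key.
    public static int toDigit(int keyCode) {
        if (keyCode < KEYCODE_0 || keyCode > KEYCODE_9) return -1;
        return keyCode - KEYCODE_0;
    }
}
```

If a bare activity never receives the key events either, the limitation is in input routing on the device rather than anything in the UE project settings.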
----------

(Resolved) Meta Quest mobile app disables Oculus Go. Setup stuck at "Health and Safety".

Update, March 27th: Build 260 is available. https://apps.apple.com/us/app/meta-quest/id1366478176

-----------

Update, March 25th: Meta Store support told me that the latest mobile app, Version 259, was released with a fix for the issues described in this thread. So an official resolution was given. Thank you Meta for the fix, and thanks to everyone in this thread, as your voices showed how significant this issue was. 😄

----------

During setup of my Oculus Go with Meta Quest for iOS Version 257.0.0.9.106, I could not play the "Health and Safety" video, even though I pressed the "Watch Video" button. As a result, I cannot complete the setup of my Oculus Go. How do I proceed with the next steps? I also tried the setup with Meta Quest for Android, but the symptom is the same. Version 258.0.0.9.109, released on March 13th, shows the same symptom. The setup still fails and the headset remains unusable. I can click the "Continue" button, but all three links are inactive. Since the video never starts, I cannot complete the setup. My headset has become a photo viewer showing a single image that prompts me to grab my smartphone.
----------

[Audit] OS 2.1 Nav Bar: AOSP Java Implementation & The Sideloading Paradox

As a Web/Multimedia Developer (B.A. Comm), I am formally auditing the persistent UI limitations in OS 2.1. Meta continues to treat the Global Navigation Bar and System Menus as a static kiosk interface.

The AOSP / Java Reality: The Navigator is a ViewGroup within a Java-based AOSP (Android Open Source Project) framework. There is no architectural excuse for failing to implement transparency (alpha) and color customization. If the community can port Quake to HTML5, Meta's engineering team can certainly implement a basic slider for the UI layer.

The Sideloading Paradox: The claim that UI customization is "too complex" for users is logically inconsistent. A significant portion of this community already uses ADB (Android Debug Bridge) and sideloading to bypass these restrictions. A user with the technical literacy to sideload an APK is more than capable of using a native hex-code color picker or a transparency toggle.

Figure-Ground and Accessibility: By hard-coding elevation and ambient/key shadows without user-definable variables, Meta has created a permanent figure-ground failure. A professional-grade OS requires user-defined transparency to ensure visual accessibility.

Conclusion: We are developers and owners, not tenants. Meta needs to stop gatekeeping basic UI variables. Implement alpha sliders for the Navigation Bar and Menus.
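To make the "basic slider" point concrete: the plumbing being asked for is roughly the sketch below, which packs a user-chosen hex color and an alpha-slider value into the single ARGB int an AOSP View background tint takes. This is plain Java with hypothetical names, a sketch of the scale of the feature rather than anything from Meta's codebase:

```java
// Sketch of the trivial plumbing a nav-bar transparency/color setting
// would need: hex color string + alpha slider value -> packed ARGB int,
// the format a Java View's background color takes. Pure Java, runnable
// anywhere; class and method names are hypothetical.
public class NavBarTint {

    // hex like "#RRGGBB", alpha in [0,1]; returns packed 0xAARRGGBB.
    public static int argb(String hex, float alpha) {
        int rgb = Integer.parseInt(hex.substring(1), 16); // strip '#', parse RGB
        float clamped = Math.max(0f, Math.min(1f, alpha)); // clamp slider value
        int a = Math.round(clamped * 255f);                // scale to 0..255
        return (a << 24) | rgb;                            // pack alpha on top
    }
}
```

A settings UI would feed the slider position straight into `alpha` and hand the result to the nav bar's background, which is the entire feature the post argues is being gatekept.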
----------

Correct Unity XR rig and scene management for Meta Quest applications

Hello everyone. I'm currently working on a game and encountering the issues in the title. I've gotten a bit further along with the project, but I'll cut the problem down to a specific, limited scope. Say we have a Unity project with the XR Core, Interaction Toolkit and Plugin Management packages installed, and three empty scenes, each with just a UI button that takes the player to the next scene in a loop, without breaking XR tracking or causing any other issues, running on a Meta Quest 3 as a side-loaded app with OpenXR as the Plug-in Provider.

1) What are the requirements, assumptions and best practices for the XR Rig in this situation? I've found that if the XR Rig isn't part of the first scene, the video output freezes after the splash screen finishes, even if a rig is supposed to be instantiated by a manager script on scene load.

2) Once this XR Rig is instantiated, must it be marked Don't Destroy On Load and maintained for the rest of the app's lifetime? Can one not just have a "scene copy" of the XR Rig in every scene? If the rig must persist, what is the recommended pattern and approach for managing it across scene transitions?

3) Should one use individual scene loading, or additive loading and unloading of scenes? E.g., going from the main menu to the gameplay.

4) If this XR Rig must be maintained, how should I handle parenting and unparenting it to things such as vehicles, and ensure it is separated and preserved on scene exit?

Finally, yes, I'm aware of the Meta SDK, but for now, and for my own proper understanding of how things should work, I'd like to stick with baseline elements. So please don't tell me to "just use the Camera Rig Building Block"!
----------

Ray-Ban Meta Gen 2 and Oakley Vanguard NOT playing WhatsApp audio messages - Android

Ray-Ban Meta Gen 2 issue with playing WhatsApp audio messages; the same issue occurs with my other pair, the Oakley Vanguard. I have two pairs of Ray-Ban Meta Gen 2, a Headliner and a Wayfarer. A few days ago the Headliner started showing the issue, and today the Wayfarer did too. Each is connected to a different phone with a different setup, but now on both, when asked to play a just-received WhatsApp voice message, they are silent, as if playing a muted message... This never happened before; all the audio commands and the speakers work fine in other applications... We reset both glasses, reconnected, and ran troubleshooting, but nothing changes... Has anyone seen the same issue?