OATH.Studio
14 days ago · Honored Guest
Accessing controller input in Meta Quest system UI overlays from custom Android-based engine?
We are building a custom engine on top of Android (using the Android SDK and Gradle, not Unity or Unreal), porting our products to Meta Quest devices (Quest 3 specifically), and targeting the 2D system panels / overlays (the floating windows in the Quest home environment).
What we want:
We need to capture *any form of controller input* (buttons, joystick, scroll) while the app's system UI panels are visible or focused.
Key constraints:
- We are not trying to modify the system UI, only to observe or intercept input
- Even partial input (e.g. scroll events, pointer movement, or indirect signals) would be sufficient
What we’ve tried:
- Standard Android input APIs (onKeyDown, onGenericMotionEvent)
- Checking for MotionEvent sources from controllers
- Polling input devices directly
- Combing through the SDK docs, without luck
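For concreteness, the source-flag check from the second bullet looks roughly like this. The constant values mirror those documented for android.view.InputDevice and are hardcoded here only so the snippet runs outside Android; this is an illustrative sketch of the check, not our exact engine code:

```java
public class SourceCheck {
    // Values mirror android.view.InputDevice constants (copied from the AOSP docs)
    static final int SOURCE_JOYSTICK = 0x01000010;
    static final int SOURCE_GAMEPAD  = 0x00000401;

    // A source matches a mask only when ALL bits of the mask are present,
    // hence the "& mask == mask" pattern rather than a plain nonzero test.
    static boolean isFromGameController(int source) {
        return (source & SOURCE_JOYSTICK) == SOURCE_JOYSTICK
            || (source & SOURCE_GAMEPAD)  == SOURCE_GAMEPAD;
    }

    public static void main(String[] args) {
        System.out.println(isFromGameController(0x01000010)); // joystick axis event
        System.out.println(isFromGameController(0x00001002)); // touchscreen: not a controller
    }
}
```

Inside the app's own Activity this works as expected (we see Quest controller events via onGenericMotionEvent), but as soon as a system UI overlay takes focus, no events reach these callbacks at all.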
Is there any supported way (SDK, API, or lower-level approach) to access or intercept controller input when system UI overlays are active on Meta Quest?
Any guidance on how input routing works between apps and system UI on Quest would be helpful.