Forum Discussion

OATH.Studio
Honored Guest
14 days ago

Accessing controller input in Meta Quest system UI overlays from custom Android-based engine?

We are building a custom engine on top of Android (using the Android SDK and Gradle, not Unity or Unreal) to port our products to Meta Quest devices (Quest 3 specifically), targeting the 2D system panels / overlays (the floating windows in the Quest home environment).

What we want:

We need to capture *any form of controller input* (buttons, joystick, scroll) while the app's system UI panels are visible or focused.

Key constraints:

  • We are not trying to modify system UI, only observe or intercept input
  • Even partial input (e.g. scroll events, pointer movement, or indirect signals) would be sufficient

What we’ve tried:

  • Standard Android input APIs (onKeyDown, onGenericMotionEvent)
  • Checking for MotionEvent sources from controllers
  • Polling input devices directly
  • Combed through the SDK without luck
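For context, the standard-API attempt looks roughly like this (a minimal Android sketch; the class name is illustrative, and on Quest these callbacks simply never fire while a system panel owns input focus):

```kotlin
import android.app.Activity
import android.view.InputDevice
import android.view.KeyEvent
import android.view.MotionEvent

class PanelActivity : Activity() {

    // Controller buttons: only delivered while this activity is the
    // active input target.
    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
        if (event.source and InputDevice.SOURCE_GAMEPAD == InputDevice.SOURCE_GAMEPAD) {
            // Handle a controller button press here.
            return true
        }
        return super.onKeyDown(keyCode, event)
    }

    // Joystick / scroll motion events.
    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        if (event.source and InputDevice.SOURCE_JOYSTICK == InputDevice.SOURCE_JOYSTICK) {
            val x = event.getAxisValue(MotionEvent.AXIS_X)
            val y = event.getAxisValue(MotionEvent.AXIS_Y)
            // Use x / y for navigation here.
            return true
        }
        return super.onGenericMotionEvent(event)
    }
}
```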

Is there any supported way (SDK, API, or lower-level approach) to access or intercept controller input when system UI overlays are active on Meta Quest?

Any guidance on how input routing works between apps and system UI on Quest would be helpful.

3 Replies

  • Degly's avatar
    Degly
    Start Partner

    Unless this has changed lately, no, this isn’t supported on Quest.

    When system UI (2D panels / overlays) is focused:

    • Input is owned by the system compositor, not your app
    • Apps do not receive controller events (buttons, joystick, scroll)
    • Standard Android input APIs won’t fire because your activity isn’t the active input target
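About the only signal an app does get is the focus change itself; a minimal sketch (class name illustrative):

```kotlin
import android.app.Activity

class PanelActivity : Activity() {
    // When a system overlay takes over, your window loses focus. This is
    // an indirect signal that input routing changed -- not the input itself.
    override fun onWindowFocusChanged(hasFocus: Boolean) {
        super.onWindowFocusChanged(hasFocus)
        if (!hasFocus) {
            // System UI (or another surface) now owns controller input.
        }
    }
}
```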

    There’s also:

    • No public API (outside of whatever business/partner programs may offer) to observe/intercept system-level input
    • No “partial input” access (pointer/scroll) while overlays are active

    This is by design for security and UX consistency.

    If you need input:

    • Your app must be the active/focused surface (immersive VR or foreground 2D)
    • Consider contacting Meta about an enterprise/partner API
    • OATH.Studio
      Honored Guest

      Thank you for your response; I just want to confirm:

      No public (outside of any business type) API to observe/intercept system-level input

      If the compositor is passing events, am I correct in thinking I can override them to capture input? I only need directional translation (up, down, left, right) and one button input, on a focused window. E.g.:

      https://developers.meta.com/horizon/documentation/android-apps/input/
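      The directional-translation part, at least, reduces to a pure mapping from joystick axis values. A minimal sketch (the deadzone value and names are illustrative, and this only helps if your window actually receives the motion events; the axis convention follows Android's MotionEvent, where AXIS_Y is negative when the stick is pushed up):

```kotlin
// Sketch: map raw joystick axis values to a four-way direction.
enum class Direction { UP, DOWN, LEFT, RIGHT }

// Illustrative threshold to ignore small drift around the stick's center.
const val DEADZONE = 0.25f

fun axisToDirection(x: Float, y: Float): Direction? {
    if (maxOf(kotlin.math.abs(x), kotlin.math.abs(y)) < DEADZONE) return null
    return if (kotlin.math.abs(x) >= kotlin.math.abs(y)) {
        if (x > 0) Direction.RIGHT else Direction.LEFT
    } else {
        // Negative y means the stick is pushed up (Android convention).
        if (y > 0) Direction.DOWN else Direction.UP
    }
}
```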
