Access to system-level overlay capabilities for real-time subtitles

koichi.tokuda.osaka
Honored Guest
I work with students who are deaf or hard of hearing.
 
As immersive technologies like Meta Quest become increasingly common in educational settings, ensuring accessibility for students with sensory disabilities has become a critical challenge. In our classrooms, we are actively exploring the use of VR as a learning environment, but we have encountered a significant barrier: the inability to display real-time subtitles (closed captions) across VR applications for deaf students.
 
To address this, we propose a system that integrates with AmiVoice, a speech-to-text API widely used in Japan, to generate real-time captions. The challenge lies in rendering these captions as a persistent, head-locked overlay that stays visible regardless of which VR application is currently running.
 
Achieving this would require access to system-level overlay capabilities, or special APIs that are not currently exposed in the public Meta SDK.
 
We respectfully request:
- Guidance on how to pursue official development under Meta's accessibility framework
- Information about partnership programs or research collaboration pathways
- (If possible) Early access to APIs or tools that enable system-wide persistent overlays
 
We believe this project has the potential to greatly enhance educational accessibility not only in Japan but worldwide, especially in special needs and inclusive classrooms. We would welcome the opportunity to collaborate, provide test cases, and participate in pilot programs if available.
 
Thank you very much for your time and consideration.