Using the latest language model from Meta, create a new focused mode that is voice controlled. This mode could be invoked manually by prompt or via a user-interface shortcut on the main Menu.
VR is an ideal use case for an entirely voice-navigated UI: our hands are holding controllers, and switching between a physical keyboard and motion controllers is impractical for productivity unless the user switches to hand tracking with a virtual keyboard for a natural experience.
How it could work
This mode would automate actions when prompted to perform a task.
Detailed Description
1. User Interface
When invoked by voice prompt or selected manually, the headset would switch to the focused mode, presented as an overlay or modal dialog on top of the standard UI.
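As a rough sketch of the invocation behaviour described above, both entry points (voice prompt and menu shortcut) could route to the same overlay state. All names here (FocusedMode, InvocationSource) are illustrative assumptions, not a real Quest API:

```python
# Illustrative only: both invocation paths end in the same overlay state.
from enum import Enum


class InvocationSource(Enum):
    VOICE_PROMPT = "voice"
    MENU_SHORTCUT = "menu"


class FocusedMode:
    def __init__(self):
        self.active = False
        self.source = None

    def invoke(self, source: InvocationSource) -> str:
        """Show the overlay on top of the standard UI, however it was triggered."""
        self.active = True
        self.source = source
        return f"overlay shown (invoked via {source.value})"

    def dismiss(self) -> None:
        self.active = False


mode = FocusedMode()
mode.invoke(InvocationSource.VOICE_PROMPT)
```

The point of the single `invoke` entry point is that the overlay behaves identically regardless of how the user reached it.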
2. Voice text entry
Voice input would be available for any text field in applications and web browsers, regardless of implementation (e.g. WebView, WebXR, OpenVR, OpenXR, or a game engine).
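One way the "regardless of implementation" requirement could be met is a common text-entry interface that each rendering surface registers an adapter for, so the voice service never needs to know the underlying implementation. This is a sketch under that assumption; the class and method names are made up:

```python
# Hypothetical adapter interface: every surface (WebView, game engine, etc.)
# exposes the same insert_text method to the dictation service.
from typing import Protocol


class TextField(Protocol):
    def insert_text(self, text: str) -> None: ...


class WebViewField:
    """Stand-in for a browser-hosted text input."""
    def __init__(self):
        self.value = ""

    def insert_text(self, text: str) -> None:
        self.value += text


class GameEngineField:
    """Stand-in for an engine-rendered text input."""
    def __init__(self):
        self.buffer = []

    def insert_text(self, text: str) -> None:
        self.buffer.append(text)


def dictate(field: TextField, transcript: str) -> None:
    # The recognizer's transcript goes through the adapter, so dictation
    # behaves the same in every surface.
    field.insert_text(transcript)


web = WebViewField()
dictate(web, "hello world")
```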
3. Language model assisted navigation
The user could search for content by voice prompt, or have actions automated based on the mode's responses.
No further input would be required beyond asking the user to confirm suggested actions before they are performed on the device.
4. Refined Querying
If a query is ambiguous, the mode would ask the user to refine it. For example, when an application is installed locally but something with the same name exists online, it might ask "Should I perform commands on item A or item B?"
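The disambiguation step could look something like this sketch: if a name matches both a locally installed app and an online item, ask rather than guess. The catalogs and the wording of the prompts are illustrative assumptions:

```python
# Hypothetical resolver: returns either a resolved target or a follow-up
# question for the user when the query is ambiguous or unknown.
def resolve(query: str, local_apps: set[str], online_items: set[str]):
    in_local = query in local_apps
    in_online = query in online_items
    if in_local and in_online:
        return ("ask", f"Should I perform commands on the installed app "
                       f"'{query}' or the online item '{query}'?")
    if in_local:
        return ("local", query)
    if in_online:
        return ("online", query)
    return ("ask", f"I couldn't find '{query}'. Can you rephrase?")


resolve("Beat Studio", {"Beat Studio"}, {"Beat Studio"})
```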
5. Automated Scripted Actions
6. Offline basic voice functionality
When there is no Internet connection, the mode could fall back to an offline mode that still handles voice input, voice navigation, and automated setting changes, with the full experience available once connectivity is restored.
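A minimal sketch of that capability split, assuming hypothetical capability names: the offline set is a strict subset of the online set, so restoring connectivity only ever adds features.

```python
# Capability names are illustrative; the full, model-backed set is available
# online, and a reduced on-device set is available offline.
ONLINE_CAPABILITIES = {"voice_input", "voice_navigation", "settings_automation",
                       "llm_search", "scripted_actions"}
OFFLINE_CAPABILITIES = {"voice_input", "voice_navigation", "settings_automation"}


def available_capabilities(is_online: bool) -> set[str]:
    # Offline is a strict subset of online, so nothing the user relied on
    # offline disappears when connectivity returns.
    return ONLINE_CAPABILITIES if is_online else OFFLINE_CAPABILITIES


available_capabilities(False)
```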
7. Fast Updates
This could be implemented as a service or a separate application that does not depend on any particular version of Android, which would make it easier to upgrade Meta Quest to the newest Android versions.
Since the mode would not be tied to the operating system, this component could be updated via the Meta Quest app store, so the update cycle could be iterative and frequent.
8. Developer APIs and Integration
Integration could be facilitated by requiring accessibility metadata on user controls as a condition of app-store acceptance, or by creating a new API that exposes this information. Either approach would allow the feature to automate text entry into text-box controls and to understand the use case and context well enough to take the correct action.
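The metadata an app might be required to attach to its controls, and how the voice service could use it to match a spoken command to a target, might look like this sketch. The field names and matching rule are illustrative only:

```python
# Hypothetical accessibility metadata plus a naive label-in-command matcher.
from dataclasses import dataclass


@dataclass
class ControlMetadata:
    control_id: str
    role: str          # e.g. "text_box", "button"
    label: str         # the human-readable name a user would say
    context: str = ""  # optional hint about the use case


def find_target(command: str, controls: list[ControlMetadata]):
    """Pick the control whose label appears in the spoken command."""
    for c in controls:
        if c.label.lower() in command.lower():
            return c
    return None


controls = [
    ControlMetadata("search_1", "text_box", "Search", "store search field"),
    ControlMetadata("buy_1", "button", "Buy now"),
]
find_target("type 'puzzle games' into Search", controls)
```

A real matcher would need fuzzier matching than a substring test, but the metadata itself is the integration point: without labels and roles, the service has nothing to ground a command in.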