Gaze/Head Tracking Combined with Limited Hand Tracking or a Gamepad Controller (e.g. the Apple Vision Pro UI)
This implementation could drastically increase the ease of use of the Quest 3 UI and bring it much closer to the usability of the Apple Vision Pro UI for a fraction of the cost, while still allowing the flexibility of tracked controllers, conventional hand tracking, and the use of gamepad controllers or other Bluetooth devices such as keyboards or mice. This post is directed at other developers who may have solutions to suggest, or who may be interested in incorporating some of these ideas into their projects.

Related post: https://communityforums.atmeta.com/t5/Get-Help/How-to-reposition-or-resize-windows-with-a-gamepad-controller/m-p/1190574#M300771 (How to reposition or resize windows with a gamepad controller? (Xbox One Controller))

"I love the gaze cursor feature when using a controller. Lets me quickly take actions when I don’t want to use the Quest controllers. One thing bugs me though. I’m unable to drag using the gaze cursor. If I hold the A button on an Xbox controller it will rapidly select an item instead of select and hold on to it. Are there any tricks around this?"

The quote above is copied directly from a reddit post by Independent_Fill_570 a month ago, and it hasn't received any responses yet: https://www.reddit.com/r/MetaQuestVR/comments/1byjora/is_it_possible_to_resize_windows_with_a_contro...

I'm having the same issue. I love the ability to use gaze tracking for the cursor, but it restricts the ability to resize windows, reposition windows, and long-click (it only repeatedly single-clicks rapidly instead), so selecting text longer than a single word is also an issue. Gaze control seems to be the best substitute for the Apple Vision Pro's eye-tracking cursor. Is there any way to use gaze control to guide the cursor, but a hand-tracking gesture to select or click, without it engaging as conventional hand tracking while gaze tracking is enabled?
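For developers building their own Quest apps (this doesn't change the system UI), the hold-versus-repeat problem described above can be worked around at the app level by deriving press and release edges from the held button state. A minimal Unity C# sketch, assuming the Oculus Integration's OVRInput API; the press/release handlers are hypothetical stubs you would wire into your own pointer logic:

```csharp
using UnityEngine;

// Hedged sketch for an app you control (not the system UI): map the
// gamepad/controller A button to press-hold-release semantics instead of
// letting a held button fire repeated discrete clicks.
public class GazeClickHold : MonoBehaviour
{
    bool wasHeld;

    void Update()
    {
        // OVRInput.Get returns true on every frame the button is held,
        // so we derive the discrete down/up edges ourselves.
        bool isHeld = OVRInput.Get(OVRInput.Button.One); // A button
        if (isHeld && !wasHeld)
            OnPressStarted();   // begin a drag / pointer-down at the gaze hit
        else if (!isHeld && wasHeld)
            OnPressReleased();  // end the drag / pointer-up
        wasHeld = isHeld;
    }

    // Hypothetical stubs: forward these to your UI event pipeline.
    void OnPressStarted()  { /* e.g. raise a PointerEventData down event */ }
    void OnPressReleased() { /* e.g. raise the matching up event */ }
}
```

Tracking the edges rather than the level is what lets a hold become a single sustained drag instead of a stream of clicks.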
I've spent many hours now searching the internet, looking into potential sideloading options, and even poring over the Oculus PC source code, but I haven't really found anyone talking about this except for the one unanswered reddit post I've linked to. The Xbox One controller has basically the same buttons available as the Quest 3 controllers do, minus the tracking; with gaze tracking it would be wonderful to have the controller buttons mapped properly, but I can't seem to find a way to remap the gamepad keybinds without running everything through Steam and a PC link. I'd like to be able to do this natively on the standalone Quest 3. Hand tracking used only for selecting or clicking would also be great, but even just having the buttons mapped properly, so that pressing A clicks and holding A holds the click until released, would fix the issue. I am aware that pressing the Select and Menu buttons together toggles gaze tracking off and enables very limited use of the Xbox controller buttons, such as with a keyboard, but that's not what I'm asking about here. Thanks in advance to anyone who has any helpful information to provide on this issue. If gaze tracking were combined with limited hand-tracking gestures, like making a closed fist, I feel like the quality of this product could more easily rival the user interface of the Apple Vision Pro.

Is there a Meta/Quest UI Kit?
Is there a UI kit available for the Quest system UI? This would be great for prototyping and designing UI to be more consistent with the system UI. Is there anything like this available to developers? I haven't found anything in the dev resources section. Thanks.

Oculus Quest, UI and Unity: No hover event, even on the example scene
Hi! I'm trying to understand how the whole UI interaction works with the Oculus Quest, using the sample scenes from the Oculus Integration package. Everything works fine except one thing: when playing the UI demo scene on the Oculus Quest, it seems that I'm never triggering the "hover" event of the UI buttons with the laser (the highlight color is never applied). It does work fine in the editor using the CV1. I can easily reproduce the issue in any scene. The weird part is that the click event works normally. I've tried messing around with a canvas and getting the "IPointerEnterHandler" event to trigger, but again the same behavior: it works fine in the editor, but I never get the event on the Quest. I've never had this issue on the Oculus Go. There's no error in the logs from the Oculus Monitor. I'm using the latest Oculus Integration and the recommended latest Unity version, 2019.1.2f1. Is this a known issue? Thanks!

Assets/Oculus/VR/Scenes/UI.unity input field not working
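Returning to the hover question above: one way to narrow down where the event chain breaks on device is to log pointer enter/exit directly on a UI element. A minimal probe component, assuming the scene's EventSystem uses OVRInputModule and the canvas has an OVRRaycaster; if these logs never appear on device, the raycaster or event-camera wiring is the usual culprit rather than the button's highlight settings:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Diagnostic sketch: attach to any UI element to see whether the
// event system is delivering hover events at all on the target device.
public class HoverProbe : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    public void OnPointerEnter(PointerEventData e) => Debug.Log($"Pointer entered: {name}");
    public void OnPointerExit(PointerEventData e)  => Debug.Log($"Pointer exited: {name}");
}
```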
Hi, Assets/Oculus/VR/Scenes/UI.unity is a scene from the Integration package. Unfortunately the input field does not work: the keyboard does not pop up as it should. Does anyone know how to fix this? I am trying to have a UI input field that the Quest user can select and fill out with a name. I could not locate any other example in the Integration package. Thanks in advance, Dimi

OVR UI Helpers - Raycast LaserPointer and cursor not displayed in Quest build from Unity 2019.3.0f1
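For the input-field question above, a common workaround when a sample scene doesn't summon the keyboard is to open the system keyboard manually when the field is selected. A hedged sketch using Unity's TouchScreenKeyboard API; the component and its selection hook are illustrative, not part of the Oculus sample:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical workaround sketch: open the on-device keyboard explicitly
// and mirror its text back into the Unity InputField each frame.
public class InputFieldKeyboard : MonoBehaviour
{
    [SerializeField] InputField field;   // assign in the Inspector
    TouchScreenKeyboard keyboard;

    // Wire this to the field's selection event (e.g. via an EventTrigger).
    public void OnFieldSelected()
    {
        keyboard = TouchScreenKeyboard.Open(field.text, TouchScreenKeyboardType.Default);
    }

    void Update()
    {
        if (keyboard != null)
            field.text = keyboard.text;  // keep the field in sync while typing
    }
}
```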
Hi all, really hoping someone might have experienced this same problem and found a fix, as it's really frustrating to not understand WHY this is happening. I'm using Oculus Link and Unity to iteratively develop and test (using the play button in Unity, which automatically plays on the USB-connected Quest). I'm following the guidance for using UI helpers on a canvas object and have had no problems interacting with all aspects of my UI using the Touch controllers and the included raycast and LineRenderer "Laser Pointer" script. The problem I'm having is when I come to do a build. Everything builds fine, with no errors, but when I play it on the Quest, the laser pointer/line renderer not only doesn't show up, it doesn't give me any interaction with the UI when running on the Quest. Even stranger is the fact that when I plug my Quest back into the PC and start developing again in Unity, the problem then appears there. None of the UI interaction works and the line renderer doesn't show up!? The only way to get the functionality back is to exit Unity and reload the scene. Everything then goes back to normal... until I do a build again!? This may well be a Unity issue, and I'll be asking kindly on their forums too, but I would appreciate any advice, or hearing from anyone else who has experienced these kinds of UI interaction problems with a Quest build. Thanks in advance, Andi

Unity EventSystem IsPointerOverGameObject doesn't work with OVR Input Module?
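On the laser pointer problem above: while the root cause sounds like stale editor/build state, a self-contained pointer that recomputes its beam every frame is a useful baseline to test against, since it carries no state that can go stale between sessions. A sketch, with rayOrigin standing in for whichever hand anchor you use:

```csharp
using UnityEngine;

// Minimal stand-in laser pointer, assuming a LineRenderer on the same object.
// Everything is recomputed per frame, so nothing persists between runs.
[RequireComponent(typeof(LineRenderer))]
public class SimpleLaserPointer : MonoBehaviour
{
    [SerializeField] Transform rayOrigin;    // e.g. the right-hand anchor
    [SerializeField] float maxLength = 10f;
    LineRenderer line;

    void Awake() => line = GetComponent<LineRenderer>();

    void Update()
    {
        Vector3 end = rayOrigin.position + rayOrigin.forward * maxLength;
        if (Physics.Raycast(rayOrigin.position, rayOrigin.forward, out var hit, maxLength))
            end = hit.point;                 // stop the beam at the first collider
        line.SetPosition(0, rayOrigin.position);
        line.SetPosition(1, end);
    }
}
```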
Hello, in Unity you can use the EventSystem's IsPointerOverGameObject method to check if the input is over a (UI) GameObject, but this method doesn't seem to work with the OVR InputModule and a VR controller. I confirmed that the same check works with the default Input Module and mouse input. I also confirmed that VR input with the OVR InputModule works correctly (I can interact with the UI itself with the Quest controller, etc.), but no matter what I do, this method always returns false while using VR controllers, even when the ray is clearly over a UI GameObject. Is that a bug caused by the OVR Input Module implementation, or is there a better way to check if an input is over a UI object that works with both mouse and VR controllers? I'm using Unity 2020.3.24f1 and Oculus package version 35.0.

How to use graphics raycaster with physics raycaster?
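For the question above: IsPointerOverGameObject keys off screen-space pointer ids, which a world-space VR ray doesn't supply the way a mouse does, so a common workaround is to query the event system's registered raycasters directly. A hedged sketch, assuming you can derive a screen-space position for the controller ray (e.g. by projecting its hit point through the event camera); note that OVRRaycaster variants may interpret this position differently from a screen-space GraphicRaycaster:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Workaround sketch: manually raycast the event system and check whether
// anything (typically a UI graphic) lies under the given pointer position.
public static class VRPointerUtil
{
    public static bool IsOverUI(Vector2 pointerScreenPos)
    {
        var data = new PointerEventData(EventSystem.current) { position = pointerScreenPos };
        var hits = new List<RaycastResult>();
        EventSystem.current.RaycastAll(data, hits);  // queries every registered raycaster
        return hits.Count > 0;
    }
}
```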
I'd like to be able to interact with UI elements and 3D objects together in the same scene. I have the OVRRaycaster on my canvas and the OVRPhysicsRaycaster on the OVRCameraRig. Both work correctly by themselves, but the physics raycaster will still hit an object even if there is a canvas in front of it. I've played around with the blocking objects, blocking masks, and other settings. I've tried EventSystem.current.IsPointerOverGameObject(), but it didn't seem to do anything. And I've tried to tweak the OVRPhysicsRaycaster script, to no avail. From what I've read, the physics raycaster doesn't detect any graphic elements, so potentially graphic raycasts should be done first, and only if nothing is hit should the physics raycaster be called. Does the Oculus SDK have a solution for this, or am I better off writing my own system?

[Unity] Reproducing controller system from main Oculus menu
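The "graphic raycast first, physics second" idea from the post above can be sketched as a two-pass query that makes the precedence explicit. A hedged sketch; the class and its wiring are illustrative, and the screen-space projection is an assumption that may need adapting for a world-space OVRRaycaster:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: UI gets first refusal on the ray; only when no graphic is hit
// do we fall through to a physics raycast against 3D colliders.
public class UiThenPhysicsRaycast : MonoBehaviour
{
    [SerializeField] Camera rayCamera;   // the event camera your canvases use

    public bool TryHitWorld(Ray worldRay, out RaycastHit physicsHit)
    {
        physicsHit = default;

        // Pass 1: UI, via the event system's registered raycasters.
        // Project a point along the ray into screen space (an approximation).
        var data = new PointerEventData(EventSystem.current)
        {
            position = rayCamera.WorldToScreenPoint(worldRay.origin + worldRay.direction)
        };
        var uiHits = new List<RaycastResult>();
        EventSystem.current.RaycastAll(data, uiHits);
        if (uiHits.Count > 0)
            return false;                // a canvas is in the way; let the UI handle it

        // Pass 2: physics, only when no graphic blocked the ray.
        return Physics.Raycast(worldRay, out physicsHit);
    }
}
```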
Hi, in the Oculus Quest's menu, any button press switches the main active controller (left or right). I would like to implement the same system in my Unity application. I've tried using OVRInput.GetConnectedControllers() and OVRInput.GetActiveController(), but both returned the "Touch" value. Nevertheless, since I have set each controller prefab's OVRTrackedRemote script to LTouch and RTouch, I would expect those methods to return at least one of those values. I've also tried OVRPlugin.GetDominantHand(), and the result was always the right hand. Has anyone here managed to reproduce in Unity how the controllers work in the Oculus menu? Thanks in advance.

Oculus Quest 3 Dot Loading in Unity App
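One way to approximate the menu behaviour described above is to track which hand most recently pressed any button yourself, rather than relying on GetActiveController (which can report the combined "Touch" value when both controllers are connected). A sketch using OVRInput from the Oculus Integration:

```csharp
using UnityEngine;

// Sketch: whichever hand last pressed a button becomes the active pointer,
// mirroring how the system menu switches its laser between hands.
public class DominantControllerTracker : MonoBehaviour
{
    public OVRInput.Controller Active { get; private set; } = OVRInput.Controller.RTouch;

    void Update()
    {
        // GetDown fires once on the press edge, per controller mask.
        if (OVRInput.GetDown(OVRInput.Button.Any, OVRInput.Controller.LTouch))
            Active = OVRInput.Controller.LTouch;
        if (OVRInput.GetDown(OVRInput.Button.Any, OVRInput.Controller.RTouch))
            Active = OVRInput.Controller.RTouch;
    }
}
```

Your pointer/laser script can then read Active each frame to decide which hand anchor to cast from.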
Unity 2019.1.3f1 and the most recent Oculus SDK. I built out a quick UI where the user is required to interact with a keypad. They need to click on the buttons to enter the code and press enter. Except every time a button is clicked, three dots appear and the headset has to load. Not sure why, as this has worked on every other headset with no issues. Would love some feedback!