Noesis UI is now available in the Desktop Editor
We are excited to announce that Noesis, a 2D UI solution, is now integrated into the Worlds Desktop Editor so that creators can build more compelling UI and high-performance 2D UI panels without needing to rely on coding.

Some of the key features of Noesis include:

- Richer feature set: Create a Custom UI experience for your world without needing complex workarounds.
- High-performance rendering: Immediate responses and fluid animation mean your Custom UI will keep up with the rendering of every frame.
- Easier/faster UI design and iteration: Noesis Studio provides creators with a WYSIWYG visual editor, instead of having to rely on code alone for designing UIs.

In addition, the XAML you build today in the Desktop Editor can be ported to the future Meta Horizon Studio with just some asset path and UI script adjustments, so your work can be transferred and duplicated easily.

How it works: Creators will build their 2D UI panels in Noesis Studio, import the Noesis project into the Desktop Editor, and then associate the right Root XAML file to the new NoesisUI component.

Check the documentation here for information on creating a Noesis UI component, configuring your NoesisUI panel, setting up animations for your panels, performance considerations, and fonts.

Want to see it in action? You can see the first live Noesis UI within Profit or Perish: check the Profit or Perish Speech Dialogs to see it today, with more to come soon.

Question about Meta UI and returning to app
Hello, I am reaching out on behalf of one of our clients, who has reported the following:

While running our application, their users may occasionally press the Meta button, which invokes the Meta UI and temporarily prevents them from interacting with our application until the Meta UI is closed. This part makes perfect sense. However, the issue is that they have reported difficulty closing the Meta UI and returning to our application. Normally, users can either press the Meta button again (or press the 'Resume' button while the Meta UI is overlaid) to return to the application. In our client's case, they state that they have to press the Meta button / press the 'Resume' button multiple times before the Meta UI finally responds to the action and closes, returning to the application.

In their troubleshooting, they could not find any consistent pattern that would indicate a condition that triggers this. For instance:

- They are using Quest 3 headsets
- Some of them are on Shared Mode or Kiosk Mode, while others are not
- Some of them are on an MDM solution while others are not
- The controllers do not have any noticeable debris on them and are regularly cleaned with approved electronic sanitization products (i.e. low-moisture wipes, then set to dry in a UV cart, not using any sprays)
- The controller batteries are also regularly charged before any session of our application is run
- Tracking frequency on controllers is set to 'auto'

*Especially regarding the 2nd and 3rd points above about Shared/Kiosk Mode and MDM solutions: it does not matter whether these headsets have these enabled. The issue occurs on headsets that have these enabled as well as those on which these are not applicable.

Can you kindly suggest additional troubleshooting steps that our client can try, as this issue is significantly impacting their use of the application?
Thank you, and please let me know if there are additional details I can share.

Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick
Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table (a Canvas in world space, with its RectTransform placed at the table center plus a slight lift to sit on the surface, all via script).

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" ~0.5 m up, and pushing again makes it jump back down. This happened both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547), while the dialog's world Y stayed constant.

How to limit the plane surface to the bounds of the UI actually on screen?
I am using the Oculus Integration SDK (yes, currently the legacy version, due to needing to make changes to the Assembly Definitions), and I am making a flat canvas. I have placed a PlaneSurface script on the object holding the Canvas component. I have a sibling object called "Surface" that I put the `ClippedPlaneSurface` and `BoundsClipper` components on. I dragged the Canvas object with `PlaneSurface` into the Clipped Plane Surface component's 'Plane Surface' field. Interaction works just fine ... however ... the issue is that I now have an infinite plane surface for ray interaction with world-space UI, even though the flat panel is just a rectangle right in front of the player. This makes it so I am able to raycast against an invisible plane even when there is no UI there. Can anyone help me make the BoundsClipper component work, or somehow limit the plane surface to the bounds of the UI actually on screen?

How to manage ray interactions on multiple Unity Canvases on top of each other?
Hi, how do I manage disabling and enabling multiple world-space canvases so that ray interactions work properly when the canvases are placed in front of each other? The UI canvases are default Unity world-space canvases with ray interactions added via the quick actions described in this article: Add an Interaction with QuickActions | Oculus Developers. The problem is that I have two world-space canvases with ray interactions placed in front of each other, kind of like layers. When I disable the canvas GameObject in front, the interactor cursor hits the disabled (invisible) canvas and does not properly pass through to hit the second canvas (which it should do). How can I ensure the cursor interacts with the first active/visible canvas and does not get captured by disabled ones?

Meta Interaction SDK and Mouseclicks
I am using the Meta XR Interaction Toolkit and its components. My problem: I am in Play Mode on my PC and have an Oculus headset hooked up via Link. I have 2 displays:

Display 1 (VR): the player is in a room with an interactable panel. (Grab, Point, Select work.)
Display 2 (PC): I have a UI button. When it's clicked, something should happen.

As far as I know, the Meta SDK needs the PointableCanvasModule for the events to work. But then my mouse click on Display 2 doesn't register. When I have the Standalone Input Module active, my mouse clicks work, but I can't interact with the canvas anymore.

Video that shows the problem: https://streamable.com/uin4ms

How can I use my mouse clicks and the VR hands at the same time? Big_Flex, I read that you work on the Interaction SDK team, so I tagged you. Thanks for the help.

Unity VR UI canvas popup
How can I open a UI canvas panel when an object is clicked with a VR controller in Unity VR? I want to implement this functionality where clicking on an object in a Unity VR environment triggers the opening of a UI panel. How can I achieve this? If you have any insights or suggestions, please share them. Thank you!

Gaze/Head Tracking Combined w/ Limited Hand Tracking or Gamepad Controller (Ex. Apple Vision Pro UI)
This implementation could drastically increase the ease of use of the Quest 3 UI, and bring it much closer to the usability of the Apple Vision Pro UI for a fraction of the cost, while still allowing the flexibility of trackable controllers, conventional hand tracking, and the use of gamepad controllers or other Bluetooth devices such as keyboards or mice. This post is directed to other developers who may have ideas to suggest as a solution, or may be interested in incorporating some of these ideas into their projects.

Related post: https://communityforums.atmeta.com/t5/Get-Help/How-to-reposition-or-resize-windows-with-a-gamepad-controller/m-p/1190574#M300771 (How to reposition or resize windows with a gamepad controller? (XBox One Controller))

"I love the gaze cursor feature when using a controller. Lets me quickly take actions when I don't want to use the Quest controllers. One thing bugs me though. I'm unable to drag using the gaze cursor. If I hold the A button on an Xbox controller it will rapidly select an item instead of select and hold on to it. Are there any tricks around this?"

This is directly copied from a reddit post by Independent_Fill_570 a month ago, and it hasn't received any responses yet. https://www.reddit.com/r/MetaQuestVR/comments/1byjora/is_it_possible_to_resize_windows_with_a_contro...

I'm having the same issue. I love the ability to use gaze tracking for the cursor, but it restricts the ability to resize windows, reposition windows, and long-click (it only repeatedly single-clicks rapidly instead), so selecting text longer than a single word is also an issue. Gaze control seems to be the best substitute for the Apple Vision Pro's eye-tracking cursor. Is there any way of using gaze control to guide the cursor, but a hand tracking gesture to select or click, without it engaging as conventional hand tracking while gaze tracking is enabled?
I've spent many hours now searching the internet, looking into potential sideloading options, and even poring over the Oculus PC source code, but haven't really found anyone talking about this except for the one unanswered reddit post I've linked to. The XBox One controller has basically the same buttons available as the Quest 3 controllers do, minus the tracking, but with gaze tracking it would be wonderful to have the controller buttons mapped properly, and I can't seem to find a way to remap the gamepad keybinds without running them through Steam and a PC link. I'd like to be able to do this natively on the Quest 3 standalone. Hand tracking only for selecting or clicking would also be great, but even just having the buttons mapped properly, so that pressing A clicks and holding A holds the click until released, would fix the issue. I am aware that pressing the select and menu buttons together toggles gaze tracking off and enables very limited use of the Xbox controller buttons, for use with a keyboard and such, but that's not what I'm asking about here. Thanks in advance to anyone who has any helpful information to provide on this issue. If gaze tracking were combined with limited hand tracking gestures like making a closed fist, I feel like the quality of this product could more easily rival the user interface of the Apple Vision Pro.

Trouble with UI in unity vr app
I am developing a VR app with Unity. Everything works fine, but I am having trouble with my menu: I can't see the menu in the app after making some changes to other components. If I bring in a new CameraRig from a working scene, it works fine again, even though I am not making any changes to the camera rig itself. I know my menu is there, as I can inspect it with a remote inspector, but I cannot see the UI or menu until I swap in a new camera rig. Please help me with this. Thanks in advance.
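Since the post above says the menu reappears only when a fresh camera rig is brought in, one thing worth checking (a hedged suggestion, not a confirmed diagnosis; the class and method names below are illustrative, not from any SDK) is whether world-space canvases are still referencing the old rig's camera, and whether the new camera's culling mask actually renders the UI layer. A minimal Unity C# sketch:

```csharp
using UnityEngine;

// Hypothetical debugging aid: after swapping in a new camera rig, re-point
// every world-space Canvas at the currently active main camera. Note that
// Canvas.worldCamera is the *event* camera (it affects UI raycasting, not
// rendering); the culling-mask check below covers the visibility side,
// since a camera that does not render the canvas's layer makes the UI
// invisible even though the Canvas is active in the hierarchy.
public class CanvasCameraRebinder : MonoBehaviour
{
    public void RebindAll()
    {
        Camera cam = Camera.main;
        if (cam == null)
        {
            Debug.LogWarning("No camera tagged MainCamera was found.");
            return;
        }

        // 'true' includes inactive objects (Unity 2020.1+ overload).
        foreach (Canvas canvas in FindObjectsOfType<Canvas>(true))
        {
            if (canvas.renderMode != RenderMode.WorldSpace)
                continue;

            canvas.worldCamera = cam; // rebind the event camera

            if ((cam.cullingMask & (1 << canvas.gameObject.layer)) == 0)
            {
                Debug.LogWarning(
                    $"Camera '{cam.name}' does not render the layer of " +
                    $"canvas '{canvas.name}', so that UI will be invisible.");
            }
        }
    }
}
```

Calling `RebindAll()` after the rig swap (or once in `Start()`) would at least tell you from the warnings whether a stale camera reference or a culling-mask mismatch is the cause.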