Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick
Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table: a world-space Canvas whose RectTransform is placed at the table center with a slight lift so it sits on the surface, all via script.

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" ~0.5 m up, and pushing again makes it jump back down. This happened both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547), while the dialog's world Y stayed constant.

How to limit the plane surface to the bounds of the UI actually on screen?
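For the height-jump thread above: the logging that isolated the camera rig can be sketched as follows. This uses only plain Unity APIs; the dialog reference is the one thing you assign yourself.

```csharp
using UnityEngine;

// Attach anywhere in the scene; assign the dialog's RectTransform in the Inspector.
// Logs only when either Y changes noticeably, so a thumbstick-triggered
// rig-height toggle shows up as a single clear line in the log.
public class HeightJumpLogger : MonoBehaviour
{
    [SerializeField] private RectTransform dialog; // the world-space UI dialog
    private float lastCamY, lastDialogY;

    private void Update()
    {
        float camY = Camera.main.transform.position.y;
        float dialogY = dialog.position.y;

        if (Mathf.Abs(camY - lastCamY) > 0.05f || Mathf.Abs(dialogY - lastDialogY) > 0.05f)
        {
            Debug.Log($"camera Y: {camY:F3}  dialog Y: {dialogY:F3}");
            lastCamY = camY;
            lastDialogY = dialogY;
        }
    }
}
```

If camera Y toggles while dialog Y stays constant (as in the post), look for a locomotion or height-adjust binding on the right thumbstick in the rig's Building Blocks rather than at the UI placement code.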
I am using the Oculus Integration SDK (yes, the legacy version, since I need to make changes to the Assembly Definitions), and I am making a flat canvas. I have placed a PlaneSurface script on the object holding the Canvas component. I have a sibling object called "Surface" with the `ClippedPlaneSurface` and `BoundsClipper` components on it, and I dragged the Canvas object with `PlaneSurface` into the ClippedPlaneSurface component's 'Plane Surface' field. Interaction works just fine... however, I now have an infinite plane surface for ray interaction with world-space UI, even though the flat panel is just a rectangle right in front of the player. This means I can ray cast against an invisible plane even where there is no UI. Can anyone help me make the BoundsClipper component work, or otherwise limit the plane surface to the bounds of the UI actually on screen?

How to manage ray interactions on multiple Unity Canvases on top of each other?
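For the BoundsClipper thread above: one approach is to size the clipper to the canvas rect so the interactable plane ends where the UI does. A sketch, assuming your clipper exposes a world-space size field (the exact field name depends on the SDK version); the rect math itself is plain Unity:

```csharp
using UnityEngine;

// Computes the world-space width/height of a world-space canvas rect.
// Feed these values into the BoundsClipper's size (with a small Z thickness)
// so the clipped plane matches the visible panel instead of being infinite.
public class CanvasBoundsReporter : MonoBehaviour
{
    [SerializeField] private RectTransform canvasRect;

    public Vector3 GetWorldSize(float thickness = 0.01f)
    {
        Vector2 size = canvasRect.rect.size;
        Vector3 scale = canvasRect.lossyScale;
        return new Vector3(size.x * scale.x, size.y * scale.y, thickness);
    }

    private void Start()
    {
        Debug.Log($"Clipper size: {GetWorldSize()}");
    }
}
```

Also make sure the "Surface" object holding the clipper is positioned at the canvas center, since clipping is relative to its transform.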
Hi, how do I manage disabling and enabling multiple world-space canvases so that ray interactions work properly when the canvases are placed in front of each other? The UI canvases are default Unity world-space canvases with ray interactions added via the quick actions described in this article: Add an Interaction with QuickActions | Oculus Developers. The problem is that I have two world-space canvases with ray interactions placed in front of each other, in layers. When I disable the canvas GameObject in front, the interactor cursor still hits the disabled (invisible) canvas and does not pass through to hit the second canvas behind it, as it should. How can I ensure the cursor interacts with the first active/visible canvas and does not get captured by disabled ones?

Meta Interaction SDK and Mouseclicks
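For the layered-canvases thread above: a pattern that avoids "the disabled canvas still captures the ray" is a single manager that activates exactly one layer at a time, including any sibling object carrying the ray-interaction surface or collider, so hiding a layer also removes it from raycasting. A sketch with hypothetical field names:

```csharp
using UnityEngine;

// Keeps exactly one UI layer active. Each entry pairs a canvas with the
// object holding its ray-interaction surface/collider components, so a
// hidden layer cannot capture the interactor cursor.
public class CanvasLayerManager : MonoBehaviour
{
    [System.Serializable]
    public struct Layer
    {
        public GameObject canvas;      // the world-space canvas
        public GameObject raySurface;  // sibling holding surface/collider components
    }

    [SerializeField] private Layer[] layers;

    public void Show(int index)
    {
        for (int i = 0; i < layers.Length; i++)
        {
            bool active = (i == index);
            layers[i].canvas.SetActive(active);
            if (layers[i].raySurface != null)
                layers[i].raySurface.SetActive(active);
        }
    }
}
```

If the surface objects are children of the canvas rather than siblings, disabling the canvas GameObject alone should already disable them; the symptom described suggests they live outside the disabled hierarchy.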
I am using the Meta XR Interaction SDK and its components. My problem: I am in Play Mode on my PC with an Oculus headset hooked up via Link, and I have two displays. On Display 1 (VR), the player is in a room with an interactable panel (grab, point, and select all work). On Display 2 (PC), I have a UI Button that should do something when clicked. As far as I know, the Meta SDK needs the PointableCanvasModule for its events to work, but then my mouse click on Display 2 doesn't register. When the Standalone Input Module is active instead, my mouse clicks work, but I can't interact with the canvas in VR anymore. Video that shows the problem: https://streamable.com/uin4ms How can I use mouse clicks and the VR hands at the same time? Big_Flex, I read that you work on the Interaction SDK team, so I tagged you 🙂 Thanks for the help

Unity VR UI canvas popup
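For the mouse-click thread above: since an EventSystem runs only one input module at a time, one workaround is to keep the PointableCanvasModule active for the VR hands and drive the desktop button with a manual GraphicRaycaster pass. A sketch, assuming the legacy Input Manager and a raycaster assigned from the Display 2 canvas:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Polls the mouse and dispatches clicks to the desktop canvas directly,
// bypassing the active input module so PointableCanvasModule can stay on.
public class DesktopClickForwarder : MonoBehaviour
{
    [SerializeField] private GraphicRaycaster desktopRaycaster; // on the Display 2 canvas

    private void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        var pointer = new PointerEventData(EventSystem.current)
        {
            position = Input.mousePosition
        };
        var results = new List<RaycastResult>();
        desktopRaycaster.Raycast(pointer, results);

        foreach (var result in results)
        {
            // Walk up to the nearest object with a click handler (e.g. a Button).
            var handler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(result.gameObject);
            if (handler != null)
            {
                ExecuteEvents.Execute(handler, pointer, ExecuteEvents.pointerClickHandler);
                break;
            }
        }
    }
}
```

With multiple displays, `Input.mousePosition` may need remapping via `Display.RelativeMouseAt` so the click lands on the correct canvas.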
How can I open a UI canvas panel when an object is clicked with a VR controller in Unity? I want clicking on an object in a Unity VR environment to trigger the opening of a UI panel. How can I achieve this? If you have any insights or suggestions, please share them. Thank you!

Trouble with UI in Unity VR app
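For the canvas-popup thread above: the usual shape is a small toggle script whose public methods are wired (in the Inspector) to whatever select/click event the interaction setup raises on the object, e.g. an Interactable's UnityEvent or XR Interaction Toolkit's Select Entered. A minimal sketch:

```csharp
using UnityEngine;

// Hook OpenPanel/TogglePanel up to the select/click event of the
// VR-interactable object in the Inspector.
public class PanelOpener : MonoBehaviour
{
    [SerializeField] private GameObject panel; // the world-space canvas panel

    public void OpenPanel()   => panel.SetActive(true);
    public void ClosePanel()  => panel.SetActive(false);
    public void TogglePanel() => panel.SetActive(!panel.activeSelf);
}
```

For VR, the panel should be a world-space canvas (Screen Space canvases do not render in the headset) positioned within comfortable reach of the player.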
I am developing a VR app with Unity. Everything works fine, but I am having trouble with the menu: after making changes to other components, I can't see the menu in the app anymore. If I bring in a new CameraRig from a working scene it works fine, even though I am not changing anything in the camera rig itself. The menu is still there when I inspect with a remote inspector, but I cannot see the UI until I swap in a new camera rig. Please help me with this. Thanks in advance.

Adding ray pointer from Oculus Hands to interact with UI
I am trying to add a ray pointer from Oculus Hands to interact with UI. I managed to get the ray, as shown in the screen capture below. However, as you can see, the laser pointer starts from the wrist position and is angled towards the right instead of pointing straight ahead. Any suggestions on how I can correct this so it points forward? Also, this laser pointer appears only on the right hand. Is there a way to change this? Please see my settings in Unity below.

UI button triggers not functional in build, but work as intended in editor
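For the hand-ray thread above: OVRHand exposes a dedicated pointer pose that is already oriented for ray casting, which usually fixes a ray that starts at the wrist; adding one instance per hand covers both hands. A sketch (the OVRHand properties used here exist in the Oculus Integration, but verify against your SDK version):

```csharp
using UnityEngine;

// Draws a ray from the hand's pointer pose instead of the wrist.
// Attach one instance per hand (with a LineRenderer) and assign the
// matching OVRHand in the Inspector.
[RequireComponent(typeof(LineRenderer))]
public class HandRay : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private float rayLength = 2f;
    private LineRenderer line;

    private void Awake() => line = GetComponent<LineRenderer>();

    private void Update()
    {
        bool show = hand.IsTracked && hand.IsPointerPoseValid;
        line.enabled = show;
        if (!show) return;

        Transform pose = hand.PointerPose;
        line.SetPosition(0, pose.position);
        line.SetPosition(1, pose.position + pose.forward * rayLength);
    }
}
```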
I am currently working to port a VR game from a PC standalone version to the Oculus Quest 2. In this game, I have created a VR menu with UI buttons that include box colliders with triggers active. These triggers have a script attached so that when the player touches a button, it simulates the button being clicked:

    private void OnTriggerEnter(Collider other)
    {
        if (other.gameObject.tag == "VRController" || other.gameObject.tag == "LeftHand" || other.gameObject.tag == "RightHand")
        {
            this.GetComponent<Button>().onClick.Invoke();
        }
    }

This works as intended in the Unity Editor; however, when I build the game, none of the triggers work. I am unsure what the reason could be.

How to display head-locked UI elements over eye buffer on Quest 2?
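For the build-only trigger failure above: trigger callbacks require a Rigidbody on at least one collider of the pair, and tag or physics-layer configuration can differ between the editor scene and the built player. A hardened sketch of the same handler that guarantees the Rigidbody requirement and surfaces tag problems instead of failing silently:

```csharp
using UnityEngine;
using UnityEngine.UI;

// OnTriggerEnter only fires if at least one of the two colliders has a
// Rigidbody; a kinematic one on the button satisfies this without adding
// any physics behaviour.
[RequireComponent(typeof(Rigidbody))]
public class TouchButton : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // CompareTag avoids string allocations and throws if the tag is not
        // defined in the Tag Manager, which exposes misconfiguration early.
        if (other.CompareTag("VRController") ||
            other.CompareTag("LeftHand") ||
            other.CompareTag("RightHand"))
        {
            GetComponent<Button>().onClick.Invoke();
        }
    }
}
```

Mark the Rigidbody as Is Kinematic, and confirm in the built player that the hand/controller objects still carry colliders with the expected tags.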
I want to display small Unity UI elements to the player that can change dynamically. At first I tried to just use a canvas set to Screen Space, but it didn't render to the screen. Then I tried OVROverlay, but I can't figure out how to make it lock to the player's head. Are there any solutions to the problem I am trying to solve? (Using Oculus Quest 2)

OVRInputModule primary index trigger is unreliable.
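For the head-locked UI thread above: a simple approach is a world-space canvas (Screen Space canvases do not render to the VR eye buffer) repositioned in front of the camera every LateUpdate, or simply parented to the rig's CenterEyeAnchor. A minimal sketch of the scripted version:

```csharp
using UnityEngine;

// Keeps a world-space canvas a fixed distance in front of the HMD.
// Attach to the canvas; runs in LateUpdate so it follows head tracking
// after the camera pose has been updated for the frame.
public class HeadLockedCanvas : MonoBehaviour
{
    [SerializeField] private float distance = 1.0f;

    private void LateUpdate()
    {
        Transform head = Camera.main.transform;
        transform.position = head.position + head.forward * distance;
        transform.rotation = head.rotation;
    }
}
```

Note that strictly head-locked UI can be uncomfortable in VR; many apps smooth or lag the follow motion instead of locking rigidly.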
Hi. I am having an issue where the OVRInputModule is not working properly when I try to use the primary index trigger. I have UI elements in my game, including toggles and sliders. If I use the A button, I am able to interact with these widgets successfully. However, when I use the primary index trigger, it is hit or miss whether my input is registered. For the "Joy Pad Click Button" field, I have "One" and "Primary Index Trigger" selected. I put some breakpoints in the function virtual protected PointerEventData.FramePressState GetGazeButtonState(), and the pressed and released variables do seem to be toggled on and off properly when I use the A button or the trigger on the right controller. Any ideas? Thanks, John Lawrie
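For the unreliable-trigger thread above: one way to isolate the fault is to poll the trigger directly. If OVRInput reports clean presses while the UI misses them, the problem is in the OVRInputModule configuration rather than the hardware. A sketch using the standard OVRInput API:

```csharp
using UnityEngine;

// Logs raw trigger and A-button presses each frame, independent of any
// input module, to compare against what the UI actually receives.
public class TriggerProbe : MonoBehaviour
{
    private void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            float pull = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
            Debug.Log($"Trigger down, analog value: {pull:F2}");
        }

        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
            Debug.Log("A button down");
    }
}
```

Since the index trigger is analog, `GetDown` only fires once the pull crosses a digital threshold; a light squeeze that hovers near the threshold can look exactly like "hit or miss" input.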