Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick
Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table (a world-space Canvas whose RectTransform is placed at the table center with a slight lift to sit on the surface, all via script).

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" ~0.5 m up, and pushing again makes it jump back down. This happened both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547), while the dialog's world Y stayed constant.

How can I display hands over OVROverlay?
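For the MRUK dialog "jump" question: a minimal diagnostic sketch of the logging described, confirming whether the camera or the dialog is the thing that moves. `HeightToggleLogger` and the `dialog` field are hypothetical names; everything else is standard UnityEngine API.

```csharp
using UnityEngine;

// Diagnostic sketch: log the camera's world Y next to the dialog's world Y
// whenever the camera height changes noticeably, e.g. the 1.047 <-> 1.547
// toggle described above. Attach to any active object and assign "dialog"
// to the RectTransform of the world-space canvas.
public class HeightToggleLogger : MonoBehaviour
{
    [SerializeField] private RectTransform dialog; // the world-space UI dialog

    private float lastCameraY;

    private void Update()
    {
        float cameraY = Camera.main.transform.position.y;

        if (Mathf.Abs(cameraY - lastCameraY) > 0.1f)
        {
            Debug.Log($"Camera Y: {cameraY:F3}, Dialog world Y: {dialog.position.y:F3}");
            lastCameraY = cameraY;
        }
    }
}
```

If the camera Y is what toggles, one plausible place to look is the rig's tracking-origin setting (floor level vs eye level) or, in the Meta XR Simulator, the thumbstick binding that moves the simulated headset.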
I am using the OVRCanvasMeshRenderer script to display a curved canvas using OVROverlay. I'm using this for a spatial web browser, and OVROverlay provides the best visual clarity. The problem is that the canvas now displays over everything: I cannot figure out a way to mask the hands so they display over the canvas. I read a few posts that suggested enabling XR Depth Submission, but this makes no difference. Is there an example of how to go about this anywhere? Using Meta SDK v62 on Unity 2022.3.11.

How to manage ray interactions on multiple Unity Canvases on top of each other?
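For the hands-over-OVROverlay question: compositor overlays are drawn on top of the eye buffer, so the hand meshes rendered into the eye buffer can never appear above them. One avenue to experiment with (a sketch, not a confirmed fix) is switching the panel to an *underlay*, so eye-buffer content such as the hand meshes composites over it; note that underlays only show through where the eye buffer is transparent, so a punch-through mask is typically needed as well. `BrowserPanelSetup` is a hypothetical name; `currentOverlayType` is a public field on `OVROverlay` in the Meta/Oculus SDK.

```csharp
using UnityEngine;

// Sketch: render the curved browser panel as an underlay instead of an
// overlay, so eye-buffer content (e.g. hand meshes) composites on top of it.
// Requires a punch-through hole in the eye buffer where the panel sits.
public class BrowserPanelSetup : MonoBehaviour
{
    private void Start()
    {
        var overlay = GetComponent<OVROverlay>();
        overlay.currentOverlayType = OVROverlay.OverlayType.Underlay;
    }
}
```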
Hi, how do I manage disabling and enabling multiple world-space canvases so that ray interactions work properly when the canvases are placed in front of each other? The UI canvases are default Unity world-space canvases with ray interactions added via the quick actions described in this article: Add an Interaction with QuickActions | Oculus Developers. The problem is that I have two world-space canvases with ray interactions placed in front of each other, kinda like in layers. When I disable the canvas GameObject in front, the interactor cursor hits the disabled (invisible) canvas and does not pass through to hit the second canvas (which it should). How can I ensure the cursor interacts with the first active/visible canvas and does not get captured by disabled ones?

[Unity] Can't use controller to select button inside UI scroll
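For the layered-canvas question: a sketch of explicitly disabling everything that can block a ray, instead of relying on `SetActive` alone. The component calls are standard Unity UI API; `CanvasLayerSwitcher` is a hypothetical name, and if the canvases were wired up with Interaction SDK QuickActions, the Interaction SDK surface components on the hidden canvas (e.g. its `RayInteractable`) may need to be disabled the same way.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: toggle two stacked world-space canvases so the hidden one
// stops catching ray interactions.
public class CanvasLayerSwitcher : MonoBehaviour
{
    [SerializeField] private Canvas frontCanvas;
    [SerializeField] private Canvas backCanvas;

    public void ShowBack()
    {
        SetCanvasInteractive(frontCanvas, false);
        SetCanvasInteractive(backCanvas, true);
    }

    private static void SetCanvasInteractive(Canvas canvas, bool on)
    {
        canvas.enabled = on; // hides/shows without deactivating the GameObject

        var raycaster = canvas.GetComponent<GraphicRaycaster>();
        if (raycaster != null) raycaster.enabled = on;

        var group = canvas.GetComponent<CanvasGroup>();
        if (group != null) group.blocksRaycasts = on;
    }
}
```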
Hello, I'm trying to use a Unity UI scroll view with UI buttons inside, but I can't get them to work with the controllers, although they work fine with hand gestures. The problem is how easily the button deselects: if I hold the controller perfectly still it is possible to select, but in normal use the simple act of pressing the controller trigger moves it a little, and the Meta Quest only registers the scroll movement and not the click.

Menu Canvas Not Responsive
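For the scroll-vs-click question: with pointer-driven Unity UI, a press only turns into a drag once the pointer moves past the EventSystem's drag threshold, so raising that threshold makes small controller jitter during a trigger press still count as a click. A sketch, assuming the UI goes through a standard `EventSystem`; `DragThresholdTuner` is a hypothetical name and the value is a starting point to tune.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: raise the pixel drag threshold so slight controller movement
// during a trigger press is treated as a click, not a scroll.
public class DragThresholdTuner : MonoBehaviour
{
    [SerializeField] private int pixelDragThreshold = 20; // Unity's default is lower

    private void Start()
    {
        if (EventSystem.current != null)
            EventSystem.current.pixelDragThreshold = pixelDragThreshold;
    }
}
```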
Details
Unity Version:
Set-Up: Meta Integration: Meta XR All-in-One SDK
Using Oculus Quest 2

Hello! I am not new to VR development, but for some reason I am struggling to build this new app I am currently working on. I have three issues.

1. I am trying to add a Menu/intro scene to my app, but the Canvas I have created is not responsive. The UIHelpers are working on hit-target, but the buttons are not. E.g., a button that should bring you to the start scene does not work. I have checked and re-checked the script and all the steps of the tutorial that I used in the past for other projects, but I cannot understand why it isn't working.

2. The Menu scene is not showing when loading the build; I am automatically redirected to the main scene. It only works when I remove the main scene from the list of scenes to load. Any suggestions for this issue?

3. Grabbable props I am using are not showing in the scene.

I am following the official Meta guidelines to build this app. Although it should be simpler to create experiences using the Meta XR All-in-One integration, I am finding it extremely frustrating, as it is the second time I have had to start from scratch. Shall I follow a more traditional approach and use the Unity VR tutorials, or can anyone advise me on how to create a VR app without encountering this many issues? Thank you.

Interaction SDK - Confusion About RayExamples Material Types
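For the menu-scene questions: issue 2 is usually a Build Settings ordering problem, since the scene at index 0 in the build's scene list is the one loaded at startup, so the Menu scene must be first and the main scene loaded from the button. A sketch of the button handler; `MenuButtons` and the scene name `"MainScene"` are placeholders, and the method must be wired to the Button's OnClick event in the inspector.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: keep the Menu scene at build index 0 and load the main
// scene by name when the start button is pressed.
public class MenuButtons : MonoBehaviour
{
    public void OnStartPressed()
    {
        SceneManager.LoadScene("MainScene");
    }
}
```

For issue 1, a common culprit when the hit-target responds but buttons do not is a missing or unassigned raycaster on the canvas (e.g. no GraphicRaycaster/OVRRaycaster, or the UIHelpers' pointer not pointed at that canvas), though the exact cause depends on the tutorial setup.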
The RayExamples scene has curved surfaces that allow for various material types: Alpha Cutout, Alpha Blended, and Underlay (only viewable on-device). I cannot tell the difference between the `Alpha Cutout` and `Underlay` materials. I'm viewing it on my Meta Quest 3.

UI element with transparent parts in the sprite bleeds into panel
Hello, my simple setup is:

- Canvas
-- Panel (opaque dark grey)
--- Button (Unity default button with UISprite with transparent sides; I just darkened the color)

The problem is that the transparent parts of the Button sprite also make the panel transparent. This makes the panel look like it has a transparent border around the button. See image. The light areas around the buttons are actually the background being visible through the panel. I have tried various image import settings. I have tried PSD and PNG formats with a square image, defining transparency only with the alpha channel. It still has the transparency issues with the panel. Has anyone any idea how to solve this?

How to use Texture Coords to Raycast on UI?
Is it possible to use texture coords to cast a ray on a canvas UI? I'm extracting the texture coords from a RenderTexture, and now I want to cast these coords onto a canvas UI. I've been looking through the OVRRaycaster script, but I can't make heads or tails of what kind of parameters it needs to cast a graphic raycast. Any help would be greatly appreciated. Jesper

How can I make my gameobject appear when I press the UI button on the canvas/menu
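For the texture-coords raycast question: a sketch using plain Unity UI rather than OVRRaycaster specifically. It maps normalized UV coordinates (0..1) onto the canvas rect, converts that world point to a screen point for the canvas's event camera, and feeds it to `GraphicRaycaster.Raycast`. `UvToUiRaycast` and its fields are hypothetical names; the API calls are standard UnityEngine/UI.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Sketch: turn a UV coordinate sampled from a RenderTexture into a
// graphic raycast against a world-space canvas.
public class UvToUiRaycast : MonoBehaviour
{
    [SerializeField] private RectTransform canvasRect;
    [SerializeField] private GraphicRaycaster raycaster;
    [SerializeField] private Camera uiCamera; // the canvas's event camera

    public List<RaycastResult> RaycastAtUv(Vector2 uv)
    {
        // UV (0..1) -> local point on the canvas rect.
        Rect rect = canvasRect.rect;
        var local = new Vector2(
            Mathf.Lerp(rect.xMin, rect.xMax, uv.x),
            Mathf.Lerp(rect.yMin, rect.yMax, uv.y));

        // Local -> world -> screen point for the event camera.
        Vector3 world = canvasRect.TransformPoint(local);
        Vector2 screen = uiCamera.WorldToScreenPoint(world);

        var pointerData = new PointerEventData(EventSystem.current) { position = screen };
        var results = new List<RaycastResult>();
        raycaster.Raycast(pointerData, results);
        return results;
    }
}
```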
Hi forum. I'm fairly new to Unity and coding/scripting. I have a CAD model in Unity, and I can walk through it with my Oculus Rift VR gear. I have created a menu with two UI buttons as a start. The "Quit" button turns red when I hover over it, and when I click it the menu disappears, and the gazepointer too (which is what I planned). The problem is that when I press the other button, a GameObject should appear in front of the player, but nothing happens. The button turns green as it should when I hover over and click it, but still nothing happens. I tried dragging the GameObject to the button's OnClick event and setting it to SetActive, etc. Nothing has been working so far. Can anybody help me with this? :) It should be pretty straightforward, but I can't get it to work. I tried a few scripts to make the GameObject appear 1 foot in front of the player, but this didn't work either.

Canvas does not render when outside of OVR CameraRig 'reach'
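For the show-GameObject-on-click question: a script-based sketch as an alternative to wiring SetActive purely in the inspector. Assign the (initially inactive) prop and the player's camera transform, then hook `ShowInFrontOfPlayer` to the Button's OnClick event. `ShowPropButton` and its fields are hypothetical names; the calls are standard UnityEngine API.

```csharp
using UnityEngine;

// Sketch: activate a hidden prop about 1 foot (~0.3 m) in front of
// wherever the player is currently looking.
public class ShowPropButton : MonoBehaviour
{
    [SerializeField] private GameObject prop;        // starts inactive in the scene
    [SerializeField] private Transform playerCamera; // e.g. the rig's CenterEyeAnchor

    public void ShowInFrontOfPlayer()
    {
        prop.transform.position = playerCamera.position + playerCamera.forward * 0.3f;
        prop.SetActive(true);
    }
}
```

One thing worth checking when OnClick "does nothing": the event must point at a method or property on an object that exists in the scene, and the click itself must be reaching the button (the hover colour changing suggests it is, which makes the OnClick wiring the prime suspect).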
I have a big tutorial image (it's big because otherwise the text tips on it look ugly). I tried repositioning it in different ways to achieve the best quality, and at some point it just does not render if the canvas is outside of that OVR CameraRig 'FOV box' or whatever that thing is. How do I change it?

Update: In Unity's preview all 3 cameras show the canvas just fine; however, when I build it and launch on Oculus Go, something black covers the canvas and I have to look up, down, left, or right for a part of the canvas to render.
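For the canvas-outside-the-rig question: the 'FOV box' is most likely the rig cameras' clipping planes, so a canvas placed beyond the far plane (or closer than the near plane) is culled. A sketch that widens the far plane on every camera under the rig; `FarPlaneFix` is a hypothetical name and the value is an assumption to tune.

```csharp
using UnityEngine;

// Sketch: attach to the OVRCameraRig root to push the far clipping
// plane out on all of its cameras, so distant canvases still render.
public class FarPlaneFix : MonoBehaviour
{
    [SerializeField] private float farClipPlane = 100f;

    private void Start()
    {
        foreach (var cam in GetComponentsInChildren<Camera>(true))
            cam.farClipPlane = farClipPlane;
    }
}
```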