Keyboard Input Not Working on InputField with Ray Interaction on Canvas (Unity XR/Meta Development)
Hello everyone, I'm developing an application in Unity 6 (latest version) using the Meta XR All-in-One SDK (latest version). For some time I've been encountering a frustrating issue with an InputField (specifically TMP_InputField) on my UI Canvas.

The Problem: I have a UI Canvas set up with a TMP_InputField. I'm using the ray interaction provided by the Meta XR SDK to interact with UI elements on this Canvas. When I click on the TMP_InputField with the ray interactor, a blinking cursor appears inside it, as expected. However, I am unable to type anything using my physical keyboard. Crucially, if I temporarily disable the ray interaction component on the Canvas (or the OVR Raycaster, if that's the one), I can type into the InputField perfectly fine. This strongly suggests a conflict between the Meta XR ray interaction system and standard keyboard input being routed to the InputField.

My Setup Details:
Unity Version: Unity 6
XR Setup: Meta XR All-in-One SDK
InputField Type: TMP_InputField (TextMeshPro InputField, TMPro.TMP_InputField)
Canvas Configuration: I made the Canvas itself ray-interactable by right-clicking the Canvas and choosing "Add Ray Interaction to Canvas".

What I've Tried So Far (and related observations):
Making the Canvas ray-interactable: confirmed the Canvas is reachable and clickable via the ray interactor.
TMP_InputField "Interactable" property: the component's "Interactable" checkbox is checked.
Basic interaction: clicking with the ray does make the cursor appear, indicating some level of interaction.

Request for Help:
Why would ray interaction on the Canvas prevent standard keyboard input from reaching the TMP_InputField? Is it consuming all input events, or is there a specific Event System configuration needed for coexistence?
What's the recommended approach within the Meta XR All-in-One SDK and Unity 6 for handling TMP_InputField keyboard input when using ray interactors? Are there specific settings that need adjustment?
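For context, here is the kind of workaround I have been considering (a rough, untested sketch: it assumes the Canvas's raycaster derives from UnityEngine.EventSystems.BaseRaycaster, e.g. OVRRaycaster, and uses TMP_InputField's onSelect/onDeselect events; DisableRaycasterWhileTyping is just a name I made up):

```csharp
using TMPro;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: disable the Canvas raycaster while the input field has focus so
// the ray interaction stops consuming input events, then re-enable it when
// the field is deselected. Assign both references in the Inspector.
public class DisableRaycasterWhileTyping : MonoBehaviour
{
    [SerializeField] private TMP_InputField inputField;
    [SerializeField] private BaseRaycaster canvasRaycaster; // e.g. the OVRRaycaster on the Canvas

    private void OnEnable()
    {
        inputField.onSelect.AddListener(OnFieldSelected);
        inputField.onDeselect.AddListener(OnFieldDeselected);
    }

    private void OnDisable()
    {
        inputField.onSelect.RemoveListener(OnFieldSelected);
        inputField.onDeselect.RemoveListener(OnFieldDeselected);
    }

    private void OnFieldSelected(string _)   => canvasRaycaster.enabled = false;
    private void OnFieldDeselected(string _) => canvasRaycaster.enabled = true;
}
```

One caveat I can already see: with the raycaster disabled, clicking elsewhere on the Canvas may never fire onDeselect, so the field would have to lose focus some other way (Escape, submit, or another trigger). Is this approach even reasonable?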
Could there be a conflict with the active Input Module on the Event System GameObject? What configuration should I aim for when using Meta XR?

Is there a way to temporarily disable the OVR Raycaster (or similar component) when the TMP_InputField is focused, and re-enable it when it loses focus? If so, what is the best way to get a reference to the specific raycaster component?

Any insights, debugging tips, or specific configurations for the Meta XR SDK or Event System in Unity 6 that might resolve this would be extremely helpful.

Updated Meta XR All-in-One SDK from v67 to v77: Associated Scripts Cannot Be Loaded
I have updated the Meta XR All-in-One SDK from version 67 to v77 (latest). I'm using Unity 2022.3.21f1. Lots of files used in the project are now having problems. Should I revert to the last version I used?

What have I done to try to fix this? After updating, I got the issue "OVRProjectConfig exists but could not be loaded. Config values may not be available until restart." I first deleted the PackageCache folder under Library and restarted Unity to regenerate those files, in case a package was causing the issue. I also deleted the OculusProjectConfig.asset file under the Oculus directory in Assets; it got regenerated after I restarted the Unity editor. Now I'm getting the earlier issue again. Can anyone look into this? Thanks in advance. tdmowrer, any idea?

Controller-Driven Hand (Capsense) v66 is broken.
I have been trying to implement Capsense (controller-driven hands) for four days, but I have not been successful. Initially, I worked with version 65 of the Meta XR SDK, but nothing seemed to work despite my best efforts. When version 66 was released, I hoped that the Capsense issues were fixed, so I updated to version 66 and imported the samples from the Meta XR Core SDK. Interestingly, even the sample scene "ControllerDrivenHandPoses" provided by the developers does not work correctly: the hands do not appear.

My goal is to use controllers while displaying hands and their animations simultaneously. Currently, when I use the controller, my hands disappear. Even if I set the hands to always show, they appear but the controllers then do not work to move the player. I am seeking help to resolve this problem. If anyone knows the solution, please let me know.

Errors After Adding the Meta Avatars SDK Sample Assets (v29.7.0) to Clean Project
I am getting the following errors when adding the new Meta Avatars SDK Sample Assets to a clean project using Unity 2022.3.22f1:

Assets\Samples\Meta Avatars SDK\29.7.0\Sample Scenes\Scripts\UI\UILogger.cs(367,17): error CS0246: The type or namespace name 'UIInputControllerButton' could not be found (are you missing a using directive or an assembly reference?)

Failed to find entry-points:
Assets\Samples\Meta Avatars SDK\29.7.0\Sample Scenes\Scripts\UI\UILogger.cs(367,17): error CS0246: The type or namespace name 'UIInputControllerButton' could not be found (are you missing a using directive or an assembly reference?)

What steps can I take to resolve these errors?

Empty Environment Path - Package Path Location Blank When Updating to v69
I seem to be able to update to the latest version of the Meta Core SDK (v69) just fine, but after the Unity Editor restarts, the path to the package cache is wiped and the Unity console endlessly prints the error below. Has anyone else experienced this, and if so, what was the solution? I am going to try adding an environment variable to see if that has any effect, but any help or feedback is appreciated. There was a path set here prior to updating, but it is blank with no option to add a path after updating and restarting the editor.

Everything breaks after updating SDK v64 to v67
I have been having problems implementing passthrough, so I thought that updating the SDK version would help. The Core and Interaction SDKs update just fine, but the All-in-One completely breaks the player controller and hands. The scripts turn into "(Script)", and when I try to reattach them Unity gives me an error popup, so I can't fix it myself. It seems other people have had this same problem, but I haven't found any solutions yet.

Menu Canvas Not Responsive
Details
Unity Version:
Set-Up / Meta Integration: Meta XR All-in-One SDK
Using Oculus Quest 2

Hello! I am not new to VR development, but for some reason I am struggling to build this new app I am currently working on. I have three issues.

1. I am trying to add a menu/intro scene to my app, but the Canvas I have created is not responsive. The UIHelpers are working on hit-target, but the buttons are not. E.g., a button that should bring you to the start scene does not work. I have checked and re-checked the script and all the steps of the tutorial I used in the past for other projects, but I cannot understand why it isn't working.

2. The menu scene is not showing when loading the build; I am automatically redirected to the main scene. It only works when I remove the main scene from the list of scenes to load. Any suggestions for this issue?

3. The grabbable props I am using are not showing in the scene.

I am following the official Meta guideline to build this app. Although it should be simpler to create experiences using the Meta XR All-in-One integration, I am finding it extremely frustrating, as it is the second time I have had to start from scratch. Shall I follow a more traditional approach with the Unity VR tutorials, or can anyone advise me on how to create a VR app without encountering this many issues? Thank you.

Oculus Quest 2, Meta XR All-In-One Issue
Hi. I am back developing in VR after a year, and I am finding it difficult to set up my project in Unity with the new updates. For example, I am following the official guide on the Meta website, but I can't even seem to find the OVRCameraRig. Can someone advise me on the best procedure to follow, or suggest any good learning materials, either websites or YouTube channels? Thank you.

Hand Tracking Gesture for Teleporting with Meta XR AIO SDK
So today I started a sample project; it's going to have a couple of object interactions using hand tracking with the new Meta XR AIO SDK. Everything worked fine until I started thinking about movement with hand tracking. I know Oculus Integration shipped teleport locomotion with hand gestures, and I searched the internet for how to do that with the new SDK, but didn't find anything. Even in the Meta XR Interaction Samples, there's only an FBX file that belongs to a scene with the hand-tracking gestures in it, and I don't know how to use it. Does someone know how to do this, or have a tutorial or scene to use as a reference?
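In the meantime I hacked together my own trigger; is something like this on the right track? (A minimal, untested sketch assuming OVRHand from the Meta XR Core SDK; PinchTeleportTrigger and TeleportTo are my own names, not SDK calls, and the actual teleport arc/targeting is left out.)

```csharp
using UnityEngine;

// Sketch: use an index-finger pinch on a tracked hand as a teleport trigger,
// teleporting on pinch *release*, like letting go of a thumbstick.
// Assumes OVRHand from the Meta XR Core SDK; TeleportTo() is a placeholder
// for whatever locomotion call the project actually uses.
public class PinchTeleportTrigger : MonoBehaviour
{
    [SerializeField] private OVRHand hand;           // e.g. the right OVRHand under the camera rig
    [SerializeField] private Transform aimIndicator; // marker showing where the teleport would land

    private bool wasPinching;

    private void Update()
    {
        if (hand == null || !hand.IsTracked) return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (wasPinching && !isPinching)
            TeleportTo(aimIndicator.position);

        wasPinching = isPinching;
    }

    private void TeleportTo(Vector3 target)
    {
        // Placeholder: move the rig root to the aimed position.
        transform.root.position = target;
    }
}
```

This obviously skips validation of the target surface and any arc rendering, so I would still prefer the SDK's intended setup if someone can point me to it.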