Meta XR Simulator Synthetic Environment Server rooms are missing
Hi, I'm pretty new to Unity development with the Meta XR SDK. I'm on Unity 6.2 and using Meta All-in-One SDK version 78. I've been trying to figure out how to launch a Synthetic Environment Server with the Meta XR Simulator. Every time I enter Play Mode using the Simulator and launch a room from the list of rooms, Unity fails to find the .exe file at the directory path above, and sometimes the editor just crashes when I press Play. I've tried removing and reinstalling the packages, but it just won't work. Any solutions, or am I missing something? Thanks!
Colocation and Shared Spatial Anchors result in wrong frame of reference

Hello everyone! I'm developing a networked experience using Unity 6 with Netcode for GameObjects and the Meta XR SDK. I'm trying to get Shared Spatial Anchors working with Colocation so that players see the same frame of reference in Mixed Reality. I managed to get Colocation and Shared Spatial Anchor sharing to work, but when I align my player rig using the Shared Spatial Anchor from the host, there is a misalignment in the scene. No obvious errors or warnings are being thrown. I'm using the code in the documentation, as well as the video tutorial "XR Dev Rob - Colocation with Meta's Shared Spatial Anchors & the new Colocation Discovery API", as my main reference. Below is my alignment code:

```csharp
if (await unboundAnchor.LocalizeAsync())
{
    Debug.Log($"Anchor localized successfully, UUID: {unboundAnchor.Uuid}");
    var anchorGameObject = GameObject.Instantiate(_anchorPrefab);
    var spatialAnchor = anchorGameObject.GetOrAddComponent<OVRSpatialAnchor>();
    anchorGameObject.name = $"Anchor_{unboundAnchor.Uuid}";
    unboundAnchor.BindTo(spatialAnchor);
    _alignmentManager.AlignUserToAnchor(spatialAnchor);
    return;
}
```

with AlignUserToAnchor implemented as:

```csharp
public void AlignUserToAnchor(OVRSpatialAnchor anchor)
{
    if (anchor == null || anchor.Localized == false)
    {
        Debug.LogError("Anchor is not localized yet.");
        return;
    }
    StartCoroutine(AlignmentCoroutine(anchor));
}

private IEnumerator AlignmentCoroutine(OVRSpatialAnchor anchor)
{
    var anchorTransform = anchor.transform;
    for (int alignmentCount = 2; alignmentCount > 0; alignmentCount--)
    {
        _cameraRigTransform.position = Vector3.zero;
        _cameraRigTransform.eulerAngles = Vector3.zero;
        yield return null;

        Vector3 offset = anchorTransform.InverseTransformPoint(Vector3.zero);
        Quaternion inverseYaw = Quaternion.Euler(0f, -anchorTransform.eulerAngles.y, 0f);
        _cameraRigTransform.position = offset;
        _cameraRigTransform.rotation = inverseYaw;

        Debug.Log($"Aligned camera rig position: {_cameraRigTransform.position}, rotation {_cameraRigTransform.eulerAngles}");
        yield return new WaitForEndOfFrame();
    }
    Debug.Log("Alignment complete");
}
```

I am wondering if I am missing something that needs to be initialized to get the spatial data in order, so that the anchors can be localized correctly, or if it is something else I'm missing. Thank you for your help! Kind regards
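One thing worth checking in the coroutine above (my own observation, not from the docs): the position and yaw corrections are applied as two independent steps, while a rigid alignment couples them. A minimal sketch of the combined form, assuming the same `_cameraRigTransform` and `anchorTransform` as above, sampled right after the rig has been reset to identity:

```csharp
// Sketch (an assumption, not the documented method): treat the correction as one
// rigid transform. With the rig at identity, the anchor's world pose is (p, yaw).
// Moving the anchor to the origin needs rig rotation R = inverse yaw and
// rig position t = R * (-p), i.e. the offset expressed in the rotated frame.
Quaternion inverseYaw = Quaternion.Euler(0f, -anchorTransform.eulerAngles.y, 0f);
_cameraRigTransform.rotation = inverseYaw;
_cameraRigTransform.position = inverseYaw * (-anchorTransform.position);
```

If the original code's separately applied offset and yaw disagree with this combined form, that could account for a constant misalignment between colocated players.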
Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick

Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table (a world-space Canvas whose RectTransform is placed at the table center, with a slight lift so it sits on the surface, all via script).

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" about 0.5 m up, and pushing it again makes it jump back down. This happens both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547) while the dialog's world Y stayed constant.
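A quick way to confirm the height toggle is tied to the stick input is a diagnostic sketch using OVRInput, the Meta SDK's controller-input API (attach it to any active GameObject; the 0.01 deadzone threshold is an arbitrary choice):

```csharp
using UnityEngine;

// Diagnostic sketch: log right-thumbstick deflection next to the camera height,
// to correlate the ~0.5 m Y toggle with the stick gesture.
public class ThumbstickHeightLogger : MonoBehaviour
{
    void Update()
    {
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);
        if (stick.sqrMagnitude > 0.01f)
        {
            Debug.Log($"Right stick: {stick}, camera Y: {Camera.main.transform.position.y}");
        }
    }
}
```

If the Y change lines up with stick deflection, the likely culprit is whatever locomotion or tracking-origin component in the rig consumes that axis, rather than anything on the dialog itself.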
Keyboard Input Not Working on InputField with Ray Interaction on Canvas (Unity XR/Meta Development)

Hello everyone, I'm developing an application in the latest Unity 6 using the Meta XR All-in-One SDK (latest version). For some time I've been encountering a frustrating issue with an InputField (specifically TMP_InputField) on my UI Canvas.

The Problem: I have a UI Canvas set up with a TMP_InputField, and I'm using the ray interaction provided by the Meta XR SDK to interact with UI elements on this Canvas. When I use the ray interactor to click on the TMP_InputField, a blinking cursor appears inside it, as expected. However, I am unable to type anything using my physical keyboard. Crucially, if I temporarily disable the ray interaction component on the Canvas (or the OVR Raycaster, if that's the one), I can type into the InputField perfectly fine. This strongly suggests a conflict between the Meta XR ray interaction system and standard keyboard input being routed to the InputField.

My Setup Details:
- Unity Version: Unity 6
- XR Setup: Meta XR All-in-One SDK
- InputField Type: TMP_InputField (TextMeshPro InputField, TMPro.TMP_InputField)
- Canvas Configuration: I've made the Canvas itself "ray interactable" by right-clicking on the Canvas and adding "ray interaction to canvas".

What I've Tried So Far (and related observations):
- Making the Canvas ray interactable: confirmed the Canvas is reachable and clickable via the ray interactor.
- TMP_InputField "Interactable" property: the component's "Interactable" checkbox is checked.
- Basic interaction: clicking with the ray does make the cursor appear, indicating some level of interaction.

Request for Help:
- Why would ray interaction on the Canvas prevent standard keyboard input from reaching the TMP_InputField? Is it consuming all input events, or is there a specific Event System configuration needed for coexistence?
- What's the recommended approach within the Meta XR All-in-One SDK and Unity 6 for handling TMP_InputField keyboard input when using ray interactors? Are there specific settings that need adjustment?
- Could there be a conflict with the active Input Module on the Event System GameObject? What configuration should I aim for when using Meta XR?
- Is there a way to temporarily disable the OVR Raycaster (or similar component) when the TMP_InputField is focused, and re-enable it when it loses focus? If so, what is the best way to get a reference to the specific raycaster component?

Any insights, debugging tips, or specific configurations for the Meta XR SDK or Event System in Unity 6 that might resolve this would be extremely helpful.
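On the last question, one workaround worth trying (a sketch, not an official recipe): TMP_InputField exposes onSelect and onDeselect events, which could toggle whichever raycaster turns out to be the conflicting component. The `_raycaster` field below is a placeholder you assign in the Inspector once you've identified that component:

```csharp
using TMPro;
using UnityEngine;

// Hypothetical helper: disable a conflicting raycaster while the
// input field has focus, re-enable it when focus is lost.
public class InputFieldFocusGuard : MonoBehaviour
{
    [SerializeField] private TMP_InputField _inputField;
    [SerializeField] private MonoBehaviour _raycaster; // assign the conflicting raycaster here

    void OnEnable()
    {
        _inputField.onSelect.AddListener(OnFocus);
        _inputField.onDeselect.AddListener(OnBlur);
    }

    void OnDisable()
    {
        _inputField.onSelect.RemoveListener(OnFocus);
        _inputField.onDeselect.RemoveListener(OnBlur);
    }

    private void OnFocus(string _) => _raycaster.enabled = false;
    private void OnBlur(string _)  => _raycaster.enabled = true;
}
```

This treats the symptom rather than the cause, but it would at least tell you whether the raycaster is the component eating the keyboard events.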
Unity 6.1 + Meta Quest + Building Blocks + Simple Interactable

Probably so simple... but I've been stuck here for days. Steps so far:

- Started a new project in Unity 6.1 with the "Mixed Reality (MR) Core" template
- Switched to the Android platform in Build Profiles
- Imported the Meta XR All-in-One package
- Imported the Meta MR Utility Kit package
- Removed all items from my sample scene
- Ran Meta XR Tools -> Project Setup Tool
- Selected the OpenXR Plugin in XR Plug-in Management
- Added the Building Blocks "Camera Rig" + "Passthrough" + "Controller Tracking" + "Ray Interaction"
- Placed a game object (mesh + material) in the scene and added a collider and a "Ray Interactable" script to it

Up to this point everything runs fine: my object appears in the scene, passthrough works well, and I can use my hands or the controllers to cast a ray that hits the object with the expected cursor. What I cannot achieve is detecting ray events on the object: I would like to change the color of the object's material when the ray enters the object, or make the object disappear when I press the controller trigger while the cursor is on it (the 'select' or 'activate' event, I guess; I tried both). So simple... but so hard to achieve. I have tried adding an "XR Simple Interactable" script at the same level as the "Ray Interactable", but it doesn't work. Is anything wrong in my project configuration? Project Validation in XR Plug-in Management shows no errors or warnings. Can anyone shed some light on this, or recommend a free or paid tutorial on configuring the environment for this case? Many thanks in advance!
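One likely issue with the setup above: "XR Simple Interactable" belongs to Unity's XR Interaction Toolkit, while the "Ray Interactable" building block comes from the Meta Interaction SDK, and the two frameworks don't raise each other's events. A hedged sketch of listening on the Meta side instead, assuming the Oculus.Interaction namespace from the All-in-One SDK (the color/disable reactions are just illustrative):

```csharp
using Oculus.Interaction;
using UnityEngine;

// Sketch: react to hover/select state changes on a Meta Interaction SDK
// RayInteractable instead of an XRI XR Simple Interactable.
public class RayHighlighter : MonoBehaviour
{
    [SerializeField] private RayInteractable _interactable;
    [SerializeField] private Renderer _renderer;

    void OnEnable()  => _interactable.WhenStateChanged += OnStateChanged;
    void OnDisable() => _interactable.WhenStateChanged -= OnStateChanged;

    private void OnStateChanged(InteractableStateChangeArgs args)
    {
        if (args.NewState == InteractableState.Hover)
            _renderer.material.color = Color.yellow;   // ray entered the object
        else if (args.NewState == InteractableState.Select)
            gameObject.SetActive(false);               // trigger pressed while hovered
        else if (args.NewState == InteractableState.Normal)
            _renderer.material.color = Color.white;    // ray left the object
    }
}
```

The SDK also ships wrapper components that surface these states as UnityEvents in the Inspector, which may be easier than code if that fits your workflow.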
Matchmaking with Epic Online Services

Hello wise people,

In our Unity 6 multiplayer project we use Normcore plus Matchmaking from Epic Online Services. When connecting users to a multiplayer game via Matchmaking, and under circumstances that are not fully understood, so-called "free sessions" (ghost sessions without players) can sometimes be created on the server. This usually happens when players leave the room for unknown reasons. The session doesn't close immediately when the room is empty; it still has a duration and a broadcast period (broadcast time: about 5 minutes, duration: about 10 minutes). We could trigger this manually during testing by disconnecting the internet or crashing the project from the Unity Editor. Matchmaking works correctly until these ghost sessions occur; once they do, no player can join any newly created sessions. Only after waiting a full 10 minutes, when all ghost sessions have disappeared, does everything return to normal. Has anyone encountered this problem, and if so, how did you resolve it? The Epic Online Services SDK is limited in its settings for Unity.
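One mitigation that might narrow this down (a sketch against the EOS C# SDK; the session name and the interface handle are assumptions from our own setup, and I'm not certain this matches your SDK version's exact signatures): explicitly destroy the named session on a clean exit, so that at least non-crash departures don't leave a ghost session behind until the broadcast window expires.

```csharp
using Epic.OnlineServices;
using Epic.OnlineServices.Sessions;

// Sketch: tear down the named session on a clean exit.
// This cannot help with hard crashes or network loss, where the
// session will still linger until its duration runs out server-side.
public static class SessionCleanup
{
    public static void DestroyOnQuit(SessionsInterface sessions, string sessionName)
    {
        var options = new DestroySessionOptions { SessionName = sessionName };
        sessions.DestroySession(ref options, null, (ref DestroySessionCallbackInfo info) =>
        {
            UnityEngine.Debug.Log($"DestroySession result: {info.ResultCode}");
        });
    }
}
```

For the crash cases, the usual direction is server-side heartbeats or shorter session timeouts, but as noted above, the EOS settings exposed to Unity are limited.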
Black Skybox in Play Mode when using META XR SDK v77 [Building Block Camera Rig]

In META XR SDK version 77, adding the [BuildingBlock] Camera Rig introduces a configuration issue that causes the Skybox to appear black when entering Play Mode.

Steps to Reproduce:
1. Create a new project with META XR SDK v77.
2. Add the [BuildingBlock] Camera Rig to the scene.
3. Press Play.
4. Observe that the Skybox is black instead of rendering normally.

Cause: The Camera components inside the Camera Rig have their Background Type set to Solid Color (black) instead of Skybox, so it needs to be set back to Skybox.

Solution / Workaround:
1. In the Hierarchy, locate and expand [BuildingBlock] Camera Rig.
2. Expand TrackingSpace, then CenterEyeAnchor.
3. Select the Camera component and expand the Environment section.
4. Change Background Type to Skybox.
5. (If necessary) Repeat the process for LeftEyeAnchor and RightEyeAnchor.

Notes: It's surprising that this detail was overlooked in v77, considering that previous META examples used a different Camera Rig configuration without this problem. The issue seems to stem from the updated default configuration of the [BuildingBlock] Camera Rig in v77.
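If you'd rather apply the workaround from code than click through the hierarchy in every scene, a small sketch using the standard Unity camera API (attach it to the [BuildingBlock] Camera Rig root; the component name is my own):

```csharp
using UnityEngine;

// Sketch: force every camera under the rig back to Skybox clearing at startup,
// mirroring the manual Background Type change described above.
public class SkyboxClearFlagsFix : MonoBehaviour
{
    void Start()
    {
        foreach (var cam in GetComponentsInChildren<Camera>(true))
        {
            cam.clearFlags = CameraClearFlags.Skybox;
        }
    }
}
```

This covers CenterEyeAnchor, LeftEyeAnchor, and RightEyeAnchor in one pass, including inactive ones, and is harmless on cameras already set to Skybox.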