Physical transition between rooms in XR Quest 3 - How to?
Hello, I'm making a mixed reality app and have scanned two rooms in my house. I'm using the Passthrough, MR Utility Kit, and Effect Mesh building blocks, but I can't move between rooms without the screen being covered in grey, even though I can see the other room being rendered in the effect mesh. How can I disable this "out-of-bounds" effect so that I can physically transition between the two rooms? The first image is from the room where I started the app; the second is from the second room looking towards the first.
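One thing worth checking is which scanned room MRUK thinks the headset is currently in, since the grey effect tends to appear when you leave the bounds of the active room. Below is a minimal sketch, assuming MRUK v60+ where `MRUK.Instance.Rooms` and `MRUKRoom.IsPositionInRoom(...)` are available (verify the exact names against your SDK version); the `head` reference is an assumption you would wire to your rig's centre-eye anchor.

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit;

// Sketch: each frame, find which scanned MRUKRoom contains the headset
// and log transitions, so per-room content (e.g. EffectMesh objects)
// can be toggled when the user physically walks through a doorway.
public class RoomTransitionWatcher : MonoBehaviour
{
    [Tooltip("Head transform, e.g. the OVRCameraRig CenterEyeAnchor")]
    public Transform head;

    private MRUKRoom _currentRoom;

    void Update()
    {
        if (MRUK.Instance == null || head == null) return;

        foreach (var room in MRUK.Instance.Rooms)
        {
            // testVerticalBounds: false, so stepping through a doorway
            // still counts as being inside the room
            if (room.IsPositionInRoom(head.position, testVerticalBounds: false))
            {
                if (room != _currentRoom)
                {
                    _currentRoom = room;
                    Debug.Log($"Entered room: {room.name}");
                    // e.g. enable this room's EffectMesh, disable the other's
                }
                break;
            }
        }
    }
}
```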
Tracked hand pushing away menu (canvas) attached to other hand
I am keeping my left hand almost completely still in the video demo below, but something pushes the left hand or the menu away when I get the right hand near the other hand/menu. It's hard to tell whether this is a tracking issue caused by one hand being near the other, or the hand interacting with the menu, menu item colliders, or rigidbodies.

Video of error in demo
Video of Unity project setup

I have an empty GameObject called 'menu-hand_attatcher' with this script on it:

```csharp
using System.Collections;
using System.Linq;
using UnityEngine;
using Oculus.Interaction; // for OVRHand & OVRSkeleton

public class AttachMenuToHand : MonoBehaviour
{
    [Tooltip("The disabled Canvas or prefab you want to spawn")]
    public GameObject menuPrefab; // assign your menu_0 prefab here

    [Tooltip("Which bone/joint to anchor the menu to")]
    public OVRSkeleton.BoneId anchorJoint = OVRSkeleton.BoneId.Hand_WristRoot;

    [Tooltip("Offset from that joint in metres")]
    public Vector3 localOffset = new Vector3(0f, 0.05f, 0.06f);

    private OVRHand hand; // we'll find this automatically
    private bool attached;

    private void Awake()
    {
        // Auto-find the first active OVRHand in the scene (left or right)
        hand = FindFirstObjectByType<OVRHand>();
    }

    private IEnumerator Start()
    {
        if (hand == null)
        {
            Debug.LogError("❌ No OVRHand component found anywhere in the scene!", this);
            yield break;
        }

        // Grab the OVRSkeleton attached to that hand
        OVRSkeleton skel = hand.GetComponent<OVRSkeleton>();
        if (skel == null)
        {
            Debug.LogError("❌ OVRHand has no OVRSkeleton component!", this);
            yield break;
        }

        // Wait until the system has tracked the hand & built out all bones
        while (!hand.IsTracked || skel.Bones.Count == 0)
        {
            yield return null;
        }

        // Find the specific bone (wrist, index tip, etc.) by BoneId
        OVRBone targetBone = skel.Bones.FirstOrDefault(b => b.Id == anchorJoint);
        if (targetBone == null)
        {
            Debug.LogError($"❌ Couldn't find bone {anchorJoint} on the skeleton!", this);
            yield break;
        }
        Transform jointTransform = targetBone.Transform;

        // Spawn or enable the menu prefab
        GameObject menuInstance = menuPrefab.activeSelf ? menuPrefab : Instantiate(menuPrefab);
        menuInstance.SetActive(true);
        menuInstance.transform.SetParent(jointTransform, worldPositionStays: false);
        menuInstance.transform.localPosition = localOffset;
        menuInstance.transform.localRotation = Quaternion.identity;
        attached = true;
    }

    private void LateUpdate()
    {
        if (attached && Camera.main != null)
        {
            // Keep the menu facing the user's camera
            transform.LookAt(Camera.main.transform.position, Vector3.up);
        }
    }
}
```

The anchor joint is set to XR Hand_Start. Items in the menus have a collider + rigidbody (no gravity) to allow them to act as triggers.

It's not so clear in the video, but it appears that going near the colliders for the buttons freaks it out and flips the menu around; so does just crossing the right hand over the left (the menu is anchored to the left wrist), or getting it near the canvas. So it's hard to tell what it's interacting with.

1. What can I try using the method I'm already using? (I'm very new to Unity and coding, so redoing things or doing them a different way could take me a lot of time, and I don't have much.)
2. If I were to redo it - is there a better way to do this menu/canvas and hand attachment?
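Since the hand-tracking rig typically carries physics capsules, the "pushing" described above is most likely the physics engine resolving contacts between those capsules and the menu's rigidbodies. A low-effort thing to try is to stop physics from resolving those contacts at all, while keeping trigger callbacks working. The sketch below uses only stock Unity APIs; the "Hands" and "Menu" layer names are assumptions you would create yourself in the project's Tags and Layers settings.

```csharp
using UnityEngine;

// Sketch: stop the physics engine from pushing the menu and the hand apart.
// Assumes hand-physics objects sit on a "Hands" layer and the menu/buttons on
// a "Menu" layer (placeholder names - create them in Project Settings).
public class MenuPhysicsSetup : MonoBehaviour
{
    void Awake()
    {
        int hands = LayerMask.NameToLayer("Hands");
        int menu = LayerMask.NameToLayer("Menu");
        if (hands >= 0 && menu >= 0)
        {
            // Contacts between the two layers are no longer resolved,
            // but OnTrigger* callbacks on trigger colliders still fire.
            Physics.IgnoreLayerCollision(hands, menu, true);
        }

        // Belt and braces: make every collider under the menu a trigger and
        // every rigidbody kinematic, so nothing can be physically shoved.
        foreach (var col in GetComponentsInChildren<Collider>())
            col.isTrigger = true;
        foreach (var rb in GetComponentsInChildren<Rigidbody>())
            rb.isKinematic = true;
    }
}
```

Attach this to the menu root; trigger colliders still fire `OnTriggerEnter` as long as at least one side has a rigidbody, so the existing button logic should keep working.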
Here's the script on my menus:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class MenuNavigator : MonoBehaviour
{
    [Tooltip("Prefab of the submenu to open when this button is poked")]
    public GameObject submenuPrefab;

    // debounce flags
    private bool _hasActivated = false;
    private bool _ignoreNextTrigger = true;
    private Transform _canvasRoot;

    void Awake()
    {
        // cache the Canvas transform so we know where our menus live
        Canvas c = GetComponentInParent<Canvas>();
        if (c != null)
            _canvasRoot = c.transform;
        else
            Debug.LogError("[MenuNavigator] No Canvas found!", this);
    }

    void OnEnable()
    {
        // every time this menu (and its buttons) become active:
        _ignoreNextTrigger = true;
        _hasActivated = false;
    }

    void OnTriggerEnter(Collider other)
    {
        if (_ignoreNextTrigger) return; // ignore the initial overlap
        if (_hasActivated) return;      // and only fire once per exit/enter

        // 1) Spawn the next submenu prefab under the Canvas
        if (submenuPrefab != null && _canvasRoot != null)
        {
            var newMenu = Instantiate(submenuPrefab, _canvasRoot);
            newMenu.transform.localPosition = Vector3.zero;
            newMenu.transform.localRotation = Quaternion.identity;
            newMenu.transform.localScale = Vector3.one;
        }
        else
        {
            Debug.LogWarning("[MenuNavigator] Missing submenuPrefab or Canvas!", this);
        }

        // 2) Find *your* menu panel: the direct child of the Canvas
        Transform panel = transform;
        while (panel != null && panel.parent != _canvasRoot)
        {
            panel = panel.parent;
        }
        if (panel != null)
        {
            Destroy(panel.gameObject);
        }
        else
        {
            Debug.LogWarning("[MenuNavigator] Couldn't find menu panel root!", this);
        }

        _hasActivated = true;
    }

    void OnTriggerExit(Collider other)
    {
        // 1st exit after spawn turns off ignore; subsequent exits clear the activated flag
        if (_ignoreNextTrigger)
            _ignoreNextTrigger = false;
        else
            _hasActivated = false;
    }
}
```

Many thanks :)
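One weakness in the `MenuNavigator` script above is that `OnTriggerEnter` fires for *any* collider, including the wrist the menu is parented to or the other hand brushing past, which would match the erratic behaviour in the video. A common fix is to react only to a dedicated fingertip collider. This is a minimal sketch assuming a hypothetical "FingerTip" tag that you would assign to a small trigger collider on the index tip:

```csharp
using UnityEngine;

// Sketch: only react to a dedicated fingertip collider, so the wrist, palm,
// or the other hand brushing the canvas can't activate buttons.
// Assumes a (hypothetical) "FingerTip" tag defined in the Inspector and
// assigned to a small trigger collider on the index tip.
public class PokeFilterExample : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("FingerTip"))
            return; // ignore every collider that isn't the poking fingertip

        Debug.Log($"Button poked by {other.name}");
        // ...open submenu here...
    }
}
```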
OVR Plugin failure for detection space / importing Room-Data into Unity
(NOT a duplicate post.) Someone else posted the same issue on the Meta community "get help" forum. That user thought installing the "Vive Business Streaming" software caused the issue to manifest, but I have never installed that, so I believe there is a deeper underlying issue here.

Trying to run the MR samples via Link results in the error below, and the room scene/objects don't materialize at all. The PassthroughRelighting demo character immediately falls through the world if you move him at all. (Yes, I set up the room scene before starting Link.)

```
[OVRPlugin] [XRCMD][failure] [XR_ERROR_HANDLE_INVALID]: xrLocateSpace(*(XrSpace*)space, baseSpace, ToXrTime(GetTimeInSeconds()), &spaceLocation), arvr\projects\integrations\OVRPlugin\Src\Util\CompositorOpenXR.cpp:11831 (arvr\projects\integrations\OVRPlugin\Src\Util\CompositorOpenXR.h:302)
UnityEngine.Debug:LogWarning (object)
OVRManager:OVRPluginLogCallback (OVRPlugin/LogLevel,intptr,int) (at ./Library/PackageCache/com.meta.xr.sdk.core@60.0.0/Scripts/OVRManager.cs:1984)
OVRPlugin:TryLocateSpace (ulong,OVRPlugin/TrackingOrigin,OVRPlugin/Posef&,OVRPlugin/SpaceLocationFlags&) (at ./Library/PackageCache/com.meta.xr.sdk.core@60.0.0/Scripts/OVRPlugin.cs:10180)
OVRLocatable:TryGetSceneAnchorPose (OVRLocatable/TrackingSpacePose&) (at ./Library/PackageCache/com.meta.xr.sdk.core@60.0.0/Scripts/OVRAnchor/OVRAnchorComponents/OVRLocatable.cs:170)
Meta.XR.MRUtilityKit.MRUK/<LoadSceneFromDevice>d__31:MoveNext () (at ./Library/PackageCache/com.meta.xr.mrutilitykit@60.0.0/Core/Scripts/MRUK.cs:368)
System.Runtime.CompilerServices.AsyncMethodBuilderCore/MoveNextRunner:Run ()
OVRTask`1<bool>:SetResult (bool) (at ./Library/PackageCache/com.meta.xr.sdk.core@60.0.0/Scripts/Util/Async/OVRTask.cs:216)
OVRManager:UpdateHMDEvents () (at ./Library/PackageCache/com.meta.xr.sdk.core@60.0.0/Scripts/OVRManager.cs:2767)
OVRManager:Update () (at ./Library/PackageCache/com.meta.xr.sdk.core@60.0.0/Scripts/OVRManager.cs:2723)
```
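The stack trace shows MRUK's `LoadSceneFromDevice` failing to locate the scene anchors. A simple diagnostic is to listen for MRUK's scene-loaded callback and check whether any rooms actually arrived. This is a sketch assuming MRUK v60's `SceneLoadedEvent` and `Rooms` members (verify against your package version); note that room capture itself cannot be triggered over Link, so if no rooms load, Space Setup has to be redone on the headset.

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit;

// Diagnostic sketch for the XR_ERROR_HANDLE_INVALID / missing-room problem:
// after MRUK has loaded, check whether any rooms actually arrived.
public class SceneLoadDiagnostic : MonoBehaviour
{
    void Start()
    {
        if (MRUK.Instance == null)
        {
            Debug.LogError("MRUK is not in the scene.");
            return;
        }
        MRUK.Instance.SceneLoadedEvent.AddListener(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        int count = MRUK.Instance.Rooms.Count;
        Debug.Log($"MRUK loaded {count} room(s).");
        if (count == 0)
        {
            Debug.LogWarning("No scene data - redo Space Setup on the headset " +
                             "and check that spatial data is shared over Link.");
        }
    }
}
```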
Haptic buffer no longer supported on Quest 3 controllers
Using Unity XR Input, the Quest 3 controllers return supportsBuffer as false. However, I was fairly certain that I had seen this return true several months ago, and given the supposed haptic capabilities of the Quest 3 controllers, I would expect it to be supported. This is with version 1.7.0 of the Input System package and Unity 2022.2.18f1.
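For anyone wanting to reproduce the check, this sketch queries the controller's haptic capabilities through Unity's XR Input API and falls back to a simple impulse when buffered haptics report as unsupported (a reasonable defensive pattern regardless of what the runtime reports):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: query the right controller's haptic capabilities via Unity XR
// Input and fall back to simple impulses when buffered haptics are
// unavailable.
public class HapticCapabilityCheck : MonoBehaviour
{
    void Start()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!device.isValid) return;

        if (device.TryGetHapticCapabilities(out HapticCapabilities caps))
        {
            Debug.Log($"supportsImpulse={caps.supportsImpulse}, " +
                      $"supportsBuffer={caps.supportsBuffer}, " +
                      $"bufferFrequencyHz={caps.bufferFrequencyHz}");

            if (caps.supportsBuffer)
            {
                // Buffered haptics: raw amplitude samples played back
                // at bufferFrequencyHz.
                byte[] samples = new byte[256];
                for (int i = 0; i < samples.Length; i++)
                    samples[i] = (byte)(i % 2 == 0 ? 255 : 0);
                device.SendHapticBuffer(0, samples);
            }
            else
            {
                // Fallback: a single 0.1 s impulse at half amplitude.
                device.SendHapticImpulse(0, 0.5f, 0.1f);
            }
        }
    }
}
```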
How to access Quest 3's camera image (or passthrough image) directly within the Unity app
In developing a Unity app for the Oculus Quest 3, we want to get the Quest 3's camera image within the app and recognize specific objects in that image.

1) Is there any way to get the camera image of the Oculus Quest 3, or to access the passthrough image directly?
2) The Quest 3 has a feature that captures the current mixed reality screen, and I would like to implement that directly within the Unity app. Is there an API that can capture the Quest 3's camera (or passthrough) screen from within the app, and if not, is there a plan to support it in the future?

Your advice and help are appreciated.
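On question 2, what an app can capture from inside Unity is its own rendered output, using the stock `ScreenCapture` API, as sketched below. An important caveat: passthrough is composited by the Quest system outside the app, so passthrough regions typically come out black/empty in such a capture, and this is not a route to the raw camera feed.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: capture one frame of the app's own rendered output. Passthrough
// is composited by the OS outside the app, so passthrough regions are
// likely to appear black/empty in the captured texture.
public class EyeBufferCapture : MonoBehaviour
{
    public IEnumerator CaptureOneFrame()
    {
        // Wait for rendering to finish so the back buffer is complete.
        yield return new WaitForEndOfFrame();

        Texture2D tex = ScreenCapture.CaptureScreenshotAsTexture();
        Debug.Log($"Captured {tex.width}x{tex.height} frame");
        // ...feed tex into an object-recognition pipeline here...
        Destroy(tex); // captured textures must be destroyed manually
    }
}
```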
Building Blocks Scripts License Question
Hi, I have a question about Building Blocks in Unity. Is it possible to modify them (e.g. change the default mesh to my own) in order to create a project for the Meta Quest 3? I'm mainly asking whether the scripts of these Building Blocks can be modified; it's not entirely clear to me from the License Agreement. For example, section 1.2.1 says: "You may not modify or create derivative works from any SDK or its component (other than sample source code described in this Section or expressly authorized by the documents accompanying the SDK)".

Passthrough Mask in Unity with Quest 3
I've been trying to figure out how to replace the passthrough video on the ceiling with a VR scene, like in the First Encounters demo or the Discover app on GitHub. I've tried working with the layers and copying the shaders from the Discover app, but haven't made any progress. Does anyone know how to mask the passthrough video to reveal the VR scene underneath? I would be super grateful for any advice. Thanks!

Unity Meta XR SDK: Is it possible to trigger a teleport through a ray interaction?
I want my player to be able to point towards an object with a Teleport Interactable attached, press it (ray press), and get teleported. Currently I can use the teleport locomotion (with controller or hands) to trigger a teleport, but is it possible through a ray interaction?
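If wiring the Interaction SDK's ray interactor into its locomotion system proves awkward, the same result can be approximated without it: raycast from the controller and move the rig on a button press. This is a sketch, not the Interaction SDK's mechanism; `rig`, `rayOrigin`, and the teleport layer are assumptions you would wire to your own OVRCameraRig and controller anchor.

```csharp
using UnityEngine;

// Sketch of a ray-press teleport without the Interaction SDK's locomotion
// rig: raycast from the controller, and on a trigger press snap the camera
// rig to the hit point.
public class RayPressTeleport : MonoBehaviour
{
    public Transform rig;           // e.g. the OVRCameraRig root (assumption)
    public Transform rayOrigin;     // e.g. RightControllerAnchor (assumption)
    public LayerMask teleportMask;  // layer of your teleport targets
    public float maxDistance = 20f;

    void Update()
    {
        // OVRInput is part of the Meta XR core SDK; swap for your input system.
        bool pressed = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger,
                                        OVRInput.Controller.RTouch);
        if (!pressed) return;

        if (Physics.Raycast(rayOrigin.position, rayOrigin.forward,
                            out RaycastHit hit, maxDistance, teleportMask))
        {
            // Keep the rig's height; snap it to the hit point on the ground.
            rig.position = new Vector3(hit.point.x, rig.position.y, hit.point.z);
        }
    }
}
```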
Quest3 - Head rotation problem in Unity play mode
Hi, when connecting the Quest 3 via Air Link to Unity (play mode), head rotations around the Z axis are not tracked. The tracking works fine in Unity on the desktop, but in the headset there is no tilt; the scene seems to stay level at all times. E.g. a cube will not tilt when I tilt my head. Tracking the head sideways and up/down works. It seems there is a kind of billboard function that prevents tilt rotations. Any suggestions?
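A quick way to narrow this down is to log the raw head pose from the XR input subsystem, bypassing the camera rig entirely. If the roll component changes here while the camera in the scene does not tilt, the problem lies in the rig/camera setup rather than in tracking over Air Link. A minimal diagnostic sketch:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Diagnostic sketch: log the raw head rotation reported by the XR input
// subsystem each frame, independent of any camera rig in the scene.
public class HeadRollProbe : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (head.isValid &&
            head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion q))
        {
            Debug.Log($"Head roll: {q.eulerAngles.z:F1} deg");
        }
    }
}
```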