MRUK: Object not automatically placed on floor in Meta XR SDK 78
Hi everyone, I am working on a Mixed Reality project using Meta XR SDK v78 with MR Utility Kit (MRUK) in Unity 6000.0.55f1. I want to automatically place a 3D machine model on the real-world floor when the scene loads. However, the object stays floating in mid-air instead of being positioned on the detected floor surface. Here's the code I tried:

var room = MRUK.Instance.GetCurrentRoom();
if (room != null)
{
    var anchor = room.GetSurface(MRUKAnchor.SceneLabels.FLOOR);
    if (anchor != null)
    {
        transform.position = anchor.transform.position;
        transform.rotation = anchor.transform.rotation;
    }
}

But the object does not align with the real-world floor; it stays in mid-air. Please let me know how to fix this. Thanks in advance.
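A likely fix, sketched under two assumptions worth verifying against the SDK v78 sources: that MRUK.Instance.RegisterSceneLoadedCallback and MRUKRoom.FloorAnchor exist under those names in this version. Two things usually cause the floating: GetCurrentRoom() runs before the scene model has loaded (so the lookup silently fails), and copying the floor anchor's rotation wholesale tips the model, since floor anchors are not oriented like upright world transforms.

using Meta.XR.MRUtilityKit;
using UnityEngine;

public class PlaceOnFloor : MonoBehaviour
{
    void Start()
    {
        // GetCurrentRoom() returns null until the scene model has loaded,
        // so defer placement to MRUK's scene-loaded callback.
        MRUK.Instance.RegisterSceneLoadedCallback(Place);
    }

    void Place()
    {
        var room = MRUK.Instance.GetCurrentRoom();
        if (room == null || room.FloorAnchor == null) return;

        // Take the floor's position, but keep the model upright:
        // copying the anchor's rotation wholesale tips the object over.
        var floor = room.FloorAnchor.transform;
        transform.position = floor.position;
        transform.rotation = Quaternion.Euler(0f, floor.eulerAngles.y, 0f);
    }
}

If the model's pivot is not at its base, also offset transform.position.y by half the model's height so it rests on the floor instead of intersecting it.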
How to use a custom QRCode tracking frequency in MRUK
In MRUK (MRUK.Trackers.cs), trackers are updated based on onTrackableUpdated events:

private void HandleTrackableUpdated(ref MRUKNativeFuncs.MrukTrackable trackable)
{
    if (_trackables.TryGetValue(trackable.space, out var component) && component)
    {
        UpdateTrackableProperties(component, ref trackable);
    }
}

The listener is defined in MRUK.Shared.cs:

MRUKNativeFuncs.MrukEventListener listener;
// ...
listener.onTrackableUpdated = OnTrackableUpdated;
// ...
MRUKNativeFuncs.RegisterEventListener(listener);

These events appear to be triggered by native code:

RegisterEventListener = MRUKNative.LoadFunction<RegisterEventListenerDelegate>("RegisterEventListener");

Is there any way to edit this code to increase the frequency at which onTrackableUpdated is fired? Will MRUK allow setting the tracker update frequency in the future?
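This doesn't answer whether the native rate can be raised, but a consumer-side sketch that can paper over a low event rate: since MRUK only writes the trackable's transform when onTrackableUpdated fires, a follower object can extrapolate between those writes. All names here are hypothetical; trackable is assumed to be the MRUK-driven transform.

using UnityEngine;

public class TrackablePoseSmoother : MonoBehaviour
{
    [SerializeField] Transform trackable; // the MRUK-driven transform

    Vector3 lastPos, velocity;
    Quaternion lastRot = Quaternion.identity;
    float lastUpdateTime;

    void LateUpdate()
    {
        if (trackable.position != lastPos)
        {
            // A fresh onTrackableUpdated write landed this frame:
            // estimate velocity from the time since the previous one.
            float dt = Mathf.Max(Time.time - lastUpdateTime, 1e-3f);
            velocity = (trackable.position - lastPos) / dt;
            lastPos = trackable.position;
            lastRot = trackable.rotation;
            lastUpdateTime = Time.time;
        }
        // Between native updates, follow the last pose extrapolated
        // by the observed velocity instead of holding a stale snapshot.
        transform.position = lastPos + velocity * (Time.time - lastUpdateTime);
        transform.rotation = lastRot;
    }
}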
Quest System Keyboard Input Box is gone on the new GameActivity entry point in Unity 6
Hello, we are encountering a significant issue with the system keyboard on Meta Quest builds using Unity 6.

The Problem
- Forced entry point: Unity 6 now defaults to the GameActivity application entry point. When we attempt to use the older Activity entry point (required for the previous keyboard system), the application crashes on startup, forcing us to use GameActivity.
- Missing input box: The GameActivity entry point uses the GameActivity Jetpack library, which has removed the on-screen input box that traditionally appears above the system keyboard. For mobile apps this is a clean design, but in Meta Quest VR it causes a major UX problem.

The UX Issue on Meta Quest
Since the Quest system keyboard is visually separate and detached from the in-game UI, users lose all visual confirmation of what they are typing. The visible input box on the system keyboard is necessary in VR to show users the text they are actively entering.

The Constraint
Migrating our entire project to a custom virtual keyboard is not feasible due to the complexity of supporting multiple languages and character sets.

Our Question
Is there an official or known way to restore the input box feature on the system keyboard, as it existed with the old Activity entry point, while still using the required GameActivity in Unity 6? Any guidance on modifying the GameActivity bridge or another low-level fix would be greatly appreciated. Thank you!
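Not a restoration of the missing input box, but one workaround sketch built only on Unity's standard TouchScreenKeyboard API (the label wiring is hypothetical): mirror the keyboard's current text into a world-space label of your own, which keeps the system keyboard's multi-language support while restoring visual feedback.

using TMPro;
using UnityEngine;

public class KeyboardEcho : MonoBehaviour
{
    [SerializeField] TMP_Text echoLabel; // a world-space label near the user

    TouchScreenKeyboard keyboard;

    public void BeginInput(string initialText = "")
    {
        keyboard = TouchScreenKeyboard.Open(initialText, TouchScreenKeyboardType.Default);
    }

    void Update()
    {
        // Live-echo whatever the system keyboard holds, restoring the
        // visual confirmation the GameActivity keyboard no longer shows.
        if (keyboard != null && keyboard.status == TouchScreenKeyboard.Status.Visible)
            echoLabel.text = keyboard.text;
    }
}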
My project runs on Quest 2 at 18 FPS (GPU-bound), no matter the scene or XR Rig
Our game's Quest performance has always hovered around 36 fps during development, which I've come to find was due to high batch counts and complex lit material shaders on every object. But now that I'm back to testing optimization on Android, I'm seeing terrible frame rates (10-18) no matter what I disable in the scene. I've even opened the DemoScene from the XR Interaction Toolkit, and it shows the same unexplained GPU-bound issue.

Context about my project:
- Unity 6.0.59f2 running on a Windows 11 PC.
- OpenXR project targeting a Quest 2 and SteamVR release.
- I have a ton of packages from the Unity Asset Store that I'm using to build the rig, environment, or mini-games. I've kept them up to date, checked that they are Unity 6 compatible, and removed any unused packages.

Things I've checked:
- The What If analysis from the Quest Runtime Optimizer tool shows the one blank scene I was testing had a total GPU frame time of 0.342 ms. Maybe you can help identify something in the RenderDoc capture.
- Every setting in Project Settings is set up the recommended way. The only warnings left in my Meta Project Validation are about depth and opaque textures being on, which are required for Stylized Water 3 (an asset that supports Quest) to run, plus one warning about Screen Space Ambient Occlusion; but from checking everywhere, that feature is off or deleted.
- Memory profiling. RAM usage in these empty test scenes is 1.54 GB, and I've sat in a scene for 20 minutes without it crashing or slowing down the way a RAM overflow would.
- There is no spamming of logs, warnings, or errors during runtime on the Quest. I checked by turning on Script Debugging and connecting the editor's console to the headset's output. All I found were some routine logs on startup.

Suspicious things I've noticed that only happen in my Quest builds (testing in-editor lacks these issues and runs above 50 fps):
- Dynamic resolution falls to 0 for my game (see the render-scale sketch after this post). I'm wondering if this is what makes foveated rendering glitch out so badly that the screen becomes super pixelated in vertical strips.
- Only when building am I notified of shader errors from packages we've been setting up, like GPU Instancer Pro, MicroSplat, and InfiniCloud. I mostly stopped the errors by deleting sample packages we didn't need, or in one case commenting out a line of code with an error about fog in the Universal Render Pipeline asset or something.

Please help; I've worked on this for four days, and I'm at the end of what I know to try.
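A diagnostic sketch for the dynamic-resolution suspicion, assuming URP and using only stock Unity APIs (the class name is illustrative): pin the render scale so dynamic resolution cannot collapse it while you profile the real GPU cost.

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using UnityEngine.XR;

public class PinRenderScale : MonoBehaviour
{
    void Start()
    {
        // Pin the per-eye texture scale so dynamic resolution cannot
        // collapse it to near zero while profiling (1.0 = native).
        XRSettings.eyeTextureResolutionScale = 1.0f;

        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
            urp.renderScale = 1.0f;
    }
}

If the frame rate recovers with the scale pinned, the GPU cost itself was fine and the dynamic-resolution heuristic was reacting to something else (often a stall, not raw shading load).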
Meta XR Simulator does not appear
Pulling my hair out here. I have the v81 simulator installed on Windows 11, Unity 6.2.10f1. I enable the simulator, but the window does not appear. Any suggestions on what to check? My goal is to use the headset in the editor and the simulator in Multiplayer Play Mode. When using XRIT and the Unity XR simulator in another project, this type of setup works quite well: I can have three virtual players using the simulator while I use the actual headset over PC Link in the editor. I want to replicate that setup using the Meta XR Simulator instead.
[BuildingBlock] Networked Avatar
Hi! I'm having a problem with the [BuildingBlock] Networked Avatar. I want all networked player avatars to show only head and hands, not the full body. For some reason, when playing on Quest, every player is always shown with a full body, even though I set Manifestation Flags and Active Manifestation to HeadHands and Active View to FirstPerson. Am I doing something wrong?
XR Headtracking breaks in build on Meta Quest 3 after headset goes to sleep
1. What happened
Recently we have begun encountering an error in our builds of a VR app for the Meta Quest 3 headset. Every now and then, after having our app open on the headset and setting it aside, where it then goes into sleep mode, an unexpected effect is permanently applied to the session. When you press the Meta home button on the right controller to bring up the overlay menu, the camera in Unity loses tracking of the headset: the camera is put into the floor of the scene and the picture does not move as you move your head around in VR. When you unpause, tracking is regained, but from then on, pausing causes this graphical error consistently.

We are uncertain when it began, but it is likely related to one or both of these recent changes to our project:
- Updating Unity with the new security patch (we went from Unity 6000.0.51f1 to Unity 6000.0.58f2)
- Recent updates to the Meta Quest 3 headset

The most likely suspect is the Meta Quest 3 update, as the error is reproducible on older builds made before the security patch. It is additionally not reproducible on Meta Quest 2 headsets, which supports this idea. We will also report this issue to Meta, as it is unclear whether it should be solved in Unity's XR systems or in Meta's software.

2. How can we reproduce it using the example you attached
I have attached a project built clean from scratch, following Meta's guide found here: https://developers.meta.com/horizon/documentation/unity/unity-project-setup/

Using the attached project, here is a step-by-step guide to reach this bug:
1. Make a build for Android with the attached project.
2. Load the APK onto a Meta Quest 3 headset.
3. Open the app and verify head and hand tracking are working in the scene.
4. Set the headset aside for around 30 seconds (this has been the most consistent timing for reproducing the bug).
5. Put the headset back on and verify that everything still works as expected (move your head and hands, pause using the Meta home button, etc.).
6. Set the headset aside again for another 30 seconds.
7. Put it on once again.
8. You should now see that, when you bring up the Meta overlay menu with the Meta home button, the camera in Unity gets stuck in the floor and loses tracking. When you unpause, tracking is regained, but from then on, tracking is lost every time you bring up the Meta overlay menu.

Additional info: the bug goes away again if you do another round of setting the headset aside for 30 seconds; doing so gives you a non-buggy session. Presumably further 30-second cycles continue to toggle between having the error and not having it.

Link to repo with the test project: https://github.com/PeterrificHC/XRSleepPauseError
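Not a root-cause fix, but a workaround sketch using only stock Unity XR APIs: request a recenter whenever the app regains focus, which is the moment the overlay menu closes in the repro above. Whether the runtime honors the request in the broken state is untested.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class RecenterOnFocus : MonoBehaviour
{
    void OnApplicationFocus(bool hasFocus)
    {
        if (!hasFocus) return;

        // Ask every active XR input subsystem to recenter its tracking
        // origin; TryRecenter may be a no-op on some runtimes.
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var s in subsystems)
            s.TryRecenter();
    }
}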
How to achieve recognition of a set of dynamic gesture movements rather than individual actions
Hello everyone, I'm wondering how to record a complete set of gesture movements for the Quest 3 to recognize, for example, having it detect when I stretch out my palm and wave, rather than only recognizing a palm held still in front of me. Could anyone help me with this? Thank you so much.

To be honest, the Quest 3's gesture-interaction experience has left me a bit frustrated. I have to rely on my thumb and index finger for extended periods to perform operations, like making a fist and extending my index finger to click, or pinching the objects or interfaces I want with my thumb and index finger. It was okay at first, but after prolonged use this puts a significant strain on my fingers. Moreover, it is not smooth enough to integrate fully into daily life. So I want to make some attempts of my own. I hope the Quest 3's gesture interaction can become as smooth as an excellent web front end: seamlessly integrating into daily life like the interactions shown in AR concept videos, rather than just generating a few virtual screens in front of you. At the very least, it shouldn't keep straining my index finger; my fingers are really sore. If you share the same thoughts or want to talk, feel free to email luoyiheng2005@outlook.com. I look forward to your emails.
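As far as I know there is no built-in sequence recognizer for this, but here is a minimal sketch of the idea: treat a wave as a short time sequence rather than a static pose, by watching the palm's lateral velocity for direction reversals. All names and thresholds are hypothetical; palm is assumed to be any transform that follows the tracked palm, such as a hand anchor from your rig.

using UnityEngine;

public class WaveDetector : MonoBehaviour
{
    [SerializeField] Transform palm;          // follows the tracked palm
    [SerializeField] float minSpeed = 0.6f;   // m/s side-to-side to count
    [SerializeField] int reversalsNeeded = 3; // direction flips per wave
    [SerializeField] float window = 1.0f;     // seconds to complete a wave

    Vector3 lastPos;
    float lastDir;
    int reversals;
    float firstReversalTime;

    void Update()
    {
        Vector3 vel = (palm.position - lastPos) / Mathf.Max(Time.deltaTime, 1e-5f);
        lastPos = palm.position;

        // Project the palm velocity onto the user's left-right axis.
        float lateral = Vector3.Dot(vel, Camera.main.transform.right);
        if (Mathf.Abs(lateral) < minSpeed) return;

        float dir = Mathf.Sign(lateral);
        if (lastDir != 0f && dir != lastDir)
        {
            if (Time.time - firstReversalTime > window)
            {
                reversals = 0;                 // too slow: restart the count
                firstReversalTime = Time.time;
            }
            if (++reversals >= reversalsNeeded)
            {
                reversals = 0;
                Debug.Log("Wave detected");    // fire your own event here
            }
        }
        lastDir = dir;
    }
}

The same pattern (buffer a signal, look for a temporal shape) extends to other dynamic gestures; only the projected signal and the shape test change.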
Issue with APK-to-APK Launch on Meta Quest — Unexpected Return to Lobby
Here's the situation:
- APK(1) launches APK(2) using an intent. This usually works, but occasionally it returns to the Meta Quest lobby (immersive home) instead of opening APK(2).
- APK(2) then launches APK(3). This frequently results in returning to the lobby (immersive home), and APK(3) does not open.
- All APKs are installed via "Unknown Sources" and are intended to run in VR mode.
- Tested on Meta Quest 2/3/3S, system version 81.

Has anyone experienced similar behavior or found a reliable way to ensure smooth APK-to-APK transitions without falling back to the lobby (immersive home)?
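For reference, a sketch of the explicit-launch pattern being described, using only standard Android calls through Unity's AndroidJavaClass (the package name passed in is a placeholder); it may help rule out intent construction as the cause.

using UnityEngine;

public static class AppLauncher
{
    // packageName is hypothetical, e.g. "com.example.app2".
    public static void Launch(string packageName)
    {
        using var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
        using var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity");
        using var pm = activity.Call<AndroidJavaObject>("getPackageManager");
        var intent = pm.Call<AndroidJavaObject>("getLaunchIntentForPackage", packageName);
        if (intent == null)
        {
            Debug.LogWarning($"No launch intent found for {packageName}");
            return;
        }
        // NEW_TASK keeps the launch from depending on the caller's task state.
        intent.Call<AndroidJavaObject>("addFlags", 0x10000000); // FLAG_ACTIVITY_NEW_TASK
        activity.Call("startActivity", intent);
        intent.Dispose();
    }
}

Even if the intent side checks out, the fallback to the lobby you describe may still be OS-side behavior for sideloaded apps, so this only eliminates one variable.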
Hi everyone, I'm developing a standalone VR app for Meta Quest 3 that uses hand tracking (no controllers). However, when users pinch their thumb and index finger, the System Menu unexpectedly appears, interrupting the experience. This seems to be the default system gesture for opening the Meta Menu in hand-tracking mode. For a VR application this creates a very uncomfortable and distracting experience, especially when users accidentally trigger the system UI during normal interaction. I'd like to ask:
1. Is there any official way or API to disable or suppress the system menu gestures (like the pinch or palm-up menu)?
2. Can this be done at runtime through Unity / OpenXR / Meta SDK settings?
3. If not, are there enterprise or MDM settings (e.g., ManageXR, ArborXR, etc.) that allow disabling these gestures?

Any confirmed solution or workaround would be greatly appreciated, even a partial one (for example, disabling only the pinch gesture while keeping hand input active). Thanks in advance!
Naetib