Recent Discussions
Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick
Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table (Canvas in world space, RectTransform placed at the table center plus a slight lift so it sits on the surface, all via script).

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" ~0.5 m up, and pushing it again makes it jump back down. This happened both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547), while the dialog's world Y stayed constant.

Solved · Evi.Brenner · 4 months ago · Honored Guest · 44 Views · 0 likes · 1 Comment
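The finding above (the camera rig height toggling while the UI stays put) can be confirmed with a small probe script. This is a diagnostic sketch, not the poster's code; the `dialog` reference is an assumption for illustration:

```csharp
using UnityEngine;

// Diagnostic sketch: logs whenever the main camera's world Y jumps,
// alongside the dialog's world Y, to show which transform is moving.
public class HeightJumpProbe : MonoBehaviour
{
    [SerializeField] private Transform dialog; // world-space Canvas root (assumed reference)

    private float _lastCameraY;

    private void LateUpdate()
    {
        float cameraY = Camera.main.transform.position.y;
        if (Mathf.Abs(cameraY - _lastCameraY) > 0.1f)
        {
            Debug.Log($"Camera Y jumped: {_lastCameraY:F3} -> {cameraY:F3}; dialog world Y: {dialog.position.y:F3}");
        }
        _lastCameraY = cameraY;
    }
}
```

If the camera Y toggles by ~0.5 m on thumbstick input while the dialog's Y is constant, the culprit is the rig height (for example a simulator or locomotion height toggle), not the dialog placement code.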
Distance Grab - Fails after 1 attempt

Hi, I'm having a problem with the distance grab function using the Meta Interaction SDK. Using Quick Actions, I add the utility to my grabbable object. When I test the grab, it works fine the first time. However, after the first 'pinch', the second pinch logs the console error 'Setting linear velocity of a kinematic body is not supported'. The resulting behaviour is that the grabbable object moves only halfway to my pinching hand and then stops. I'm using the built-in render pipeline and the OculusXR plugin. Any help would be great!

Solved · TransitTech_2025 · 4 months ago · Honored Guest · 43 Views · 0 likes · 1 Comment
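In recent Unity versions, writing a velocity to a kinematic Rigidbody raises exactly the quoted error instead of being silently ignored. A guarded write like the sketch below avoids the exception; note this is a generic workaround sketch, not Interaction SDK code, and the real fix is usually finding which component toggles isKinematic mid-grab:

```csharp
using UnityEngine;

// Workaround sketch: only write a velocity when the body is dynamic.
// (On Unity versions before the rename, use `body.velocity` instead
// of `body.linearVelocity`.)
public static class RigidbodyVelocityGuard
{
    public static void SetLinearVelocitySafe(Rigidbody body, Vector3 velocity)
    {
        if (body.isKinematic)
        {
            // Kinematic bodies ignore velocities; skipping the write avoids
            // "Setting linear velocity of a kinematic body is not supported".
            return;
        }
        body.linearVelocity = velocity;
    }
}
```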
Colocation and Shared Spatial Anchors result in wrong frame of reference

Hello everyone! I'm developing a networked experience using Unity 6 with Netcode for GameObjects and the Meta XR SDK. I am trying to get Shared Spatial Anchors working with Colocation so that the players share the same frame of reference in Mixed Reality. I managed to get Colocation and Shared Spatial Anchor sharing to work, but when I try to align my player rig using the Shared Spatial Anchor from the host, there is a misalignment in the scene. No obvious errors or warnings are being thrown. I am using the code in the documentation as well as a video tutorial, "XR Dev Rob - Colocation with Meta's Shared Spatial Anchors & the new Colocation Discovery API", as my main point of reference. Below is my alignment code:

```csharp
if (await unboundAnchor.LocalizeAsync())
{
    Debug.Log($"Anchor localized successfully, UUID: {unboundAnchor.Uuid}");
    var anchorGameObject = GameObject.Instantiate(_anchorPrefab);
    var spatialAnchor = anchorGameObject.GetOrAddComponent<OVRSpatialAnchor>();
    anchorGameObject.name = $"Anchor_{unboundAnchor.Uuid}";
    unboundAnchor.BindTo(spatialAnchor);
    _alignmentManager.AlignUserToAnchor(spatialAnchor);
    return;
}
```

with AlignUserToAnchor:

```csharp
public void AlignUserToAnchor(OVRSpatialAnchor anchor)
{
    if (anchor == null || anchor.Localized == false)
    {
        Debug.LogError("Anchor is not localized yet.");
        return;
    }
    StartCoroutine(AlignmentCoroutine(anchor));
}

private IEnumerator AlignmentCoroutine(OVRSpatialAnchor anchor)
{
    var anchorTransform = anchor.transform;
    for (int alignmentCount = 2; alignmentCount > 0; alignmentCount--)
    {
        _cameraRigTransform.position = Vector3.zero;
        _cameraRigTransform.eulerAngles = Vector3.zero;
        yield return null;

        Vector3 offset = anchorTransform.InverseTransformPoint(Vector3.zero);
        Quaternion inverseYaw = Quaternion.Euler(0f, -anchorTransform.eulerAngles.y, 0f);
        _cameraRigTransform.position = offset;
        _cameraRigTransform.rotation = inverseYaw;

        Debug.Log($"Aligned camera rig position: {_cameraRigTransform.position}, rotation {_cameraRigTransform.eulerAngles}");
        yield return new WaitForEndOfFrame();
    }
    Debug.Log($"Alignment complete");
}
```

I am wondering if I am missing something that needs to be initialized to get the spatial data in order so that the anchors can be localized correctly, or if it is something else I am missing. Thank you for your help! Kind regards

Solved · RhinoxDev · 4 months ago · Explorer · 94 Views · 0 likes · 1 Comment
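One thing worth checking in the coroutine above (a hypothesis, not a confirmed fix): the position offset is computed with the anchor's full inverse transform (`InverseTransformPoint` includes the anchor's pitch and roll), while the rig rotation removes only yaw. If the anchor is not perfectly level, position and rotation are derived from different rotations, which can show up as a small but consistent misalignment. A yaw-consistent variant derives both from the same yaw-only rotation:

```csharp
// Sketch: compute the rig pose as the inverse of the anchor's
// yaw-plus-position pose, so position and rotation stay consistent.
Quaternion inverseYaw = Quaternion.Euler(0f, -anchorTransform.eulerAngles.y, 0f);
_cameraRigTransform.rotation = inverseYaw;
_cameraRigTransform.position = inverseYaw * (-anchorTransform.position);
```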
Scene Change interaction issues

I'm very new to this world and the learning curve is real. I've been teaching myself Unity all summer for an art exhibit in two weeks where I plan to show this VR project. It's close, but I can't seem to get the controllers to interact with the UI Buttons and change scenes. I have four UI canvases (each with a UI Button child) that connect to the four different scenes. I started with Unity's XR Interaction Toolkit and switched to Meta Interaction SDK with Building Blocks, hoping it would simplify the process. I've read so many different ways to get the ray to change scenes, but so far it isn't working. At this point, I think I have too many components on my game objects, so I'm not sure how to troubleshoot.
- On the one UI Canvas I'm testing: Graphic Raycaster, Tracked Device Graphic Raycaster, and Pointable Canvas.
- On the UI Button: a Button component with OnClick set to the scene via a SceneLoader script (which was working with mouse click, but now isn't), a Pointable Unity Event Wrapper with When Release set to the scene it should go to, and Ray Interactable.
- On the Event System: XR UI Input Module and Pointable Canvas Module.
I'm also not sure it isn't something I'm missing in the Controller Tracking Settings. I also added XR Ray Interactor to the Building Block GameObjects for left and right. At this point I'd be happy to start from scratch with new UI Canvas GameObjects if it means getting this to work, but I need to understand the most streamlined process to take first. I'd be very grateful for guidance. Can anyone help?

Solved · taciejones · 4 months ago · Explorer · 94 Views · 0 likes · 9 Comments
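On the SceneLoader piece: a minimal loader wired to a Button's OnClick looks like the sketch below (a generic example, not the poster's script; scene names are whatever is registered in Build Settings). The harder part is the event plumbing: Tracked Device Graphic Raycaster, XR UI Input Module, and XR Ray Interactor belong to Unity's XR Interaction Toolkit, while Pointable Canvas and Pointable Canvas Module belong to Meta's Interaction SDK, and mixing both stacks on one canvas is a common source of silently dead buttons, so committing to one stack is usually the first troubleshooting step.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal scene loader for a UI Button's OnClick event. Assumes the
// target scenes have been added to the project's Build Settings.
public class SceneLoader : MonoBehaviour
{
    public void LoadScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```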
Instant Content Placement Rotation is slightly off

Issue: Whenever I place an object on a wall or the floor, the rotation is slightly off. Example: I point at my floor and hit the index trigger to Instantiate the prefab; however, its rotation is not 90° and it is slightly tilted. Is there any solution for this that does not require a self-written placement script?

Solved · STAcxa · 5 months ago · Explorer · 41 Views · 0 likes · 2 Comments
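If a small post-placement correction is acceptable, the tilt can be snapped out right after instantiation. This is a generic sketch, not an Instant Content Placement API; `hitNormal` is assumed to come from whatever raycast the placement used:

```csharp
using UnityEngine;

// Sketch: rotate the placed object so its up axis matches the surface
// normal exactly, removing small tracking tilt after placement.
public static class PlacementSnap
{
    public static void AlignUpToNormal(Transform placed, Vector3 hitNormal)
    {
        placed.rotation = Quaternion.FromToRotation(placed.up, hitNormal.normalized) * placed.rotation;
    }
}
```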
Is there an OVRCameraRig without all the bloat in the latest Interaction SDK?

For quick iterations and prototyping, I depend on the Building Blocks version of the OVRCameraRig, which in the past let you add it to your scene with one click and then customize it according to your needs (e.g. passthrough or physics grab). Now there is one bloated version that comes with EVERYTHING, including a locomotion controller that adds things like a vignette, and the setup has so many complex dependencies that you cannot just delete/disable the relevant parts under Interactors in the hierarchy. Is there a bloat-free version like in the past, where I can simply add hand grab functionality without adding 8 different interactors (simultaneous hand + controller etc.)? Working with Meta Interaction SDK v77 in Unity 6.

Solved · Iseldiera · 5 months ago · Protege · 85 Views · 0 likes · 6 Comments
Facing error using NetworkedAvatar (38.0.1) with Custom Matchmaking

I am facing an error when adding [Multiplayer Block] NetworkedAvatar (ver. 38.0.1) to the scene. The error is associated with a script which seems to be missing from the package. It is triggered in AvatarEntity.cs (line 173):

```csharp
var animationBehavior = GetComponent<OvrAvatarAnimationBehavior>();
animationBehavior.enabled = true;
```

I also tried installing 35.2.0 but got other errors. I believe the AvatarEntity code is built on some older version of the Avatar SDK which used to have the class OvrAvatarAnimationBehaviour.

Solved · xrdev2023 · 5 months ago · Explorer · 79 Views · 2 likes · 5 Comments
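If the error at that call site is a runtime NullReferenceException (the component simply not being present) rather than a compile error, a defensive guard keeps the avatar usable until the package versions are reconciled. This is a workaround sketch for the quoted lines, not an official fix:

```csharp
// Guarded version of the quoted call site: tolerate the animation
// behavior component being absent in this package version.
var animationBehavior = GetComponent<OvrAvatarAnimationBehavior>();
if (animationBehavior != null)
{
    animationBehavior.enabled = true;
}
```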
Unity Native Texture Not Working in Pass-through Mode

I have managed to replicate this repo to show a native texture in VR mode. However, when I switch to pass-through mode, the texture is not showing. I have dug into it a bit and found that it is because of a pipeline conflict, which leaves the native plugin uninitialized. I am aware of the recently released Unity-PassthroughCameraAPISamples, but what I am trying to do is show a texture that is not from the local camera. I also know about WebCamTexture, but I wonder if it is possible to show a texture not from the local camera. Everything works well in VR mode; here is the log when switching to pass-through mode:

```
2025-07-23 16:42:12.288 2195-2218 BufferQueueProducer com.oculus.vrshell E [SurfaceTexture-115-2195-274](id:893000001ad,api:1,p:1137,c:2195) dequeueBuffer: BufferQueue has been abandoned
2025-07-23 16:42:12.288 1137-1137 CompositionEngine surfaceflinger E ANativeWindow::dequeueBuffer failed for display [com.oculus.android_panel_app.AndroidPanelLayer-com.oculus.systemux-mr_ed] with error: -19
2025-07-23 16:42:12.288 1137-1137 CompositionEngine surfaceflinger W Dequeuing buffer for display [com.oculus.android_panel_app.AndroidPanelLayer-com.oculus.systemux-mr_ed] failed, bailing out of client composition for this frame
2025-07-23 16:42:12.288 2195-20352 BufferQueueProducer com.oculus.vrshell E [SurfaceTexture-115-2195-274](id:893000001ad,api:1,p:1137,c:2195) dequeueBuffer: BufferQueue has been abandoned
2025-07-23 16:42:12.288 1137-1137 CompositionEngine surfaceflinger E ANativeWindow::dequeueBuffer failed for display [com.oculus.android_panel_app.AndroidPanelLayer-com.oculus.systemux-mr_ed] with error: -19
2025-07-23 16:42:12.288 1137-1137 CompositionEngine surfaceflinger W Dequeuing buffer for display [com.oculus.android_panel_app.AndroidPanelLayer-com.oculus.systemux-mr_ed] failed, bailing out of client composition for this frame
2025-07-23 16:42:12.295 21666-21798 BitrateCalculator com...pany.NativeRenderTextureUnity D accumulate (0xb4000077dabf1690): 522 22365833810
```

Thanks,

Solved · LuffyYu · 5 months ago · Start Member · 84 Views · 0 likes · 2 Comments
Player won't move when trying to force teleport via script

Hello everyone, I'm currently stuck on implementing a player teleportation feature for a Meta Quest 3 application in Unity and would greatly appreciate any help from experienced developers.

What I want to achieve: I want to control the player's position in the VR space. Specifically, I need two main functions: prohibit player movement at certain times, and programmatically force the player to move to a specific coordinate and orientation (a "teleport").

The problem: I'm unsure of the best practice to achieve this. My attempts have led to issues like the player's position not updating correctly or conflicts with the CharacterController. I want to know if there is a standard, reliable function or method provided by the official SDKs to handle this, which correctly manages the camera offset and physics interactions.

What I've tried (from my development log): Initially, I tried to directly modify the position using a custom script, but I suspected a conflict with the CharacterController. I also tried the CharacterController.Move() method, but that did not solve the issue. My current leading theory is that the root cause is a script execution order conflict within the same frame.

Development environment:
- Unity: 2022.3.22f1
- SDK: Meta XR All-in-One SDK (v77.0.0), OpenXR Plugin (v1.10.0)
- Target platform: Meta Quest 3

My question: Is my understanding correct that VRChat APIs like Networking.LocalPlayer.TeleportTo and Immobilize will not function correctly, or will conflict with a standard CharacterController, in a standalone Quest environment? If so, I would be very grateful for guidance on the standard, recommended method (e.g. functions or assets) for safely and reliably controlling the player's position (both teleporting and immobilizing) via script in Quest development. (Translated by Gemini AI, JP => EN)

Solved · Tosiakix · 6 months ago · Honored Guest · 263 Views · 0 likes · 2 Comments
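A widely used pattern for both requirements (hedged: this is a generic Unity sketch, not an official Meta SDK API; `_characterController` and `_rigTransform` are assumed references) is to disable the CharacterController for the frame in which the rig transform is written, so the controller cannot overwrite or collide with the teleport, and to keep it disabled while movement should be prohibited:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: teleport the camera rig by temporarily disabling the
// CharacterController, and "immobilize" by leaving it (and any
// locomotion scripts) disabled.
public class RigTeleporter : MonoBehaviour
{
    [SerializeField] private CharacterController _characterController; // assumed reference
    [SerializeField] private Transform _rigTransform;                  // e.g. the OVRCameraRig root

    public IEnumerator Teleport(Vector3 position, Quaternion rotation)
    {
        _characterController.enabled = false;     // stop the controller fighting the write
        _rigTransform.SetPositionAndRotation(position, rotation);
        yield return null;                        // let the new pose take effect for one frame
        _characterController.enabled = true;
    }

    public void SetMovementEnabled(bool enabled)
    {
        _characterController.enabled = enabled;   // also gate any locomotion scripts here
    }
}
```

Running Teleport as a coroutine (via StartCoroutine) also sidesteps same-frame execution order conflicts, since the re-enable happens on the following frame.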
How to render UI over 3D mesh and controllers on top of UI

What I want is for the UI to be visible over all 3D geometry (ignoring depth) BUT at the same time have the line-casts and the controllers visible over that UI:

```
[ UI Panel ]
      |
[3D World Mesh]
      |
o------ [Hands or controllers]
```

With underlay, the UI Panel is not visible: it is culled by the 3D mesh. With overlay, the controllers and lines are covered by the UI. My solution, but awful for performance: using URP stacked cameras, the UI and controllers are rendered over everything. Again, performance tanks with this method, but it's exactly what I want. Possible solution: underlay transparent but ignoring depth; hard to test, as it requires making a custom shader. I can set the controllers to render queue 4000 and transparent, and they do render over everything.

Solved · EudenOne · 6 months ago · Explorer · 325 Views · 0 likes · 1 Comment
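For the "transparent but ignoring depth" idea, one commonly used trick is to override the ZTest of the built-in UI/Default shader via its `unity_GUIZTestMode` property and pick a render queue below the controllers' 4000. Hedged: `unity_GUIZTestMode` is an undocumented property of that shader, so this needs verifying against your Unity/URP version:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.UI;

// Sketch: make a UGUI canvas draw over world geometry (ZTest Always)
// while still drawing under the controllers at queue 4000.
public class UIDepthIgnorer : MonoBehaviour
{
    private void Awake()
    {
        var material = new Material(Shader.Find("UI/Default"));
        material.SetInt("unity_GUIZTestMode", (int)CompareFunction.Always);
        material.renderQueue = 3900; // above world geometry, below controllers (4000)

        foreach (var graphic in GetComponentsInChildren<Graphic>())
        {
            graphic.material = material;
        }
    }
}
```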