Recent Discussions
Unexpected Positional Tracking Issues with Meta Quest 2 over Air Link (Research Setup)
We are running a research project using a Meta Quest 2 via Air Link to stream content from a Windows PC. The setup uses a local 5 GHz network (no internet access) dedicated to the Air Link connection. The PC itself has internet access, and the Meta Quest Link app is up to date. Our application is a Unity build that has not changed since data collection began in December 2024. We use only natural movement (no controller input), and the Guardian is disabled.

For the first few months, everything worked reliably. For the past ~10 weeks, however, we have observed increasingly frequent positional tracking issues: participants suddenly "jump" forward or backward by several decimeters, sometimes rotate abruptly, or experience vertical position shifts of up to 80 cm. No physical changes were made to the room or environment, and the issue persists across both the original and a newly purchased headset.

Since I have ruled out the network, the room layout, and the application itself, I suspect the issue may be caused by recent changes in Air Link or the Meta Quest Link app. Has anyone encountered similar problems in recent months?

ZimmerJ · 7 hours ago
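Not a fix, but a way to quantify the jumps: a minimal diagnostic sketch that logs frame-to-frame discontinuities of the tracked camera, so glitches can be timestamped and correlated with Air Link events. The 0.1 m-per-frame threshold is an assumption; tune it to your setup.

```csharp
using UnityEngine;

// Diagnostic sketch: logs sudden per-frame jumps of the tracked camera
// so tracking glitches can be timestamped and counted.
// Attach to any GameObject; assumes Camera.main is the HMD camera.
public class TrackingJumpLogger : MonoBehaviour
{
    [Tooltip("Per-frame displacement (meters) treated as a tracking jump. Assumed value; tune as needed.")]
    public float jumpThreshold = 0.1f;

    private Vector3 _lastPosition;
    private bool _initialized;

    void LateUpdate()
    {
        var cam = Camera.main;
        if (cam == null) return;

        Vector3 pos = cam.transform.position;
        if (_initialized)
        {
            float delta = Vector3.Distance(pos, _lastPosition);
            if (delta > jumpThreshold)
            {
                Debug.LogWarning($"[TrackingJumpLogger] {delta:F2} m jump at t={Time.realtimeSinceStartup:F2}s " +
                                 $"(from {_lastPosition} to {pos})");
            }
        }
        _lastPosition = pos;
        _initialized = true;
    }
}
```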
Forcing remote Avatar 2 hands to follow their transform

The skinning mechanism of Meta Avatars 2 doesn't seem to play well with precise hand tracking; quite a bit of function appears to have been sacrificed for form. I get why: forcing the hands to follow the player's exact location can result in freaky-looking avatars, especially with standing avatars driven by seated players. But some multiplayer gameplay simply requires an accurate representation of the player's hands, such as apps with grabbable objects. It breaks a lot of immersion when you see another player handling an object while the interacting hand is in a completely different location than the object. I'm surprised there doesn't seem to be a quick fix to force the skin's hands to follow their true position, despite the risk of it looking unnatural.

What I've tried:
- Tweaking the arm blending in OvrAvatarAnimationBehavior did nothing.
- Setting the anchoring state to AnchorToHeadset or AnchorToHeadsetDynamicCrouching fixes it, but causes the feet to clip into the ground or hover above it; dynamic crouching only works in the editor and in MR (not VR), head rotations seem frozen, and chest rotations are glitchy.
- GPU skinning rather than Compute skinning.
- All kinds of hacky changes in the SDK's code, which is never good even if it works.

Has anyone found a workable fix? (A brute-force sketch of the idea is below.)

Evrience · 17 hours ago
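For illustration, the brute-force version of "force the hands to follow their true position" would look something like the sketch below: after the avatar system has posed the skeleton for the frame, overwrite the wrist joints with the tracked hand transforms. The joint names here are hypothetical (inspect your avatar hierarchy for the real ones), this will stretch the forearms, and with the Avatars SDK's custom GPU/compute skinning the renderer may not read these transforms at all, which may be exactly the wall the post describes.

```csharp
using UnityEngine;

// Hypothetical workaround sketch: after animation has run, stomp the
// wrist joints with the tracked hand transforms. Joint lookup by name
// is an assumption; the actual names depend on the avatar rig.
public class ForceHandPose : MonoBehaviour
{
    public Transform leftHandTarget;   // tracked left-hand anchor (e.g. from the camera rig)
    public Transform rightHandTarget;  // tracked right-hand anchor

    private Transform _leftWrist;
    private Transform _rightWrist;

    void Start()
    {
        // Hypothetical joint names; inspect your avatar hierarchy to find the real ones.
        _leftWrist = FindDeep(transform, "LeftHandWrist");
        _rightWrist = FindDeep(transform, "RightHandWrist");
    }

    // LateUpdate runs after animation and CPU-side posing for the frame.
    void LateUpdate()
    {
        if (_leftWrist != null && leftHandTarget != null)
            _leftWrist.SetPositionAndRotation(leftHandTarget.position, leftHandTarget.rotation);
        if (_rightWrist != null && rightHandTarget != null)
            _rightWrist.SetPositionAndRotation(rightHandTarget.position, rightHandTarget.rotation);
    }

    private static Transform FindDeep(Transform root, string name)
    {
        foreach (var t in root.GetComponentsInChildren<Transform>(true))
            if (t.name == name) return t;
        return null;
    }
}
```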
How can I run a Unity app as a background process while I play another game on the Meta Quest 3?

I'm trying to develop a Meta Quest 3 application that runs in the background so I can record controller data (IMU and button actions) while I'm playing a PCVR game like SkyrimVR. My headset will be connected to my PC with an Oculus Link cable.

BrandonJ · 2 days ago
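The recording half is straightforward; whether a standalone Quest app keeps running while the headset is in Link mode is the open question. As a starting point, a minimal sketch of sampling controller pose, IMU-style velocities, and button state each frame via OVRInput (assuming the Meta XR SDK is in the project):

```csharp
using UnityEngine;

// Minimal recording sketch using OVRInput (Meta XR SDK assumed).
// Samples right-controller pose, angular velocity, acceleration and
// button state once per frame and logs them; a real recorder would
// write to a file instead of the console.
public class ControllerRecorder : MonoBehaviour
{
    void Update()
    {
        const OVRInput.Controller c = OVRInput.Controller.RTouch;

        Vector3 pos = OVRInput.GetLocalControllerPosition(c);
        Quaternion rot = OVRInput.GetLocalControllerRotation(c);
        Vector3 angVel = OVRInput.GetLocalControllerAngularVelocity(c);
        Vector3 accel = OVRInput.GetLocalControllerAcceleration(c);
        bool aPressed = OVRInput.Get(OVRInput.Button.One, c);

        Debug.Log($"{Time.time:F3}; pos={pos}; rot={rot.eulerAngles}; " +
                  $"angVel={angVel}; accel={accel}; A={aPressed}");
    }
}
```

Note the caveat: when the headset switches into Link, the standalone Android app is typically suspended, so on-device background capture may simply not be possible; capturing on the PC side through the Meta PC runtime may be the more realistic route.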
XR Composition Layers are invisible when the Meta XR feature group is enabled

I am using Unity's XR Composition Layers to render a canvas (like a floating desktop) on a Meta Quest 3. The problem I am facing is that the canvas is invisible as soon as the Meta XR feature group is enabled.

I am using:
- Unity 6000.1.2f1
- Meta XR Core SDK 77.0.0
- XR Composition Layers 2.0.0
- OpenXR 1.14.3
- URP 17.1.0
- Unity OpenXR Meta (for passthrough) 2.1.1

The composition layers work well as long as the 'Meta XR' feature group is disabled in the XR Plugin Management settings for OpenXR. As soon as it is checked, without changing anything else, the composition layer is invisible on device. All my tests on device were done with an actual build, not via Oculus Link, as a build is where it should eventually work.

The same issue was posted on the Unity Discussions forum. Unity has confirmed that it is caused by a bug in the Meta SDK package, which appears to disable certain layers at runtime and prevents Unity's code from executing properly on the headsets. Is this already being looked into? When can an update be expected? Thank you in advance!

Mr.Watts.Quest · 2 days ago
The Meta full-body avatars are kind of a nightmare to work with

This post is not a question so much as general feedback after working with the SDK; I want to gauge whether other people have had a similar experience. I work on a multi-user app that was originally built with half-body avatars in mind, but with Meta announcing that support for them would be discontinued, we had no choice but to migrate to full body. Our app is very versatile and covers a lot of different movement and sitting types, so solving for each individually is not ideal.

What I expected: All current VR hardware (with the exception of additional trackers) gives us three fundamental inputs on the position of the user: the head and both hands. Logically, every avatar system I have worked with up to this point takes that into account: it takes a head and two hand positions as input and, voila, the avatar is in the right place in relation to the floor. In the case of a sitting avatar, you would probably input one more location that defines where the avatar's butt should be to sit in the chair. I fully expected the full-body avatars to work like this. Having a body and legs adds the extra difficulty of using IK to animate those other parts, but I figured Meta had good algorithms for that and would provide a simple way of giving inputs and getting an animated avatar as a result.

What I got: Meta decided that bifurcating the first- and third-person avatar animation, and restricting the third-person avatar to realistic poses, was more important than the reliability of the position data. The entire system is built around displaying different head and hand positions in third person compared to first person. Crouching does not simply match the avatar's head to the position of the user: it measures how far down your head is and then plays a crouching animation to a level that mostly lines up. Hand positions are then placed relative to the head to maintain a normal body structure. This causes a tornado of problems if you want to do anything that does not come pre-packaged in the SDK. Even just allowing the user to sit in a chair and freely look around 360 degrees is not included. The user pointing at a specific spot in the environment, or at a specific place on another user's body, becomes a confusing multiverse conversation about everyone seeing different things.

Meta's hybrid of IK and normal rigged animations is a nightmare if you want to accommodate more than one thing. Our app allows users to seamlessly switch between standing, walking, sitting on the floor, or sitting in chairs of various sizes and shapes. We also have a lot of objects the player can grab and move, which track their position independently and synchronize it over the network. The provided sitting behavior in the LegsNetworkLoopback scene is totally unusable for me: all movements of the user's head are clamped to stay still in third person, which removes a lot of body language, and objects they are interacting with appear to float around unconnected to their hands, because the first-person hand position the local user sees gets totally altered by the sitting animation. I had to make my own alteration to all of the crouching animations to get a more versatile sitting animation in which the user could actually move their head and be seen doing so.

One of the things we rely on to make our different seating scenarios work is applying offsets within the rig to raise or lower the user and get their head to end up in the right place. You can remain sitting in real life, but we adjust where your head ought to end up in relation to the floor by shifting your play space around. This totally wreaks havoc on the Meta system. Just making it possible for the user to transition between virtually sitting and standing without changing position in real life, and KEEP THE AVATAR HEAD IN SYNC with their actual head position, was a large undertaking.

I think one of the biggest problems is that the rig that applies animations to the avatar does not even match the position of the rig in the scene. You have this totally invisible rig off in the middle of nowhere that defines what the avatar will look like for others but does not actually line up with anything in the local scene. There are so many scenarios in which the third-person rendering of the avatar deviates greatly from what the user in first person is actually doing that networking a sensible world, in which all users experience the same thing, becomes a struggle. We used to have high fives that worked pretty well; now everyone's hands render slightly differently in third person and it ruins the feature. Meta has abandoned the one idea I would have thought was the most obvious critical feature: in first and third person, the head and hands of the avatar should always match the inputs given by the user.

TL;DR: Zuckerberg was clearly scared by everyone making fun of the avatars, so Meta ended up sacrificing absolutely everything to put large restrictions on the movement of the third-person avatar to keep it from looking silly. For Horizon that's great; for apps that were built on a different system and would like to provide the same inputs, it's a nightmare.

My request: Please add a normal IK system where all I do is tell the avatar where my head, hands, and butt should end up, and it does the rest (see the sketch below for the kind of interface I mean). I understand that I'll get some funny, VRChat-looking stiff movement or stretched limbs from this, but at least the position data will be reliable, and I won't have to fight this complicated puppeteering system that only renders for other users.

GlimpseGroupDev · 2 days ago
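For illustration only, the kind of interface being requested might look like the following. Every type and member here is hypothetical; nothing like it exists in the Meta Avatars SDK. The point is the contract: the app supplies world-space targets, and the rendered avatar matches them exactly in both first and third person.

```csharp
using UnityEngine;

// Hypothetical API sketch -- none of these types exist in the Meta
// Avatars SDK. It illustrates the requested contract: the app supplies
// world-space targets, and the avatar's rendered head and hands match
// them exactly in both first and third person.
public interface IDirectAvatarIK
{
    // World-space targets the solver must hit exactly.
    void SetTargets(Pose head, Pose leftHand, Pose rightHand);

    // Optional seat target; when set, the solver plants the pelvis here.
    void SetSeatTarget(Vector3? pelvisPosition);
}

// Example driver feeding tracked anchors straight into such a solver.
public class DirectAvatarDriver : MonoBehaviour
{
    public Transform headAnchor, leftHandAnchor, rightHandAnchor; // from the camera rig
    public MonoBehaviour ikSolverBehaviour; // any component implementing IDirectAvatarIK

    void LateUpdate()
    {
        var ik = ikSolverBehaviour as IDirectAvatarIK;
        if (ik == null) return;

        ik.SetTargets(
            new Pose(headAnchor.position, headAnchor.rotation),
            new Pose(leftHandAnchor.position, leftHandAnchor.rotation),
            new Pose(rightHandAnchor.position, rightHandAnchor.rotation));
    }
}
```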
Yellow overlay over certain scene objects

Hi there! We've encountered a strange bug in our Unity build where some scene objects appear with a yellow overlay when running on Meta Quest 3 or 3S devices. The issue does not occur on the Meta Quest 2. The bug appears intermittently: sometimes, after restarting the application, the objects display the correct colors, while other times the yellow overlay returns. What do you think the problem could be? Please let us know if you need more information.

- Unity version: 6000.0.51f1
- SDK: OpenXR v1.14.3
- Devices: reproducible on Meta Quest 3/3S; not reproducible on Meta Quest 2
- Graphics API: OpenGLES3

BidOnGamesStudio · 3 days ago
Player won't move when trying to force teleport via script

Hello everyone, I'm currently stuck on implementing a player teleportation feature for a Meta Quest 3 application in Unity and would greatly appreciate any help from experienced developers.

What I want to achieve: I want to control the player's position in VR space. Specifically, I need two main functions:
1. Prohibit player movement at certain times.
2. Programmatically force the player to move to a specific coordinate and orientation (a "teleport").

The problem: I'm unsure of the best practice to achieve this. My attempts have led to issues like the player's position not updating correctly, or conflicts with the CharacterController. I want to know whether there is a standard, reliable function or method provided by the official SDKs to handle this, one that correctly manages the camera offset and physics interactions.

What I've tried (from my development log): Initially, I tried to modify the position directly with a custom script, but I suspected a conflict with the CharacterController. I also tried the CharacterController.Move() method, but that did not solve the issue. My current leading theory is that the root cause is a script execution order conflict within the same frame.

Development environment:
- Unity: 2022.3.22f1
- SDK: Meta XR All-in-One SDK (v77.0.0), OpenXR Plugin (v1.10.0)
- Target platform: Meta Quest 3

My question: Is my understanding correct that VRChat APIs like Networking.LocalPlayer.TeleportTo and Immobilize will not function correctly, or will conflict with a standard CharacterController, in a standalone Quest environment? If so, I would be very grateful for guidance on the standard, recommended method (e.g., functions or assets) for safely and reliably controlling the player's position (both teleporting and immobilizing) via script in Quest development.

(Translated by Gemini AI, JP => EN.)

[Solved] · Tosiakix · 3 days ago
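One common pattern (a sketch of a widely used approach, not an official SDK API): disable the CharacterController for the frame in which you warp, write the rig root's transform directly, sync the physics scene, then re-enable the controller. The field names here are assumptions; playerRoot would be whatever your player hierarchy's root is (e.g. the parent of the camera rig).

```csharp
using UnityEngine;

// Sketch of a common teleport pattern for a CharacterController-based rig.
// Disabling the controller while writing the transform avoids the
// controller re-asserting its own position in the same frame.
public class PlayerTeleporter : MonoBehaviour
{
    public CharacterController characterController; // on the player root
    public Transform playerRoot;                    // e.g. parent of the camera rig
    public bool movementLocked;                     // gate your locomotion code on this

    public void TeleportTo(Vector3 position, Quaternion rotation)
    {
        characterController.enabled = false;        // stop the controller fighting the warp
        playerRoot.SetPositionAndRotation(position, rotation);
        Physics.SyncTransforms();                   // push the new pose to the physics scene
        characterController.enabled = true;
    }
}
```

For "immobilize", the usual approach is simply to skip your locomotion code (and any CharacterController.Move calls) while a flag like movementLocked is set. If the headset offset matters, teleport so that the camera rather than the root lands on the target, by subtracting the camera's local offset from the target position first.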
OneGrabRotateTransformer enquiry

Does the Meta SDK v77 still support OneGrabRotateTransformer? If yes, how should I properly reset it? In my case, when I rotate a door using OneGrabRotateTransformer and leave it open, then click a 'Restart' button that resets the door's rotation to its initial state, the door no longer behaves correctly when I try to interact with it again: it doesn't rotate the same way it did the first time. How can I fully reset the transformer so the door can be grabbed and rotated as expected?

anujintellify · 3 days ago
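Untested, but one thing worth trying: OneGrabRotateTransformer implements the Interaction SDK's ITransformer interface, which captures its reference state in Initialize(IGrabbable). Re-running initialization after resetting the door's rotation may clear the stale constraint state. A sketch, assuming the v77 interface still matches earlier Interaction SDK versions:

```csharp
using UnityEngine;
using Oculus.Interaction;

// Sketch: reset a door's rotation and re-initialize its transformer so
// the grab constraints are recomputed from the restored pose. Assumes
// ITransformer.Initialize(IGrabbable) from earlier Interaction SDK
// versions is unchanged in v77.
public class DoorResetter : MonoBehaviour
{
    public Grabbable grabbable;                       // the door's Grabbable
    public OneGrabRotateTransformer rotateTransformer;

    private Quaternion _initialLocalRotation;

    void Awake()
    {
        _initialLocalRotation = grabbable.transform.localRotation;
    }

    // Hook this up to the 'Restart' button.
    public void ResetDoor()
    {
        grabbable.transform.localRotation = _initialLocalRotation;
        // Re-capture the transformer's reference state from the reset pose.
        rotateTransformer.Initialize(grabbable);
    }
}
```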
Passthrough on Windows desktop VR

I'm finding loads of guides on how to get passthrough working in a standalone Android build, but what about when running on Windows and connecting via a Link cable? Is this possible? I haven't been able to get it working with either OpenXR Meta or the Meta XR SDK. I understand there might be some security restriction preventing the headset from streaming its camera output to a PC; in that case, can I at least call some API to put the headset into passthrough mode, so that only the user in the headset sees through the cameras and the data is not streamed to the PC? I would like to programmatically let the user see through the cameras for a few seconds, and then put them back into the Unity scene. Best, Fred

FredTA · 5 days ago
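For what it's worth: recent Meta PC runtimes do support passthrough over Link (it may need to be enabled as a beta/developer feature in the desktop app), while the Unity OpenXR Meta package, as far as I know, targets standalone Android builds only. If the Meta XR SDK path works on your setup, the toggle itself could be as simple as enabling and disabling an OVRPassthroughLayer for a few seconds. A sketch, assuming an underlay passthrough layer and that Insight Passthrough is enabled on the OVRManager:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: briefly switch the user to passthrough, then back to the scene.
// Assumes an OVRPassthroughLayer (Meta XR SDK) exists in the scene, set
// to Underlay, and that passthrough support is enabled on the OVRManager.
public class PassthroughPeek : MonoBehaviour
{
    public OVRPassthroughLayer passthroughLayer;
    public Camera xrCamera;
    public float peekSeconds = 3f;

    private Color _savedBackground;

    public void Peek() => StartCoroutine(PeekRoutine());

    private IEnumerator PeekRoutine()
    {
        _savedBackground = xrCamera.backgroundColor;
        xrCamera.clearFlags = CameraClearFlags.SolidColor;
        xrCamera.backgroundColor = Color.clear;   // let the underlay show through
        passthroughLayer.enabled = true;

        yield return new WaitForSeconds(peekSeconds);

        passthroughLayer.enabled = false;
        xrCamera.backgroundColor = _savedBackground;
    }
}
```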
MRUK 77.0 LoadSceneFromDevice() doesn't work

I just updated my MRUK package from 69 to 77, and now MRUK.LoadSceneFromDevice() doesn't work. Instead it logs errors:

Error Unity [XR] [OpenXR Meta] xrLocateSpace failed with result XR_ERROR_HANDLE_INVALID
Error Unity [XR] [OpenXR Meta] [AnchorProvider] xrLocateSpace request failed with error: XR_ERROR_HANDLE_INVALID

I am using the Meta OpenXR plugin. Note that the MRUK MonoBehaviour's "Load Scene On Startup" checkbox does still work. But the problem with this fallback is that after my app launches Space Setup and the user completes it, then returns to my app, the newly created room is not available, so the user has to restart my app.

darkveins_ · 7 days ago
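For reference, the call pattern in question looks roughly like the sketch below. The exact return type of LoadSceneFromDevice has shifted between MRUK versions, so this simply awaits the call and checks the room list afterwards; treat it as an assumption-laden sketch rather than the canonical v77 usage.

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit;

// Sketch: manually (re)load scene data from the device after the user
// returns from Space Setup, instead of relying on Load Scene On Startup.
public class SceneReloader : MonoBehaviour
{
    public async void ReloadScene()
    {
        // MRUK.Instance is the singleton created by the MRUK component.
        await MRUK.Instance.LoadSceneFromDevice();

        int roomCount = MRUK.Instance.Rooms.Count;
        Debug.Log($"[SceneReloader] Loaded {roomCount} room(s) from device.");
    }
}
```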