Recent Discussions
XR Composition Layers are invisible when the Meta XR Feature Group is enabled
I am using Unity's XR Composition Layers to render a canvas (like a floating desktop) on a Meta Quest 3. The problem I am facing is that the canvas is invisible as soon as the Meta XR Feature Group is enabled. I am using:
- Unity 6000.1.2f1
- Meta XR Core SDK 77.0.0
- XR Composition Layers 2.0.0
- OpenXR 1.14.3
- URP 17.1.0
- Unity OpenXR Meta (for passthrough) 2.1.1
The composition layers work well as long as the ‘Meta XR’ Feature Group is disabled in the XR Plugin Management settings for OpenXR. As soon as it is checked, without changing anything else, the composition layer is invisible on device. All my tests on device were done with an actual build, not via Oculus Link, as a build is where it should eventually work. The same issue was posted here on the Unity Discussions forum. Unity has confirmed that it is caused by a bug in the Meta SDK package, which appears to disable certain layers at runtime and prevents Unity's code from executing properly on the headsets. Is this already being looked into? When can an update be expected? Thank you in advance!
Posted by Mr.Watts.Quest, 8 hours ago
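
Not part of the original post: a minimal diagnostic sketch that can help compare which OpenXR features are actually active at runtime in builds with and without the Meta XR Feature Group. The class name OpenXRFeatureDump is made up here; the OpenXRSettings calls are standard Unity OpenXR APIs.
```
// Attach to any GameObject and check the device log (adb logcat) after launch.
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features;

public class OpenXRFeatureDump : MonoBehaviour
{
    void Start()
    {
        var settings = OpenXRSettings.Instance;
        if (settings == null)
        {
            Debug.LogWarning("OpenXRSettings.Instance is null - OpenXR may not be active.");
            return;
        }

        // List every OpenXR feature and whether it is enabled in this build, so runs
        // with and without the Meta XR Feature Group can be compared side by side.
        foreach (OpenXRFeature feature in settings.GetFeatures())
        {
            Debug.Log($"OpenXR feature '{feature.name}' enabled={feature.enabled}");
        }
    }
}
```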

V71 Handtracking offset with Passthrough enabled
V71 seems to introduce a hand-tracking bug: the hands in the camera feed (and thus also the rendered model) are not aligned with your real hands when passthrough is enabled. You can reproduce this e.g. in "Demeo" when you switch between AR and non-AR mode: your hands "jump" to the correct position as soon as passthrough is disabled. It could be related to the changes that eliminate the wrapping of your hands in the camera feed. Strangely, the PointerPose property of OVRHand seems not to be affected.
Posted by scheichs, 12 hours ago
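
Not from the original post: a small logging sketch to quantify the misalignment by comparing the hand's rendered root transform with OVRHand.PointerPose while passthrough is toggled. The HandOffsetLogger name is made up; it assumes an OVRHand reference assigned in the Inspector.
```
using UnityEngine;

public class HandOffsetLogger : MonoBehaviour
{
    // Assign the OVRHand component from the hand prefab in the Inspector.
    [SerializeField] private OVRHand hand;

    void Update()
    {
        if (hand == null || !hand.IsTracked || hand.PointerPose == null)
            return;

        // If passthrough introduces an offset, this distance should grow while
        // the PointerPose stays roughly where the real hand is.
        float offset = Vector3.Distance(hand.transform.position, hand.PointerPose.position);
        Debug.Log($"Hand root vs PointerPose offset: {offset:F3} m");
    }
}
```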

Unexpected Positional Tracking Issues with Meta Quest 2 over Air Link (Research Setup)
We are running a research project using a Meta Quest 2 via Air Link to stream content from a Windows PC. The setup uses a local 5 GHz network (no internet access) dedicated to the Air Link connection. The PC itself has internet access, and the Meta Quest Link app is up to date. Our application is a Unity build that has not been changed since data collection began in December 2024. We use only natural movement (e.g. no controller input) and the Guardian is disabled. For the first few months, everything worked reliably. However, for the past ~10 weeks, we've observed increasingly frequent issues with positional tracking. Participants will suddenly "jump" forward or backward by several decimeters, sometimes rotate abruptly, or experience vertical position shifts of up to 80 cm. No physical changes were made to the room or environment. The issue persists across both the original and a newly purchased headset. Since I’ve ruled out the network, room layout, and application itself, I suspect the issue may be caused by recent changes in Air Link or the Meta Quest Link app. Has anyone encountered similar problems in recent months?
Posted by ZimmerJ, 17 hours ago
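
Not from the original post: a per-frame jump detector sketch that timestamps sudden head-position discontinuities in the session log, which may help correlate the glitches with Air Link events. The class name and 0.25 m threshold are assumptions; the XR InputDevices calls are standard Unity APIs.
```
using UnityEngine;
using UnityEngine.XR;

public class TrackingJumpDetector : MonoBehaviour
{
    // 0.25 m per frame is far more than natural head movement at 72-120 Hz.
    [SerializeField] private float jumpThresholdMeters = 0.25f;

    private Vector3 _lastPosition;
    private bool _hasLastPosition;

    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (!head.isValid || !head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position))
            return;

        float delta = _hasLastPosition ? Vector3.Distance(position, _lastPosition) : 0f;
        if (delta > jumpThresholdMeters)
        {
            Debug.LogWarning($"Tracking jump of {delta:F2} m at t={Time.realtimeSinceStartup:F2}s");
        }

        _lastPosition = position;
        _hasLastPosition = true;
    }
}
```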

OpenXR HMD Track Input Error about the meta package of Unity
Phenomenon:
- After resetting the view and launching the game, the splash image is generated with an incorrect Transform, then automatically moves to the center of the view.
- Real-time inspection of XRHMD's centerEyePosition and centerEyeRotation in the Input System shows abnormal raw Vector3 and raw Quaternion values.
- When resetting the view in-game via the Meta button, XRHMD's centerEyePosition and centerEyeRotation in the Input System return to normal values.
Devices and versions:
- Consistently reproducible on Quest 2, Quest 3S, and Quest 3. (Suspected to be related to Meta device updates, as the April build did not exhibit this issue initially, but now the same problem occurs even when redownloading that build.)
Details:
- This offset error is not random: the same offset occurs every time the game is launched. (Resetting the view before launching the game or resetting in-game still carries this offset to the next session, and even an adb uninstall does not resolve it.)
- The issue has not been observed in other games, but we have exhausted our troubleshooting directions and cannot determine why this is happening.
- The bug cannot be reproduced when using PC streaming, which makes debugging extremely challenging. However, the issue is 100% reproducible when launching the APK directly on Meta devices.
We would greatly appreciate any potential solutions!
Posted by fxd.look, 20 hours ago
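
Not from the original post: a minimal sketch that dumps the raw center-eye pose reported through the Input System, so a bad launch can be compared against the values seen after an in-game view reset. The class name is made up; XRHMD and its controls are standard Input System XR APIs.
```
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

public class CenterEyePoseLogger : MonoBehaviour
{
    void Update()
    {
        var hmd = InputSystem.GetDevice<XRHMD>();
        if (hmd == null)
            return;

        // These are the same values the post describes as abnormal at launch.
        Vector3 position = hmd.centerEyePosition.ReadValue();
        Quaternion rotation = hmd.centerEyeRotation.ReadValue();
        Debug.Log($"centerEyePosition={position} centerEyeRotation={rotation.eulerAngles}");
    }
}
```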

Meta XR Core SDK v74.0.2 Build Errors for PCVR
We have a PCVR version of our app and need to build for Windows Standalone, but when we try to build with the Meta XR Core SDK installed we get the following errors:
```
Library\PackageCache\com.meta.xr.sdk.core@2d7964560fb1\Scripts\EnvironmentDepth\DepthProviderOpenXR.cs(31,38): error CS0234: The type or namespace name 'Meta' does not exist in the namespace 'UnityEngine.XR.OpenXR.Features' (are you missing an assembly reference?)
Library\PackageCache\com.meta.xr.sdk.core@2d7964560fb1\Scripts\EnvironmentDepth\DepthProviderOpenXR.cs(40,26): error CS0246: The type or namespace name 'MetaOpenXROcclusionSubsystem' could not be found (are you missing a using directive or an assembly reference?)
```
It'd be great to be able to support this use case, thanks. By the way, I did some quick edits, embedded this package locally, and it was fixed with the modifications described here: https://github.com/oculus-samples/Unity-DepthAPI/issues/78
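
The linked issue contains the actual modifications. As a general illustration only (not the exact patch), references to types that come from an optional package are usually fenced off behind a scripting define supplied by a "Version Defines" entry on the owning assembly definition. The META_OPENXR_PRESENT symbol and the example class below are hypothetical.
```
// Only the using and the first branch compile when the define is set, i.e. when the
// Meta OpenXR support package is installed; other configurations fall through safely.
#if META_OPENXR_PRESENT
using UnityEngine.XR.OpenXR.Features.Meta;
#endif

namespace Example.EnvironmentDepth
{
    internal static class DepthProviderGuardExample
    {
        public static bool TryDescribeOcclusionSupport(out string description)
        {
#if META_OPENXR_PRESENT
            // Compiled only when the Meta OpenXR types exist in the project.
            description = "Meta OpenXR occlusion types are available.";
            return true;
#else
            // Compiled on configurations (e.g. Windows Standalone without the
            // package) where the Meta OpenXR types do not exist.
            description = "Meta OpenXR occlusion types are unavailable in this build.";
            return false;
#endif
        }
    }
}
```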

MRUK issues after last update, please help!
Hello, we are developing an app in Unity 3D that, in one scene, uses MRUK. In passthrough mode, the app places four screens on the walls of the room and another object in the center. We are currently using the Meta library version 74. Since the recent OS update, the app no longer works correctly. When entering the room, the app prompts us to update the room (controller required). If we accept, it completes the update but the scene remains empty. If we cancel, the screen turns black, an alarm-like sound starts playing, and we are forced to reboot the device. Could you please help us understand what's going wrong or let us know if there's a compatibility issue with the latest OS? We have a demo tomorrow... 😐 Thank you in advance. Best regards,
Posted by massimomagrinilucca, 2 days ago
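
Not from the original post: a sanity-check sketch, assuming MRUK's scene-loaded callback and current-room/anchor accessors, that logs whether any room data actually arrives after the room update prompt. This helps separate "scene never loads" from "scene loads but placement fails". The class name is made up.
```
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class MrukSceneSanityCheck : MonoBehaviour
{
    void Start()
    {
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        if (room == null)
        {
            Debug.LogWarning("MRUK scene loaded, but no current room was returned.");
            return;
        }

        // If this prints 0 anchors, the placement logic has nothing to attach to.
        Debug.Log($"MRUK room loaded with {room.Anchors.Count} anchors.");
    }
}
```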

Forcing remote Avatar 2 hands to follow their transform.
The skinning mechanism of Meta Avatars 2 doesn't seem to play well with precise hand tracking, and it seems that quite a bit of function has been sacrificed for form. I get why: forcing the hands to follow the player's exact location can result in freaky-looking avatars, especially with standing avatars driven by seated players. But some multiplayer gameplay simply requires an accurate representation of the player's hands, like apps with grabbable objects. It breaks a lot of immersion when you see another player handling an object while the interacting hand is in a completely different location than the object. I'm surprised there doesn't seem to be a quick fix to force the skinned hands to follow their true position, despite the risk of it looking unnatural.
What I've tried:
- Tweaking arm blending in OvrAvatarAnimationBehavior didn't do anything.
- Setting the anchoring state to AnchorToHeadset or AnchorToHeadsetDynamicCrouching fixes it, but causes the feet to clip into the ground or hover above it; dynamic crouching only works in the Editor and in MR (not VR), head rotations seem frozen, and chest rotations are glitchy.
- GPU skinning rather than Computed.
- All kinds of hacky changes in the SDK's code, which is never good even if it works.
Has anyone found any doable fix?
Posted by Evrience, 3 days ago
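
Not from the original post, and not an Avatars SDK API: a sketch of the kind of hacky workaround the post alludes to, assuming the rendered avatar exposes its wrist bone as a regular Unity Transform (which is not the case with the SDK's compute skinning). It snaps the skinned wrist back onto the tracked hand anchor after animation has run, at the cost of a stretched or detached arm.
```
using UnityEngine;

public class ForceHandToTrackedPose : MonoBehaviour
{
    [SerializeField] private Transform skinnedWristBone;   // wrist bone of the rendered avatar (assumed accessible)
    [SerializeField] private Transform trackedHandAnchor;  // e.g. the hand anchor under OVRCameraRig

    // LateUpdate runs after animation/IK, so this override wins over the avatar's own posing.
    void LateUpdate()
    {
        if (skinnedWristBone == null || trackedHandAnchor == null)
            return;

        skinnedWristBone.SetPositionAndRotation(trackedHandAnchor.position, trackedHandAnchor.rotation);
    }
}
```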

How can I run a Unity app as a background process while I play another game on the Meta Quest 3?
I'm trying to develop a Meta Quest 3 application that runs in the background so I can record controller data (IMU and button actions) while I'm playing a PCVR game like SkyrimVR. My headset will be connected to my PC via Oculus Link.
Posted by BrandonJ, 3 days ago
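
Not from the original post: a recording sketch that samples controller pose, velocity, and button state each frame and appends it to a CSV under Application.persistentDataPath. Whether the app keeps receiving this data while another (PCVR) title is in the foreground is exactly the open question in the post; the class name and file name are assumptions.
```
using System.IO;
using UnityEngine;
using UnityEngine.XR;

public class ControllerRecorder : MonoBehaviour
{
    private StreamWriter _writer;

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, "controller_log.csv");
        _writer = new StreamWriter(path, append: true);
        _writer.WriteLine("time,px,py,pz,vx,vy,vz,avx,avy,avz,trigger,primary");
    }

    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!right.isValid)
            return;

        right.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 p);
        right.TryGetFeatureValue(CommonUsages.deviceVelocity, out Vector3 v);
        right.TryGetFeatureValue(CommonUsages.deviceAngularVelocity, out Vector3 av);
        right.TryGetFeatureValue(CommonUsages.triggerButton, out bool trigger);
        right.TryGetFeatureValue(CommonUsages.primaryButton, out bool primary);

        _writer.WriteLine($"{Time.realtimeSinceStartup:F3},{p.x},{p.y},{p.z},{v.x},{v.y},{v.z},{av.x},{av.y},{av.z},{trigger},{primary}");
    }

    void OnDestroy()
    {
        _writer?.Dispose();
    }
}
```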

The Meta Full body avatars are kind of a nightmare to work with
This post is not a question so much as general feedback after working with the SDK. I want to gauge whether other people have had a similar experience. I work on a multi-user app that was originally built with half-body avatars in mind, but with Meta announcing that they would discontinue support for them, we had no choice but to migrate to full-body. Our app is very versatile and covers a lot of different movement and sitting types, so solving for each individually is not ideal.

What I expected: All current VR hardware (with the exception of additional trackers) gives us three fundamental inputs on the position of the user: head + both hands. Logically, all of the avatar systems I have worked with up until this point take that into account. They take a head + two hand positions as input and, voila, the avatar is now in the right place in relation to the floor. In the case of a sitting avatar, you would probably input another location that defines where the avatar's butt should be to sit in the chair. I fully expected the full-body avatars to work like this. Having a body and legs adds the extra difficulty of using IK to animate those other parts, but I figured Meta had some good algorithms for figuring that out and would provide a simple way of giving inputs and getting an animated avatar as a result.

What I got: Meta decided that bifurcating the first- and third-person avatar animation and restricting the third-person avatar to realistic-looking positions was more important than the reliability of the position data. The entire system is built around displaying different head and hand positions in third person compared to first person. Crouching does not just match the avatar's head to the position of the user; it measures how far down your head is and then plays a crouching animation to a level that mostly lines up. Hand positions are then placed relative to the head to maintain a normal body structure. This causes a tornado of problems if you want to do anything that does not come pre-packaged in the SDK. Even just allowing the user to sit in a chair and freely look around 360 degrees is not included. The user pointing at a specific spot in the environment, or at a specific place on another user's body, becomes this confusing multiverse conversation about everyone seeing different things. Meta's hybrid of IK and normal rigged animations is a nightmare if you want to accommodate more than one thing. Our app allows users to switch between standing, walking, sitting on the floor, or sitting in various chair sizes and shapes seamlessly. We also have a lot of objects that the player can grab and move, which track their position independently and synchronize it over the network. The provided sitting behavior that you can find in the LegsNetworkLoopback scene is totally unusable for me. All of the movements of the user's head are clamped to stay still in third person, meaning that lots of body language is removed, and objects they may be interacting with appear to float around unconnected to their hands because the first-person hand position that the local user sees gets totally altered by the sitting animation. I had to make my own alteration to all of the crouching animations to get a more versatile sitting animation in which the user could actually move their head and be seen doing so. One of the things we rely on to make all of our different seating scenarios work is that we apply offsets within the rig to raise or lower the user and get their head to end up in the right place.
You can remain sitting in real life, but we will adjust where your head ought to end up in relation to the floor by shifting your play space around. This totally wreaks havoc on the Meta system. Just making it so that the user could transition between virtually sitting and standing without changing position in real life, and KEEP THE AVATAR HEAD IN SYNC with their actual head position, was a large undertaking. I think one of the biggest problems is that the rig that applies animations to the avatar does not even match up with the position of the rig in the scene. You have this totally invisible rig off in the middle of nowhere that defines what the avatar will look like for others but does not actually line up with anything in the local scene. There are just so many scenarios in which the third-person rendering of the avatar deviates greatly from what the user in first person is actually doing that networking a sensible world, in which all users are experiencing the same thing, becomes a struggle. We used to have high fives that worked pretty well, and now everyone's hands render slightly differently in third person and it ruins the feature. Meta has abandoned the one idea that I would have thought to be the most obvious critical feature: in first and third person, the head and hands of the avatar should always match the inputs given by the user.

TLDR: Zuckerberg was clearly scared by everyone making fun of the avatars before, so Meta ended up sacrificing absolutely everything to put large restrictions on the movement of the third-person avatar to keep it from looking silly. For Horizon that's great; for a bunch of apps that were built on a different system and would like to be able to provide the same inputs, it's a nightmare.

My request: please add a normal IK system, where all I do is tell the avatar where my head, hands, and butt should end up and it will do the rest. I understand that I'll get some funny VRChat-looking stiff movement or stretched limbs from this, but at least the position data will be reliable and I won't have to figure out this complicated puppeteering system that only renders for other users.
Posted by GlimpseGroupDev, 4 days ago
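
Not from the original post, and not an existing Meta Avatars API: a minimal sketch of the input contract the post is asking for, where the app supplies world-space targets and the avatar system owns everything else (IK, leg animation, body shape). All names below are hypothetical.
```
using UnityEngine;

public struct AvatarPoseTargets
{
    public Pose Head;          // where the avatar's head must end up, world space
    public Pose LeftHand;      // tracked left hand, world space
    public Pose RightHand;     // tracked right hand, world space
    public Vector3? SeatPoint; // optional: where the hips should rest when seated
}

public interface ISimpleAvatarDriver
{
    // The contract the post describes: head and hands always match these targets
    // exactly, in both first and third person, even if the result looks stiff.
    void ApplyTargets(in AvatarPoseTargets targets);
}
```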

Yellow overlay over certain scene objects
Hi there! We've encountered a strange bug in our Unity build where some scene objects appear with a yellow overlay when running on Meta Quest 3 or 3S devices. This issue does not occur on the Meta Quest 2. The bug appears intermittently: sometimes, after restarting the application, the objects display the correct colors, while other times the yellow overlay returns. What do you think could be causing this? Please let us know if you need more information.
- Unity version: 6000.0.51f1
- SDK: OpenXR v1.14.3
- Devices: reproducible on Meta Quest 3/3S, not reproducible on Meta Quest 2
- Graphics API: OpenGL ES 3
Posted by BidOnGamesStudio, 4 days ago