Black screen after removing and reattaching headset, display on but no image rendered (2.1 & 2.3)
Overview

After taking the headset off and putting it back on, the display stays completely black. Audio continues playing and controllers can still interact with UI elements, confirming the session is active and the display is on. This is not an intermittent glitch: it occurs frequently and has been reproduced across multiple Quest 3 units on both Horizon OS 2.1 and 2.3.

Affected apps
- First Encounters (system app)
- Horizon Store apps
- Custom / developer apps

Confirmed on
- Multiple Meta Quest 3 units
- Horizon OS 2.3 (all units affected)
- Horizon OS 2.1 (some units affected)
- Not present on v85

Steps to reproduce
1. Put on a Meta Quest 3 running Horizon OS 2.1 or 2.3
2. Open any app (system, store, or developer build)
3. Remove the headset
4. Put the headset back on
5. The screen remains black, while audio and controller input stay active

Workarounds found
- Hand tracking: moving a hand in front of the headset cameras triggers hand detection, which restores the display.
- Controllers: pressing the Meta home button restores the display. Simply holding controllers in view of the cameras is not sufficient.

Developer impact

This issue directly breaks automated onboarding flows. Our app is designed so onboarding requires no manual intervention, but users are now forced to press the home button or wave their hands to recover from the black screen, which defeats the purpose of a seamless first-run experience.

Version & unit breakdown

OS version       | Units tested | Result
v85 (pre-2.x)    | 2            | No issue
Horizon OS 2.1   | 3            | Intermittent; not all units affected
Horizon OS 2.3   | 3            | All units affected

If more info is needed, let me know.

Sincerely,
Raymond
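For anyone trying to gather more data on this, a minimal Unity sketch can timestamp the mount/unmount transitions so they can be lined up against logcat output from the compositor. This assumes the Meta XR Core SDK, whose `OVRManager` exposes `HMDMounted`/`HMDUnmounted` events; it only instruments the problem and does not fix the OS-level rendering issue.

```csharp
using UnityEngine;

// Sketch: log headset mount/unmount transitions to correlate with the
// black-screen window. Assumes the Meta XR Core SDK (OVRManager) is in
// the project; attach to a persistent GameObject in the startup scene.
public class MountWatcher : MonoBehaviour
{
    void OnEnable()
    {
        OVRManager.HMDMounted += OnMounted;
        OVRManager.HMDUnmounted += OnUnmounted;
    }

    void OnDisable()
    {
        OVRManager.HMDMounted -= OnMounted;
        OVRManager.HMDUnmounted -= OnUnmounted;
    }

    void OnMounted()
    {
        // If the display stays black after this fires, the session is
        // alive but nothing is being presented, matching the report above.
        Debug.Log($"[MountWatcher] HMD mounted at {Time.realtimeSinceStartup:F2}s");
    }

    void OnUnmounted()
    {
        Debug.Log($"[MountWatcher] HMD unmounted at {Time.realtimeSinceStartup:F2}s");
    }
}
```

The class name and logging format are illustrative; the two `OVRManager` events are the only SDK surface the sketch relies on.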
[BeatGesture] A Hand Tracking VR Rhythm Game

Hi everyone, I've been working on a small VR rhythm game called BeatGesture. It uses hand tracking and gestures instead of controllers: notes are matched on timing, hand shape, and direction. If you've played other VR rhythm games, the core structure might feel familiar, but I wanted to explore what happens when you go all-in on hands as the primary input. The goal was to push gesture-based interaction further rather than relying on controller-driven mechanics. It feels less like hitting notes and more like performing choreography. If you're interested, here's the store page: https://www.meta.com/en-gb/experiences/beatgesture/35694765943455719/ Thanks!
Meta Quest Unity Real Hands Building Block not showing real hands
Hi all! I'm somewhat new to VR development, especially in mixed reality. I am trying to use Meta's Real Hands building block, but I can't seem to get it to work. I have a very basic scene with some of the fundamental building blocks (camera rig, passthrough, passthrough camera access, interaction rig), along with the Real Hands building block and a single cube. When I build the project to my Quest (Meta Quest 3) and move my hands in front of the cube, I can only see the virtual hands; the occlusion does not work to show my real hands (i.e. it behaves the same as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity Version: 6000.3.4f1
Meta Quest Packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to replicate:
1. Create a new empty scene
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interactions Rig, Real Hands
3. Add a cube at (0, 0, 3)
4. Build the project and deploy to the Quest
5. Wave your hands in front of the cube: only virtual hands are visible, not real hands

Hands only not working on Unity
On Unity 6, with v85 of the Meta SDK, I set "Hand Tracking Support" to the "Hands Only" mode, so controllers aren't supposed to be available in the build. However, controllers are still usable in my app. For comparison, I tried the "Controllers Only" mode: it correctly disables hand tracking in my app, and a window prompting me to use controllers appears when I launch it. Why doesn't the "Hands Only" mode enforce the same restriction on controllers?
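One thing worth checking is what actually lands in the generated AndroidManifest.xml of the built APK. Based on Meta's documented manifest keys, a hands-only build is expected to contain entries along these lines (a sketch for comparison against your own build output; whether `required="true"` also blocks controller input at runtime, rather than only gating store/device requirements, is exactly the open question here):

```xml
<!-- Runtime permission for hand tracking -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />

<!-- "Hands Only" should emit required="true"; if your built manifest says
     required="false", the SDK is still treating hands as optional and
     controllers remain available alongside them. -->
<uses-feature
    android:name="oculus.software.handtracking"
    android:required="true" />
```

If the built manifest already shows `required="true"` and controllers still work, that would point at runtime behavior rather than build configuration.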
Record and replay real hand pose at runtime (Meta Interaction SDK – Unreal)

Hi everyone, I'm currently working with the Meta Interaction SDK in Unreal Engine (UE 5.6) and using hand tracking only (no controllers). I'm using the ISDK Hand Rig Component for both hands. What I'm trying to achieve: capture the user's real hand pose at runtime, save that pose, and then reapply (replay) that exact pose later on command. So my questions are:
1. Is there a built-in way in the Meta Interaction SDK to record and replay hand poses?
2. What's the best way to temporarily override hand tracking and apply a custom pose?

Any guidance, suggestions, or pointers would be really helpful. Thanks.

Is there a way to make the player not see their own avatar
Hello, I am building a Unity project with the Meta All-In-One SDK, using the Networked Avatar building block on top of Matchmaking, Hand Tracking, and Passthrough. The goal is an experience where users can see other players' avatars, with their hand movements, overlaid on the real world via passthrough; this creates the effect of people walking around, talking, and interacting with things in a room they are not physically in. My issue: the host can see other players' avatars and other players can see the host's avatar, but when the host looks down, they also see their own avatar's arms connected to their hands and body, which I do not want. Is there a way for each player to see only their normal hand prefabs/meshes from their own perspective, while other connected players see their full avatar, and vice versa?
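A common Unity pattern for this, independent of any avatar-specific API, is to move the locally owned avatar's renderers onto a dedicated layer and exclude that layer from the local camera's culling mask. The layer change happens only on the local client, so remote players still render the full avatar. A minimal sketch, assuming a custom layer named "LocalAvatar" has been added in Project Settings > Tags and Layers (the `avatarRoot` and camera references are placeholders you would wire to your own rig):

```csharp
using UnityEngine;

// Sketch: hide the local player's own networked avatar from their camera
// while remote clients continue to see it.
public static class LocalAvatarHider
{
    public static void HideFromLocalCamera(GameObject avatarRoot, Camera localCamera)
    {
        int layer = LayerMask.NameToLayer("LocalAvatar");
        SetLayerRecursively(avatarRoot.transform, layer);
        // Clear the layer's bit so this camera skips those renderers.
        localCamera.cullingMask &= ~(1 << layer);
    }

    static void SetLayerRecursively(Transform t, int layer)
    {
        t.gameObject.layer = layer;
        foreach (Transform child in t)
            SetLayerRecursively(child, layer);
    }
}
```

Run this only on the locally owned avatar instance, and leave your hand prefabs on their original layer so they stay visible. If the Meta Avatars SDK exposes a first-person or "hide local avatar" option directly, that would be the cleaner route.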
Unity Editor Crash During Play Mode with Link and Hand/Body tracking

Description: Since updating to Meta Horizon Link v85.0.0.239.552, developers using Unity 6 experience an immediate crash to desktop upon entering Play Mode. The crash occurs during the initialization of the XR tracking subsystems. The error originates from XR_EXT_hand_tracking.dll and/or XR_FB_body_tracking.dll, located in C:\Program Files\Meta Horizon\Support\oculus-runtime\client-plugins\x64\.

Environment details:
- Unity Version: Unity 6.2 (6000.2.14f1)
- Meta Horizon Link Version: 85.0.0.239.552
- Horizon OS Version: 2.3 (Build 5208318.1010.520)

Error signature:
- Exception Code: 0xc0000409 (Stack Buffer Overrun)
- Faulting Modules: XR_EXT_hand_tracking.dll and/or XR_FB_body_tracking.dll
- Path: C:\Program Files\Meta Horizon\Support\oculus-runtime\client-plugins\x64\

Workaround: the .bak method

If you are encountering this crash and need to continue development using controllers, you can "hide" the unstable tracking DLLs from the Unity Editor. This bypasses the memory fault during the Play Mode handshake.
1. Close Unity and the Meta Horizon Link PC app.
2. Navigate to: C:\Program Files\Meta Horizon\Support\oculus-runtime\client-plugins\x64\
3. Locate XR_EXT_hand_tracking.dll and XR_FB_body_tracking.dll.
4. Rename them by adding .bak to the end (e.g., XR_EXT_hand_tracking.dll.bak).
5. Restart Unity.

Note on PTC: This version (85.0.0.239.552) is currently circulating in the Public Test Channel. If you are enrolled in PTC, opting out may roll you back to a stable version, though the .bak method is a faster fix for those who need to stay on v85/Horizon OS 2.3 for other features.

Note on builds: This only affects the PC Link runtime used for Editor Play Mode. Native Quest builds (.apk) are unaffected, as they use the headset's internal drivers; tracking will function normally on-device.

Link to first access post of same issue
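The rename steps above can be scripted. A sketch for an elevated PowerShell prompt, using the path given in this post (adjust it if your install location differs, and close Unity and the Link app first):

```powershell
# Hide the unstable tracking DLLs from the Unity Editor's Play Mode
# handshake. Requires Administrator rights (Program Files is protected).
$dir = "C:\Program Files\Meta Horizon\Support\oculus-runtime\client-plugins\x64"
foreach ($dll in "XR_EXT_hand_tracking.dll", "XR_FB_body_tracking.dll") {
    Rename-Item -Path (Join-Path $dir $dll) -NewName "$dll.bak"
}
# To restore hand/body tracking later, rename the files back by
# stripping the .bak suffix.
```

This is a convenience wrapper for the manual steps, nothing more; it does not change what the workaround does or its limitations.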
Comparative Review: Functional Similarity and IP Liability in Hand Tracking

If the end result (the specific hand poses and gestures) is functionally identical to Meta's, can they take legal action against me even if I used a completely different implementation method? Does legal liability extend to the user experience and functionality, rather than just the source code?

Drone Core Command
Summary: Drone Core Command is a gesture-based pet combat game where players command their pet mech to take control of the battlefield. Players can customize their loadout and mech abilities using "drone cores", and will be able to repair their mechs after a battle back at their hub. This first week of development focused on establishing viability for the concept and the main features that will be used in combat. Players can use their left hand to spawn a series of drone cores to pick from, and install them in the gun mounted on their right arm, either for direct damage or for mech targeting.

Features:
- Inventory: players can gesture to open and select a drone core from their inventory to place in their arm gun
- Arm gun: either a source of direct damage or targeting for the player's pet drone
- Mech stats: a display mounted on the player's inventory arm that shows the state of the mech and any selected drone cores in use
- Autonomous vs. player-assisted targeting: if players use a drone core meant for mech control, they can point at their desired targets to direct their mech
- Map overview: this will be how players move their mech around the field at greater distances. (Not sure this feature will make it all the way to the end of the competition, but it's worth testing to explore board-game-like gesture control. At the very least, the map overview will provide battlefield stats.)

UX challenges:
- Context-sensitive and accurate gesture detection is pretty challenging. Hand poses detected at the right time, and in the right context, are an important part of making the game feel intuitive and easy to play. This includes designing the game so the player never needs to block the headset's sensors in a way that would hitch hand detection.
- No player movement: this game is designed specifically to bring the world to the player rather than move the player around the world. Players won't need to move around, as the focus is on loadout interfaces and battlefield control from a distance.

Next steps:
- Player journey: building out the framework of the game so players can start at a main menu scene, load into their hub, deploy to the battlefield, and return to the hub.
- Basic hub functionality: this is where players select drone cores from their library (or maybe even augment a drone core for additional player agency), select and review a mission map, and repair their mech from previous battle damage. A stretch goal here is building out the gesture-based mech repair system.
- Map overview iteration: I'm hoping to build a simple pick-up-and-place system on the battlefield map to move the mech across larger distances while maintaining good player visibility for enemy targeting.

Inspiration:
- IP: I have always enjoyed mech-type IP; my early days of gaming included running around in a Timberwolf.
- Input: I did a lot of research into motion-controller interactions when I built MageWorks (a VR game on Quest), and targeting systems when I built BlastPoint (a mobile AR game). Taking those learnings to the next step with gestures is part of my motivation for this project.
- Gameplay: I wanted to combine pet-based combat and player agency (e.g., as exhibited by WoW's pet-based classes like the hunter and warlock) with a hybrid strategic/tactical style of play, much like old-school games such as Final Fantasy Tactics Advance. That's where moving the mech around different quadrants/regions comes in, both as a strategy to keep it alive and to give players a sense of target priority at macro and micro scales.

Credits: While I'm working on this project as a solo dev, I'm using the marketplace for art, sound, and animation assets as placeholders while I focus on scripting the game in UE5. Many thanks to the marketplace community, forum contributors, and the overall XR community for the many resources they have made available over the years.

If you made it this far, thanks for reading, and feel free to reach out with any questions or feedback. Once I get a build up and running in a pre-alpha channel on the Quest app store, I'm happy to add folks to the build for testing. In the meantime, I'll try to keep this thread updated as the game progresses.