Meta Quest Unity Real Hands Building Block not showing real hands
Hi all! I'm somewhat new to VR development, especially mixed reality. I'm trying to use Meta's Real Hands building block, but I can't get it to work. I have a very basic scene with some of the fundamental building blocks (Camera Rig, Passthrough, Passthrough Camera Access, Interaction Rig), along with the Real Hands building block and a single cube. When I build the project to my Quest 3 and move my hands in front of the cube, I can only see the virtual hands - the occlusion does not reveal my real hands (i.e. it behaves the same as it did before I added the Real Hands building block). Why is this, and how can I fix it?

Unity Version: 6000.3.4f1
Meta Quest Packages: Meta XR Core SDK (85.0.0), Meta MR Utility Kit (85.0.0), Meta XR Interaction SDK (85.0.0)

Steps to Replicate:
1. Create a new empty scene.
2. Add the following building blocks: Camera Rig, Passthrough, Passthrough Camera Access, Interaction Rig, Real Hands.
3. Add a cube at (0, 0, 3).
4. Build the project and deploy to the Quest.
5. Wave your hands in front of the cube - only virtual hands are visible, not real hands.

How to Set Up Hand Tracking with the Meta All-in-One SDK | Mentor Workshop
If you're developing for Meta Quest and want reliable hand interactions, this is your starting point. In this session, Start Mentor Quentin Valembois (Valem) walks through the All-in-One SDK from foundational setup to advanced features, covering Building Blocks and Quick Actions before moving into gesture inputs and custom throwing physics. Plus, stick around until the end to catch the live Q&A featuring a member of Meta's Input Framework team.

CHAPTERS
00:00 Introduction
01:13 Setting Up Hand Tracking
08:41 The Interaction SDK
14:48 Automating Setup with Quick Actions
22:08 Triggering Inputs with Hands
26:29 Advanced Locomotion & Physics
33:22 Q&A with Meta's Input Framework Team

RESOURCES
- Featured GitHub Project: https://github.com/Meta-Horizon-Start-Program/MasterHandTracking
- Meta SDK V83: New Hand Tracking Locomotion Features: https://www.youtube.com/watch?v=V5BudZA9b9Q
- Developers Blog: https://developers.meta.com/resources/blog/
- Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

CONNECT WITH US
- Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Hands Only mode not working on Unity
On Unity 6 with v85 of the Meta SDK, I set "Hand Tracking Support" to "Hands Only" mode, so controllers shouldn't be available in the build. However, controllers are still available in my app. I then tried the "Controllers Only" mode, which correctly disables hand tracking and shows a prompt to use controllers when I launch the app. Why doesn't the same enforcement happen for the hands-only mode?
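One place worth checking is where that dropdown actually stores its value. Below is a minimal editor-side sketch, assuming the Meta XR Core SDK's OVRProjectConfig API (GetProjectConfig / CommitProjectConfig, which back the "Hand Tracking Support" setting and are baked into the Android manifest at build time); the menu item name is hypothetical, and exact API names may differ across SDK versions.

```csharp
// Editor sketch: verify and force the project-level hand tracking mode.
// Assumes the Meta XR Core SDK's OVRProjectConfig API is available.
using UnityEditor;
using UnityEngine;

public static class HandTrackingConfigCheck
{
    [MenuItem("Tools/Check Hand Tracking Support")] // hypothetical menu path
    public static void CheckHandTrackingSupport()
    {
        // OVRProjectConfig holds the per-project Quest settings that are
        // written into the Android manifest at build time.
        var config = OVRProjectConfig.GetProjectConfig();
        Debug.Log($"Current Hand Tracking Support: {config.handTrackingSupport}");

        // Forcing Hands Only here should produce a manifest that declares
        // hand tracking as required; if a stale manifest is being reused,
        // committing the config and rebuilding may resolve the mismatch.
        config.handTrackingSupport = OVRProjectConfig.HandTrackingSupport.HandsOnly;
        OVRProjectConfig.CommitProjectConfig(config);
    }
}
```

If the logged value is already HandsOnly but controllers still appear, the issue is more likely a cached manifest or an OVRManager override in the scene than the project config itself.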
Workshop | Practical Hand Tracking in Unity

Are you ready to level up your app's immersion? Join Start Mentor Valem for a hands-on workshop building on last month's advanced hand tracking session, where you will work directly in a Unity project, implementing finger pinch, microgestures, and hand pose recognition. By the end of this session, you'll know how to read these inputs, integrate them step by step, and compare them to choose the right interaction method based on best practices and design goals. Join on Zoom.

What's New: Meta Horizon OS v85
V85 of the Meta XR SDK brings powerful new tools and improvements for your XR projects. Join Start Mentor Valem for a rundown of the top features and changes you can implement right away:

- Spatial Test Framework provides automated testing for validating your experiences across multiple room configurations.
- Multiplayer Building Blocks now support Photon Fusion 2.1, enabling more robust and scalable multiplayer experiences.
- Advanced AI Building Blocks add image segmentation and bounding box support.
- Enhancements to hand and controller interactions include unified ray visual indicators and improved locomotion controls.
- Guidance on important deprecations and how to future-proof your projects.

Hear directly from mentors, ask questions, and see how you can leverage the latest in Meta Horizon OS v85 to elevate your development. Join on Zoom.

Recentering gesture for impaired users
I am building a hand-control-based VR app for users with impaired mobility, and I have two challenges related to the pinch-and-hold gesture for recentering. For some users the gesture is exceedingly hard or impossible to perform (false negatives); for others, it is sometimes triggered accidentally (false positives). I understand Meta's desire to keep this gesture universal across all third-party apps, but unfortunately it is not universally viable for all users, and I need a solution or my app will never ship.

I am prepared to roll my own recentering system that manipulates the in-game view in response to a hardware "easy button" press. However, to implement this I still need to know when an actual pinch-and-hold gesture is performed, so that I can recalibrate my own system accordingly. Unfortunately I have not found any functioning API or telemetry that might hint that this has happened; I have tried several OpenXR and Meta Core APIs, but they all seem to be no-ops on the Quest 3.

Can anyone recommend a solution? I'm using Unity 6.3, OpenXR, and the Meta Core SDK. I do not depend on any other Meta SDKs but am willing to add them if they solve this problem.
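Until a system-level callback turns up, one workaround is to detect the pinch-and-hold yourself and treat it as a hint that the system gesture may have fired. The sketch below assumes the Meta XR Core SDK's OVRHand component; the one-second hold threshold and the component wiring are hypothetical, and this will not perfectly mirror the system's own recognizer, which is undocumented.

```csharp
// Runtime sketch: approximate pinch-and-hold detection via OVRHand.
// The hold duration is a hypothetical stand-in for the system gesture's timing.
using UnityEngine;

public class PinchHoldDetector : MonoBehaviour
{
    public OVRHand hand;              // assign the left or right OVRHand in the rig
    public float holdSeconds = 1.0f;  // hypothetical hold duration

    private float pinchTimer;

    void Update()
    {
        // GetFingerIsPinching reports whether the index finger and thumb touch.
        bool pinching = hand != null &&
                        hand.IsTracked &&
                        hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        pinchTimer = pinching ? pinchTimer + Time.deltaTime : 0f;

        if (pinchTimer >= holdSeconds)
        {
            pinchTimer = 0f;
            OnPinchAndHold();
        }
    }

    private void OnPinchAndHold()
    {
        // Recalibrate your custom recentering system here.
        Debug.Log("Pinch-and-hold detected; recalibrating custom recenter.");
    }
}
```

OVRHand also exposes GetFingerPinchStrength, which could be used to loosen or tighten the trigger for users who struggle with a full pinch.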
Stringscape: Turning Hand Distance into Pitch

I'm currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here. The concept is simple: you stretch a glowing "string" between your two hands, and the world-space distance between them controls the pitch.

- Closer hands -> higher pitch
- Farther apart -> lower pitch

The experience is designed to be more of a creative playground than a structured music tool. I'd love to hear your thoughts. It's currently in Early Access on Quest as well if anyone is curious to try it. Thanks!
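For anyone curious what that mapping looks like in code, here is a minimal sketch of the distance-to-pitch idea using only the plain Unity API; the hand transforms, the metre ranges, and the pitch bounds are hypothetical placeholders, not Stringscape's actual values.

```csharp
// Sketch: map the distance between two hand transforms onto an AudioSource pitch.
// Closer hands -> shorter "string" -> higher pitch, matching the post's description.
using UnityEngine;

public class StringPitchController : MonoBehaviour
{
    public Transform leftHand;   // e.g. hand anchor transforms from the camera rig
    public Transform rightHand;
    public AudioSource voice;    // a looping tone whose pitch we modulate

    public float minDistance = 0.1f; // metres; hands nearly together
    public float maxDistance = 1.2f; // metres; arms spread
    public float maxPitch = 2.0f;    // pitch multiplier at minDistance
    public float minPitch = 0.5f;    // pitch multiplier at maxDistance

    void Update()
    {
        float d = Vector3.Distance(leftHand.position, rightHand.position);

        // Normalise distance into [0, 1], then invert it so that
        // closer hands produce a higher pitch.
        float t = Mathf.InverseLerp(minDistance, maxDistance, d);
        voice.pitch = Mathf.Lerp(maxPitch, minPitch, t);
    }
}
```

Smoothing the pitch with Mathf.SmoothDamp instead of assigning it directly would avoid audible zipper artifacts as the hands move quickly.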
What's New: v83 Meta SDK Update
V83 of the Meta XR SDK introduces major improvements to core XR functionality. Join Start Mentor Valem for a rundown of the top features and changes you can implement in your project right away: new AI Building Blocks, enhancements to hand tracking, new locomotion options, physics-based hand interactions, and more. Join on Zoom.