Build Intuitive Experiences with Hands & Microgestures
You can build powerful immersive experiences with Hand Tracking using the Meta Interaction SDK. Watch as Meta Engineering Manager Jesse Keogh shows you the essential steps to go from a blank Unity project to a fully interactive prototype with UI and 3D object interactions. You’ll walk away with production-ready techniques for Quest-specific optimization and system-backed gestures like Pinch and Microgestures that deliver reliable input across all users.

💡 After viewing this session, you’ll understand how to:
Configure the Interaction SDK using Quick Actions
Implement system-backed gestures like Pinch
Streamline project creation with the Meta Quest Developer Hub
Test interactions using the XR Simulator

🎬 CHAPTERS
👋 INTRODUCTION
🕒 00:00 - Introduction to the Hands Workshop
🕒 02:41 - Overview of Hand Tracking System Features
⚙️ CORE CONCEPTS
🕒 05:35 - System-Backed Gestures & Interaction SDK
🕒 08:25 - Tour of Interaction SDK Capabilities
🕒 11:48 - Design Principles & Best Practices
🛠️ WORKFLOW & DEMO
🕒 14:23 - Project Setup with Meta Quest Developer Hub
🕒 20:39 - Live Demo: Building UI Interactions
🕒 34:02 - Live Demo: Adding Grabbable 3D Objects
✅ FINAL THOUGHTS
🕒 37:15 - Q&A: Advanced Use Cases & Debugging

📖 HAND TRACKING EXAMPLES REFERENCED IN THIS VIDEO
🔖 Interaction SDK Overview: https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/
🔖 Hand Tracking Design Guidelines: https://developers.meta.com/horizon/design/hands/

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development.
Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
How to Understand your Player and their Space: Context-Aware MR
Hey folks! Last week I ran a small workshop about Context-Aware MR: how to understand your player and their space. I wanted to drop a summary here, and I’m looking forward to your tips, ideas, or comments. 😊

But here is the big question: what tools can we use to really “understand” a player? Here are the four musketeers of Context-Aware MR: Player Input, MRUK, the Depth API, and PCA.

1) Player Input

You might think player input is a bit trivial, but there’s actually a lot we can extract from it to understand our player’s behaviour. Of course, we get the basics: the position and rotation of the head and controllers, plus input from buttons and triggers. But that’s just the beginning.

With features like hand tracking and voice recognition, we unlock much more. Hand tracking lets us detect custom hand poses and gestures. Voice recognition allows for voice commands and even detecting voice loudness, which can be used to animate a character’s mouth or trigger actions like blowing out a candle. By combining head and controller tracking, we can figure out not only where the player is looking but also estimate their overall body pose. These are just a few examples, but the creative possibilities are huge.

While these features also apply to standard VR, let’s now move on to tools that are specific to Mixed Reality, starting with our next musketeer: MRUK!

2) MRUK

To understand MRUK, I first need to explain what Scene Data is. Scene Data is an approximation of the player’s environment, set up outside of your app through the Meta system. It gives you access to either a full mesh or a simplified version of the room made of labeled boxes, which lets you identify whether an element is a wall, a door, the floor, or a piece of furniture.

The Mixed Reality Utility Kit (MRUK) is a powerful set of tools built on top of Scene Data. It helps you place, align, and make virtual content interact with the real world.
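To make the labeled-boxes idea concrete, here is a toy, engine-agnostic sketch of a room as labeled boxes, written in plain Python for readability. In a real app you would query these elements through MRUK’s C# API; everything below (the `RoomBox` class, labels, helpers) is made up for illustration:

```python
from dataclasses import dataclass

# Toy stand-in for Scene Data: each room element is a labeled box.
# (In a real app these come from MRUK; the names here are invented.)
@dataclass
class RoomBox:
    label: str    # e.g. "WALL", "FLOOR", "TABLE"
    center: tuple # (x, y, z) in meters
    size: tuple   # (width, height, depth) in meters

room = [
    RoomBox("FLOOR", (0.0, 0.0, 0.0), (4.0, 0.01, 3.0)),
    RoomBox("WALL",  (0.0, 1.5, -1.5), (4.0, 3.0, 0.1)),
    RoomBox("TABLE", (1.0, 0.4, 0.5), (1.2, 0.8, 0.6)),
]

def elements(room, label):
    """All scene elements carrying a given semantic label."""
    return [b for b in room if b.label == label]

def overlaps_furniture_2d(room, x, z, radius=0.3):
    """Does a circle on the floor at (x, z) intersect any furniture footprint?"""
    for b in room:
        if b.label not in ("TABLE", "COUCH", "STORAGE"):
            continue
        half_w, half_d = b.size[0] / 2, b.size[2] / 2
        # Clamp the point to the box footprint, then check the distance.
        cx = min(max(x, b.center[0] - half_w), b.center[0] + half_w)
        cz = min(max(z, b.center[2] - half_d), b.center[2] + half_d)
        if (cx - x) ** 2 + (cz - z) ** 2 < radius ** 2:
            return True
    return False

# A "smart spawn point" check: on the floor, clear of furniture.
print(overlaps_furniture_2d(room, 1.0, 0.5))   # on the table -> True
print(overlaps_furniture_2d(room, -1.5, -0.5)) # open floor -> False
```

This is exactly the kind of bookkeeping MRUK does for you, in 3D and against the real scanned room.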
Here are some examples of what MRUK enables:
Smart spawn points on specific surfaces (like the floor or walls) while avoiding obstacles (like furniture)
Collision for your virtual content
Navmesh to move objects around the player’s space without bumping into real-world elements
Destructible scene mesh effects
Dynamic lighting effects on the real world
QR code and keyboard tracking
And more...

While MRUK is incredibly useful, keep in mind that Scene Data doesn’t update in real time. That’s where the Depth API comes in.

3) Depth API

The Depth API gives you real-time depth maps of what the user is currently seeing. This allows you to occlude virtual objects behind real-world elements, making them feel like a natural part of the environment and greatly increasing immersion. It also comes with a Depth Raycast Manager, which lets you detect collisions with real objects at runtime, perfect for dynamic content placement or interactions. It’s a great complement to the Scene Model, filling in the gaps that static scene data can’t cover. Despite its potential, it’s still underused in many XR projects.

4) Passthrough Camera Access (PCA)

We’ve had the first three tools for a while now. But recently, a game-changing feature was introduced: access to the passthrough camera! With camera access, you can:
Read the live image as a texture to do color picking, light estimation, or apply visual effects like blur
Feed the image to AI models for computer vision tasks like object detection

It opens a direct bridge between the real world and AI, and that’s huge for MR development. Good news: starting with version v83, new building blocks are available to help you set up PCA easily in your project.

To Conclude

Player Input, MRUK, Depth API, and Passthrough Camera Access form a powerful toolbox for building context-aware MR experiences. And now, with tools like PCA, creativity is more accessible than ever.
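As a tiny illustration of what even basic camera access enables, the light-estimation use case mentioned above can be as simple as averaging luminance over the frame. A minimal engine-agnostic sketch (plain Python for readability; in a Unity app you would sample the passthrough texture instead, and a frame here is just rows of (r, g, b) tuples in 0-255):

```python
def estimate_light(frame):
    """Return average luminance in [0, 1] using the Rec. 709 weights."""
    total, count = 0.0, 0
    for row in frame:
        for r, g, b in row:
            total += 0.2126 * r + 0.7152 * g + 0.0722 * b
            count += 1
    return (total / count) / 255.0 if count else 0.0

dark   = [[(10, 10, 10)] * 4] * 4    # a dim 4x4 test frame
bright = [[(240, 240, 240)] * 4] * 4 # a bright 4x4 test frame
print(round(estimate_light(dark), 3))    # ~0.039
print(round(estimate_light(bright), 3))  # ~0.941
```

You could feed this value into your virtual lights so holograms dim when the player turns the room lights off.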
We can finally build apps and games that truly adapt to each user and their real space. Hope you enjoyed this little summary and that you learned something new along the way. Go check out the links provided in this post if you want to learn more about our four musketeers, and if you have a tip on how you use these features in your app, share it down below! 😊

Useful links:
https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
https://developers.meta.com/horizon/documentation/unity/unity-mr-utility-kit-overview
https://developers.meta.com/horizon/documentation/unity/unity-depthapi-overview
https://developers.meta.com/horizon/documentation/unity/unity-pca-overview

Have a nice day! 👋

The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don’t just showcase design principles around specific features but also provide direct examples of how to use them, which is especially valuable right now for something like a hackathon.

For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let’s start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links:
https://github.com/oculus-samples/Unity-InteractionSDK-Samples
https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta’s XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity’s official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up the XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.
Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google’s Tilt Brush. Lets users paint and sculpt in 3D space, an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity’s official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.
Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

The World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes that blend real and virtual spaces.

Phanto (MR Reference App)
Links:
https://github.com/oculus-samples/Unity-Phanto
https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links:
https://www.meta.com/en-gb/experiences/discover/7041851792509764/
https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR “motifs”: small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room’s layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest’s Passthrough camera feed for effects, object detection, and image manipulation.
Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems, ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile, an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.

VR Hand Interactions Explained: Developer Deep Dive for Meta Quest (Unity & Unreal)
Unlock the full potential of hand interactions on Meta Quest in this technical deep dive from members of the Meta Interaction SDK team. Product Designer Stella Mühlhaus and Software Engineer Dave Nelson explore the core principles, best practices, and powerful tools available to create intuitive and engaging hands-first interactions.

💡 In this deep dive, you’ll learn:
The key advantages of hands over controllers: embodiment, ease of use, and flexibility.
How to implement core mechanics like Direct Touch, Grabbing, Hand Posing, and Gestures.
Advanced techniques for Locomotion, Improved Throwing, and physics-based Natural Interactions.
The latest system-level improvements in hand tracking, including Hands 2.3 and fast motion updates.
A complete overview of the Interaction SDK for both Unity and Unreal Engine.

Whether you’re starting a new project or adding hand tracking to an existing app, this session provides the expert guidance you need to make your hand interactions feel amazing.

🎙️ SPEAKERS
Stella Mühlhaus, Product Designer
Dave Nelson, Software Engineer

▶️ VIDEO CHAPTERS
👋 Introduction
🕒 00:00 - Welcome & Speaker Introductions
🕒 00:27 - Agenda Overview
🕒 00:47 - Why Build for Hands? Industry Trends & Core Benefits
✨ Core Interaction Mechanics
🕒 05:16 - Direct Touch & UI Interaction
🕒 06:04 - Grabbing, Posing & Object Interaction
🕒 07:27 - Pose & Gesture Recognition
🕒 08:14 - Hand Menus, Microgestures & Multimodal Input
🚀 Upcoming Features & Experiments
🕒 10:40 - Improved Throwing Mechanics
🕒 11:59 - Locomotion Experiments (Telepath, Walking Sticks, Climbing)
🕒 13:31 - Natural Interactions: Physics-Based Hands
🕒 15:39 - CARL: Custom Action Recognition Library
⚙️ System-Level Hand Tracking Improvements
🕒 17:21 - Overview of Core Tracking Updates
🕒 17:49 - Microgesture Improvements & Hands 2.3
🕒 19:55 - Fast Motion Improvements for High-Speed Tracking
🛠️ Building for Hands: Best Practices & Tools
🕒 21:25 - Best Practices for Designing Hands-First Experiences
🕒 25:31 - Deep Dive: Interaction SDK for Unity & Unreal
🕒 28:27 - Resources: Sample Apps to Get You Started
🕒 29:40 - Closing Remarks

📚 RESOURCES
➡️ Hands Design Guidelines: https://developers.meta.com/horizon/design/hands/
➡️ Interaction SDK: https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
➡️ Microgestures OpenXR extension: https://developers.meta.com/horizon/documentation/unity/unity-microgestures

🔗 CONNECT WITH US
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start