Passthrough Camera Access (PCA) + AI Building Blocks on Quest 3 | Developer Workshop
With Passthrough Camera Access (PCA), your Quest 3 app reads the real-world camera feed as GPU textures you can use in Unity for mixed reality. In this session, the Meta XR team walks through the new PCA component in the Meta XR SDK and its privacy-first design that replaces the internal Medici prototype. You’ll see how PCA connects with AI Building Blocks for object detection and guidance, and how to use these tools in your submission to the 2025 Meta Horizon Start Developer Competition.

💡 By viewing this session, you’ll learn how to:
Set up the PassthroughCameraAccess component in Unity and choose the right permissions and resolution for your use case.
Map 2D coordinates from the camera to 3D rays with depth so virtual content stays locked to real surfaces.
Use AI Building Blocks for on-device object detection and LLM prompts that respond to what the camera sees.
Plan a PCA-powered prototype that meets the requirements of the Meta Horizon Start Developer Competition.

🎬 CHAPTERS
👋 INTRODUCTION & OVERVIEW
🕒 00:00 - Introduction and Agenda
🕒 00:52 - What Is Passthrough Camera Access (PCA)?
🕒 03:20 - Inspiring Use Cases: Industry & Community Projects
🛠️ TECHNICAL IMPLEMENTATION
🕒 07:29 - Technical Deep Dive: The New PCA Component in SDK v81
🕒 12:10 - New & Upcoming PCA Features
🕒 15:52 - Mastering Coordinate Systems: 2D to 3D and 3D to 2D
🤖 AI INTEGRATION & OPTIMIZATION
🕒 19:47 - Introducing the AI Building Blocks
🕒 22:51 - Deep Dive: The Object Detection Building Block
✅ Q&A AND BEST PRACTICES
🕒 24:46 - Competition Details & Q&A Kick-off
🕒 28:14 - Q&A: Handling Latency and Inspiring Use Cases
🕒 40:44 - Q&A: PCA vs. Scene API & Visual Design Best Practices
🕒 45:46 - Q&A: Offline Voice Recognition & Final Remarks

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
How to Understand Your Player and Their Space: Context-Aware MR
Hey folks! Last week I ran a small workshop about Context-Aware MR and how to understand your player and their space. I wanted to drop a summary here, and I’m looking forward to your tips, ideas, or comments. 😊

But here is the big question: what are the tools we can use to really “understand” a player? Here are the four musketeers of Context-Aware MR: Player Input, MRUK, Depth API, and PCA.

1) Player Input
You might think player input is a bit trivial, but there’s actually a lot we can extract from it to understand our player’s behaviour. Of course, we get the basics: the position and rotation of the head and controllers, plus input from buttons or triggers. But that’s just the beginning. With features like hand tracking and voice recognition, we unlock much more. Hand tracking lets us detect custom hand poses or gestures. Voice recognition allows for voice commands and even detecting voice loudness, which can be used to animate a character’s mouth or trigger actions like blowing out a candle. By combining head and controller tracking, we can figure out not only where the player is looking but also estimate their overall body pose. These are just a few examples, but the creative possibilities are huge. While these features also apply to standard VR, let’s now move on to tools that are specific to Mixed Reality, starting with our next musketeer: MRUK!

2) MRUK
To understand MRUK, I first need to explain what Scene Data is. Scene Data is an approximation of the player’s environment, set up outside of your app through the Meta system. It gives you access to either a full mesh or a simplified version of the room using labeled boxes that let you identify whether an element is a wall, a door, the floor, or a piece of furniture. The Mixed Reality Utility Kit (MRUK) is a powerful set of tools built on top of Scene Data. It helps you place, align, and make virtual content interact with the real world.
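To make the Scene Data side of this concrete, here is a minimal sketch (not from the workshop; the exact MRUK type and member names are from memory and may differ between SDK versions) that waits for the scene to load, then drops a marker on every anchor labeled as a table:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Hedged sketch: spawn a marker on each TABLE anchor in the scanned room.
// MRUK API names (MRUK.Instance, GetCurrentRoom, Anchors, Label) may vary
// slightly between Meta XR SDK versions -- check the MRUK docs for yours.
public class TableMarkerSpawner : MonoBehaviour
{
    public GameObject markerPrefab; // assign in the Inspector

    void Start()
    {
        // Scene Data loads asynchronously; wait for MRUK to finish.
        MRUK.Instance.RegisterSceneLoadedCallback(SpawnMarkers);
    }

    void SpawnMarkers()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        foreach (MRUKAnchor anchor in room.Anchors)
        {
            // Each anchor carries a semantic label (wall, floor, table, ...).
            // Label is a flags enum, hence the bitwise check.
            if ((anchor.Label & MRUKAnchor.SceneLabels.TABLE) != 0)
            {
                Instantiate(markerPrefab,
                            anchor.transform.position,
                            anchor.transform.rotation);
            }
        }
    }
}
```

Attach this to any GameObject in a scene that already contains the MRUK setup (e.g. the MRUK Building Block), and assign a prefab in the Inspector.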
Here are some examples of what MRUK enables:
- Smart spawn points on specific surfaces (like the floor or walls) while avoiding obstacles (like furniture)
- Collision for your virtual content
- Navmesh to move objects around the player’s space without bumping into real-world elements
- Destructible scene mesh effects
- Dynamic lighting effects on the real world
- QR code and keyboard tracking
- And more...

While MRUK is incredibly useful, keep in mind that Scene Data doesn’t update in real time. That’s where the Depth API comes in.

3) Depth API
The Depth API gives you real-time depth maps of what the user is currently seeing. This allows you to occlude virtual objects behind real-world elements, making them feel like a natural part of the environment and greatly increasing immersion. It also comes with a Depth Raycast Manager, which lets you detect collisions with real objects at runtime, perfect for dynamic content placement or interactions. It’s a great complement to the Scene Model, filling in the gaps that static scene data can’t cover. Despite its potential, it’s still underused in many XR projects.

4) Passthrough Camera Access (PCA)
We’ve had the first three tools for a while now. But recently, a game-changing feature was introduced: access to the passthrough camera! With access to the camera, you can:
- Read the live image as a texture to do color picking, light estimation, or apply visual effects like blur
- Feed the image to AI models for computer vision tasks like object detection

It opens a direct bridge between the real world and AI, and that’s huge for MR development. Good news: starting with version v83, new Building Blocks are available to help you set up PCA easily in your project.

To Conclude
Player Input, MRUK, Depth API, and Passthrough Camera Access form a powerful toolbox for building context-aware MR experiences. And now, with tools like PCA, creativity is more accessible than ever.
We can finally build apps and games that truly adapt to each user and their real space. Hope you enjoyed this little summary and that you learned something new along the way. Go check out the links provided in the post if you want to learn more about our four musketeers, and if you have a tip on how you use these features in your app, share it down below! 😊

Useful links:
https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
https://developers.meta.com/horizon/documentation/unity/unity-mr-utility-kit-overview
https://developers.meta.com/horizon/documentation/unity/unity-depthapi-overview
https://developers.meta.com/horizon/documentation/unity/unity-pca-overview

Have a nice day! 👋

The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don’t just showcase design principles around specific features but also provide direct examples of how to use them, which is especially important right now for something like a hackathon. For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let’s start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links: https://github.com/oculus-samples/Unity-InteractionSDK-Samples
https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta’s XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity’s official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.
Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google’s Tilt Brush. Lets users paint and sculpt in 3D space, an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity’s official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.
Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

A World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes blending real and virtual spaces.

Phanto (MR Reference App)
Links: https://github.com/oculus-samples/Unity-Phanto
https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links: https://www.meta.com/en-gb/experiences/discover/7041851792509764/
https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR “motifs”: small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room’s layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest’s Passthrough camera feed for effects, object detection, and image manipulation.
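To give a taste of what the Passthrough Camera API samples above demonstrate, here is a hedged sketch of reading the live feed for color picking. It uses only Unity’s standard WebCamTexture API; the assumption (per the PCA documentation) is that on Quest the passthrough cameras appear as regular camera devices once the headset camera permission has been granted:

```csharp
using UnityEngine;

// Hedged sketch: sample the color at the center of the passthrough feed.
// Assumes the headset camera permission has already been requested and
// granted, so the passthrough cameras are listed in WebCamTexture.devices.
public class PassthroughColorPicker : MonoBehaviour
{
    WebCamTexture cameraTexture;

    void Start()
    {
        // Pick the first available camera device.
        if (WebCamTexture.devices.Length == 0) return;
        cameraTexture = new WebCamTexture(WebCamTexture.devices[0].name);
        cameraTexture.Play();
    }

    void Update()
    {
        if (cameraTexture == null || !cameraTexture.didUpdateThisFrame) return;
        // GetPixel reads back to the CPU, so keep reads sparse (one pixel here).
        Color center = cameraTexture.GetPixel(cameraTexture.width / 2,
                                              cameraTexture.height / 2);
        Debug.Log($"Center passthrough color: {center}");
    }
}
```

The same texture can instead stay on the GPU (e.g. assigned to a material) for effects like blur, or be downloaded as a whole frame when feeding an AI model.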
Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems, ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile, an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.

What's Trending in VR Development? | Tech, Content & Community
What does it take to succeed as a VR developer today? Join renowned YouTube creators and expert developers Dilmer Valecillos, Quentin Valembois, and Roberto Coviello in a roundtable discussion hosted by Darby Blaney. They explore the latest tech trends, such as the Passthrough Camera Access API and essential optimization tools, and share proven strategies for accelerating your workflow, creating content, and building an engaged community.

💡 In this session, you’ll learn:
Insights into the latest VR technologies, including the creative potential of the Passthrough Camera Access API and underutilized tools like the Immersive Debugger.
Best practices for optimizing performance and accelerating your development workflow using tools like Building Blocks and open-source projects.
Proven strategies for building a personal brand, creating effective content, and fostering an engaged community around your work.

🎙️ HOST
Darby Blaney, Metaverse Program Manager, Start Lead

🎙️ SPEAKERS
Roberto Coviello, Software Engineer
Quentin Valembois, Start Mentor
Dilmer Valecillos, Developer Advocate

▶️ VIDEO CHAPTERS
👋 Introduction
🕒 00:00 - Welcome & Host Introduction
🕒 00:37 - Meet the Panelists
🚀 Part 1: The Developer Journey
🕒 02:13 - Panelist Origin Stories: How They Got Started in VR
⚙️ Part 2: Tech Trends & Tools
🕒 08:22 - Tech Trend: The Passthrough Camera API & AI
🕒 13:12 - Underutilized Tools: Immersive Debugger & Runtime Optimizer
🕒 17:12 - Best Practices for Performance Optimization
🕒 19:41 - Accelerating Development with Building Blocks & Open-Source Samples
📈 Part 3: Content & Community Growth
🕒 23:20 - Strategies for Content Creation & Building a Personal Brand
🕒 28:18 - The Role of Community Feedback in Development
🕒 32:47 - Inside the Meta Start Mentor Program
✅ Conclusion
🕒 35:06 - Final Thoughts & Thank You

📚 RESOURCES
➡️ Dilmer’s YouTube channel: https://www.youtube.com/@dilmerv
➡️ Valem’s YouTube channel: https://www.youtube.com/@ValemTutorials
➡️ Roberto’s YouTube channel: https://www.youtube.com/@xrdevrob
➡️ Spatial Scanner project available on GitHub: https://github.com/meta-quest/Meta-Spatial-SDK-Samples
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start