Building a Social VR Game From Scratch Part 1: Entitlement
So, I am building Baby VR, a social VR game that I will build with the community on YouTube. While planning the curriculum, I realized that before working on the core things like Networking, Voice Chat or Game Mechanics, we need to first integrate Baby VR into the Meta Horizon Store. And it starts with the Entitlement. So, in this blog post, I will show you how I did the Entitlement for the Meta Horizon Store. Let's get started. Introduction If you're building a VR app for Meta Quest, you absolutely need to implement entitlement checking. There's no way around it. Without it, anyone could potentially access your app without actually purchasing it from the Meta Quest Store. Think of entitlement as your app's bouncer - it checks if someone actually paid to get in before letting them through the door. Meta requires entitlement checks for apps published on their store, and it's really not optional if you want to protect your work and ensure users have legitimately obtained your application. According to Meta's official documentation. In this blog post, I'll walk you through a real-world implementation that handles all the edge cases - retry logic, error handling, and proper user data retrieval. Let's dive in. How It Works: The Complete Flow Before we get into the code, here's the big picture of how the entitlement process flows: The system consists of a few key components working together=> MetaStoreManager - The main orchestrator that kicks everything off EntitlementHandler - Does the heavy lifting of verification Event System - Notifies other parts of your game when entitlement completes MetaPlayerData - Stores the user info we retrieve Step-by-Step Implementation 1. The MetaStoreManager: Your Entry Point The `MetaStoreManager` is a Unity `MonoBehaviour` that orchestrates everything. It's simple - it initializes the entitlement handler and listens for when the entitlement completes: When you call `Initialize()`, it kicks off the entitlement process. 
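The post's original `MetaStoreManager` listing isn't reproduced here, so below is a minimal sketch of how the pieces described above might fit together. The class and field names come from this post; the internals are my assumptions built on the Oculus Platform SDK calls the post mentions (`Core.AsyncInitialize`, `Entitlements.IsUserEntitledToApplication`, plus `Users.GetLoggedInUser` and `Users.GetUserProof` for the user data and nonce), not the author's actual implementation:

```csharp
using System;
using Oculus.Platform;
using UnityEngine;

// Sketch only: EntitlementHandler's retry loop is omitted and the wiring
// is an assumption based on the description in this post.
public class MetaStoreManager : MonoBehaviour
{
    // Other systems subscribe to this to know when entitlement finishes.
    public static event Action<MetaPlayerData> OnEntitlementCompleted;

    public MetaPlayerData PlayerData { get; private set; }

    public void Initialize()
    {
        Core.AsyncInitialize().OnComplete(_ =>
        {
            // Step 2 from the post: verify the purchase against Meta's servers.
            Entitlements.IsUserEntitledToApplication().OnComplete(msg =>
            {
                if (msg.IsError)
                {
                    Debug.LogError("Entitlement failed: " + msg.GetError().Message);
                    UnityEngine.Application.Quit(); // or hand off to retry logic
                    return;
                }

                // Remaining steps: fetch user data and the cryptographic nonce.
                Users.GetLoggedInUser().OnComplete(userMsg =>
                {
                    PlayerData = new MetaPlayerData
                    {
                        UserId = userMsg.Data.ID.ToString(),
                        UserName = userMsg.Data.DisplayName,
                        AliasName = userMsg.Data.OculusID
                    };
                    Users.GetUserProof().OnComplete(proofMsg =>
                    {
                        PlayerData.OculusNonce = proofMsg.Data.Value;
                        OnEntitlementCompleted?.Invoke(PlayerData);
                    });
                });
            });
        });
    }
}
```

A typical usage would be calling `Initialize()` from your splash scene and gating the rest of the game flow on `OnEntitlementCompleted`.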
Once complete, it stores the player data for use throughout your game.

2. The EntitlementHandler: The Core Logic

This is where the real work happens. The handler performs a four-step verification process with automatic retry logic (up to 3 attempts with 2-second delays between retries). The `CheckEntitlement()` method runs four critical steps in sequence; if any step fails, the whole process fails and retries.

Step 2 is the critical one: `CheckUserEntitlement()` calls `Entitlements.IsUserEntitledToApplication()`, which queries Meta's servers to verify the user actually purchased your app. This is where the piracy protection happens. The other steps retrieve user data (ID, display name, Oculus ID) and generate a cryptographic proof (nonce) that you can use for server-side verification later.

3. The Data Structure

After successful entitlement, you get a `MetaPlayerData` object containing:

```csharp
public class MetaPlayerData
{
    public string UserId;      // Unique user identifier
    public string UserName;    // Display name
    public string AliasName;   // Oculus ID
    public string OculusNonce; // Cryptographic proof for server verification
}
```

The `OculusNonce` is particularly important: it's a proof token you can send to your backend server to verify the user's identity securely.

Best Practices

When to check: Run entitlement as early as possible, ideally during your splash screen or initial loading. Don't let users access premium features until verification completes.

Error handling: The implementation includes automatic retry logic (3 attempts with 2-second delays), but you should also show user-friendly error messages and provide a manual retry option if all attempts fail.

Security: Never trust client-side verification alone. Always use the `OculusNonce` to verify user identity on your backend server for critical features. This prevents tampering and ensures real security.
Performance: The async/await pattern keeps everything non-blocking, so your game stays responsive during the verification process.

Common Issues and Solutions

Entitlement always fails? Make sure your app is properly configured in the Meta Developer Dashboard, and test on a device that has actually purchased the app. Network issues can also cause failures.

Platform not initializing? Verify the Oculus Platform SDK is properly imported and check your AndroidManifest.xml for required permissions. Also ensure you're testing on actual Quest hardware.

User data not retrieved? The user needs to be logged into their Oculus account, and privacy settings might be blocking access. Check both the device settings and ensure you're using a compatible SDK version.

Quick Integration Example

The basic pattern for using this in your game is to call `Initialize()` on the `MetaStoreManager` at startup, subscribe to its completion event, and gate your game flow on the result.

Conclusion

Meta Store entitlement isn't optional: it's a requirement for protecting your VR application. The implementation we've covered gives you:

- ✅ Robust verification with automatic retry logic
- ✅ Complete user data retrieval for personalization
- ✅ Event-based architecture that keeps your code clean
- ✅ Production-ready error handling

Remember to test on actual Quest hardware, verify your app configuration in the Meta Developer Dashboard, and always implement server-side verification using the `OculusNonce` for critical features.

This system provides a solid foundation that protects your app while keeping the user experience smooth. The retry logic handles network hiccups, and the event system keeps everything decoupled and maintainable. Let me know if you need the source code.

Additional Resources

Meta's Official Entitlement Check Documentation

*This blog post is based on a production implementation. Always refer to the latest Meta documentation for the most up-to-date information and best practices.*

How to Polish your XR Title: Onboarding, Optimization and Game Feel
Hi guys, I recently ran a workshop for the Meta Horizon Start Competition focused on polishing an XR title. At that stage of the hackathon, with only a week left before submission, one question mattered more than any other: how do you make the biggest impact in the least amount of time?

This post is a written summary of that workshop, aimed at answering this question around three topics:

Player onboarding
Optimization
Game feel

But why these three? Because when you are close to shipping, it becomes critical to look at your application through the eyes of a player, not a developer. For example:

The build can behave very differently on device compared to the editor, which makes optimization and debugging essential.
If players do not understand what to do, onboarding needs improvement.
If it works but feels flat or unsatisfying, game feel needs polish.

Together, these areas tend to give the highest return on investment when time is limited. Let's start with player onboarding, using the Meta All in One SDK.

Player Onboarding

Player onboarding is how you teach players to understand and interact with your game. A useful principle to keep in mind is: the best tutorial is no tutorial. Ideally, players learn by doing. You introduce interactions one at a time, following the player's natural progression through the experience. That said, refining this approach requires time and playtesting, which is often limited near the end of a project. When time is short, these faster onboarding tools can help communicate essential information clearly.

Tooltips
Short, spatialized text pointing at an object or interaction. Best used to highlight a short action.

UI Panels
The most basic solution, using Unity UI to display text, images, or short videos.

Ghost Hands
Very effective for hand tracking. You can visually demonstrate a hand pose or motion.
Tip: With hand tracking, you can enter Play mode, make the pose you want with your own hand, then drag the hand model from the hierarchy to the project window to save it as a prefab that you can reuse and that will directly hold the pose you want to show.

Voice Prompts
Recording short voice lines can be faster than writing UI and often feels more natural. Keep them short and contextual.

I provide a Unity package with a sample scene demonstrating each of these approaches here: https://drive.google.com/file/d/1n3IUzLMH6_60foStzgSdR5WignTkWUFh/view?usp=sharing

Here are three more guidelines about onboarding:

Optimization

Optimization is crucial. It is not only about meeting Meta VRC requirements so your app can ship, but also about player comfort and enjoyment. Below is a simple optimization workflow that works well even for beginners and usually resolves most common issues.

Step 1: Check your project settings

Many performance problems come from incorrect project setup.

Use the Project Setup Tool from the Meta SDK
Ensure Single Pass Instanced is enabled
Review URP settings and quality levels
Bake your lights whenever possible

For a full setup checklist, see this video: https://youtu.be/BeB9Cx_msKA?si=AjfnZdoPxH3jPxk-

Project Validation Tool in Action

Step 2: Check your scene complexity

Triangle budgets vary depending on your content, but a good general target is around 150,000 triangles visible at once. Draw calls are just as important, and often more critical, so keep them as low as possible (under 80).

Ways to reduce complexity:

Lower triangle count per model
Use frustum and occlusion culling
Add LODs to complex meshes
Use batching techniques where possible

For more information about the target FPS, draw calls, and triangle count, check out this Meta documentation page: https://developers.meta.com/horizon/documentation/unity/unity-perf

Step 3: Use OVR Metrics

Install the OVR Metrics Tool and always test outside the Unity editor.
OVR Metrics gives you real performance data on device, with live graphs and indicators. Here are the three most important metrics to track:

FPS
GPU and CPU usage
Render Scale (must stay above 0.85 to meet VRC requirements)

You can download it here: https://www.meta.com/en-gb/experiences/ovr-metrics-tool/2372625889463779/

Play your build and look for frame drops, reduced render scale, or sustained high GPU or CPU usage.

Step 4: Find and remove your bottleneck

The previous step tells you when and where performance drops occur. Now you need to understand why. Use the right tools:

Unity Profiler
Identify CPU bottlenecks, scripts taking too long, physics spikes, or garbage collection issues.

Frame Debugger
Analyze draw calls and rendering passes to understand what is actually being rendered each frame.

Meta Runtime Optimizer
Helps identify XR-specific performance issues related to the runtime and rendering pipeline.

Once you know the bottleneck, you can make targeted changes instead of guessing.

Step 5: Last-resort performance boosts

When time is very limited, two features can significantly improve performance with minimal effort:

Dynamic Resolution
Fixed Foveated Rendering

Both offer configurable levels to balance performance and visual quality. Be careful when lowering values too aggressively; visual quality can degrade quickly. Always remember that Render Scale must stay above 85 percent to pass VRC.

Fixed Foveated Rendering Applied to Eye Texture

Game Feel

Game feel is often underestimated, but small improvements here can dramatically improve how polished your XR experience feels. Moreover, responsiveness also helps guide the player to understand your game and supports onboarding.

Haptic feedback

Haptics add physical feedback to interactions and are extremely effective in XR. Triggering haptics is often just a single line of code. If you want more advanced effects, you can build layered patterns using a haptic tool or studio.
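To make the "single line of code" claim concrete, here is a minimal sketch using `OVRInput.SetControllerVibration` from the Meta XR SDK. The component name, event hookup, and the frequency/amplitude/duration values are illustrative choices, not tuned recommendations:

```csharp
using System.Collections;
using UnityEngine;

// Minimal haptic pulse example. Attach to any object and wire PlayGrabPulse
// to a grab or interaction event (e.g. an interactable's UnityEvent).
public class GrabHaptics : MonoBehaviour
{
    public void PlayGrabPulse()
    {
        StartCoroutine(Pulse(0.1f));
    }

    private IEnumerator Pulse(float duration)
    {
        // The one-line call: frequency and amplitude are in the [0, 1] range.
        OVRInput.SetControllerVibration(0.5f, 0.8f, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(duration);
        // Vibration persists until cleared, so stop it explicitly.
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

For layered or designed patterns, the Haptics Studio / haptic clip workflow mentioned above is the better fit; this sketch only covers the quick win.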
Tutorial: https://youtu.be/RUUwWMkXFt0?si=1L92-NwIL9xy4CfJ

Grab poses

Hand grab poses let you enforce a custom hand pose when grabbing an object. The pose can vary depending on how or where the object is grabbed. For example, a cup grabbed by the body uses a different pose than one grabbed by the handle. This small detail greatly improves realism and comfort.

Documentation: https://developers.meta.com/horizon/documentation/unity/unity-isdk-creating-handgrab-poses/

Sound design

Sound is often forgotten, yet incredibly important.

Add sounds to every interaction
Use pitch variation for natural randomness
Adjust volume based on the intensity of the action

The Meta All in One SDK already includes UI-focused audio, but Meta also provides a free audio pack with more general sounds: https://developers.meta.com/horizon/downloads/package/oculus-audio-pack-1/

Outro

So here it is guys, these are the most impactful steps you can take to polish an XR title when time is limited. I hope this breakdown is helpful. Feel free to share your own tips and tricks below on how you polish your XR titles.

Runtime Optimizer Part 3 - AI-Assisted Performance Optimization on Meta Quest | Performance Series
Who says performance optimization has to live behind expert-only tooling? In this Start Performance Series session, Meta software engineers Jay Hsia and Nico Lopez show you how the Meta Quest Runtime Optimizer pairs with Perfetto and large language models (LLMs) to turn dense performance data into clear, actionable insights. You’ll also see how Unity developers can use the Runtime Optimizer for GPU analysis, then use Perfetto traces with an LLM to surface bottlenecks and generate specific fixes you can test right away. 💡 By viewing this session, you’ll learn: How to use Runtime Optimizer to profile GPU performance in Unity, including deep captures and “what-if” analysis How to capture rich Perfetto traces and use them for system-level performance profiling How to use an LLM to turn trace data into plain-English bottlenecks and concrete fix ideas How to validate improvements with a repeatable before-and-after trace comparison loop Recorded on December 4, 2025 as part of the Meta Horizon Start program. 
🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - What the Runtime Optimizer does and why it matters

⚙️ RUNTIME OPTIMIZER UPDATES
🕒 03:21 - What’s new in Runtime Optimizer 0.2.2
🕒 03:35 - Hierarchy view for GPU cost analysis
🕒 04:14 - Batch profiling game objects

🤖 AI MEETS PERFORMANCE PROFILING
🕒 04:39 - Why Perfetto changes the optimization workflow
🕒 05:32 - Where AI fits into performance analysis
🕒 06:02 - What an MCP adds to LLM reliability

🧪 LIVE DEMO: PERFETTO AND LLM ANALYSIS
🕒 08:42 - Capturing a Perfetto trace
🕒 09:56 - Reading frame breakdowns with an LLM
🕒 10:38 - Detecting anomalies and GPU bottlenecks

🛠 FROM INSIGHT TO FIX
🕒 11:15 - Turning analysis into actionable changes
🕒 12:35 - Applying suggested code changes
🕒 13:02 - Tuning Unity settings with performance context

🔁 THE NORTH STAR OPTIMIZATION LOOP
🕒 16:05 - Runtime Optimizer, Perfetto, and Immersive Debugger together
🕒 17:27 - Running before and after trace comparisons
🕒 18:50 - Measuring real improvements

✅ BEST PRACTICES AND TAKEAWAYS
🕒 19:28 - How to get reliable results from AI-assisted profiling
🕒 20:06 - Why profiling markers matter
🕒 21:00 - Managing tokens and context for better AI results
🕒 22:15 - Applying AI across your profiling toolchain

🧰 TOOLS REFERENCED
Perfetto: https://perfetto.dev/
Unity Profiler: https://docs.unity3d.com/Manual/Profiler.html
Cursor: https://cursor.com/
RenderDoc: https://renderdoc.org/

📚 RESOURCES
➡️ Fix Performance Bottlenecks with Meta Quest Runtime Optimizer: https://communityforums.atmeta.com/discussions/Community_Resources/fix-performance-bottlenecks-with-the-meta-quest-runtime-optimizer--performance-s/1356116
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides
intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Passthrough Camera Access (PCA) + AI Building Blocks on Quest 3 | Developer Workshop
With Passthrough Camera Access (PCA), your Quest 3 app reads the real-world camera feed as GPU textures you can use in Unity for mixed reality. In this session, the Meta XR team walks through the new PCA component in the Meta XR SDK and its privacy-first design that replaces the internal Medici prototype. You’ll see how PCA connects with AI Building Blocks for object detection and guidance, and how to use these tools in your submission to the 2025 Meta Horizon Start Developer Competition.

💡 By viewing this session, you’ll learn how to:

Set up the PassthroughCameraAccess component in Unity and choose the right permissions and resolution for your use case.
Map 2D coordinates from the camera to 3D rays with depth so virtual content stays locked to real surfaces.
Use AI Building Blocks for on-device object detection and LLM prompts that respond to what the camera sees.
Plan a PCA-powered prototype that meets the requirements of the Meta Horizon Start Developer Competition.

🎬 CHAPTERS

👋 INTRODUCTION & OVERVIEW
🕒 00:00 - Introduction and Agenda
🕒 00:52 - What Is Passthrough Camera Access (PCA)?
🕒 03:20 - Inspiring Use Cases: Industry & Community Projects

🛠️ TECHNICAL IMPLEMENTATION
🕒 07:29 - Technical Deep Dive: The New PCA Component in SDK v81
🕒 12:10 - New & Upcoming PCA Features
🕒 15:52 - Mastering Coordinate Systems: 2D to 3D and 3D to 2D

🤖 AI INTEGRATION & OPTIMIZATION
🕒 19:47 - Introducing the AI Building Blocks
🕒 22:51 - Deep Dive: The Object Detection Building Block

✅ Q&A AND BEST PRACTICES
🕒 24:46 - Competition Details & Q&A Kick-off
🕒 28:14 - Q&A: Handling Latency and Inspiring Use Cases
🕒 40:44 - Q&A: PCA vs.
Scene API & Visual Design Best Practices
🕒 45:46 - Q&A: Offline Voice Recognition & Final Remarks

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Runtime Optimizer Part 2 - GPU Textures and Mixed Reality | Performance Series
In Part 2 of our technical deep dive, Meta software engineers Jay Hsia and Nico Lopez move from high-level profiling into real GPU problem solving for mixed reality on Meta Quest. You’ll see how they use the Runtime Optimizer to study Passthrough projects and large texture collections, then turn that data into shader changes and day-to-day profiling routines that keep performance predictable.

💡 After viewing this session, you’ll understand how to:

Configure Runtime Optimizer experiments for Passthrough and mixed reality projects.
Choose between Texture 2D Array and atlases for large texture sets.
Stream high resolution textures in smaller chunks.
Move UV work into the vertex shader and add profiler markers.

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Welcome & Agenda Overview

🛠️ DEBUGGING & MIXED REALITY
🕒 03:19 - Profiling Passthrough & Mixed Reality
🕒 06:24 - Texture Arrays vs. Texture Atlases

🎨 SHADER & CODE OPTIMIZATION
🕒 12:52 - Fixing Texture Fetch Stalls (Vertex Shaders)
🕒 16:52 - Best Practices for Profiler Markers

✅ FINAL THOUGHTS
🕒 20:20 - Recap & Next Steps

📖 OPTIMIZATION EXAMPLES REFERENCED IN THIS VIDEO
🔖 Meta Quest Runtime Optimizer Docs: https://developers.meta.com/horizon/documentation/unity/unity-quest-runtime-optimizer/
🔖 Unity Profiler Documentation: https://docs.unity3d.com/Manual/Profiler.html

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS.
Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Fix Performance Bottlenecks with the Meta Quest Runtime Optimizer | Performance Series
In this technical deep dive, Meta software engineers Jay Hsia and Nico Lopez walk you through the complete workflow for identifying and resolving latency issues in real time using the Meta Quest Runtime Optimizer. If you’re looking to optimize your build, this session is an essential resource for fixing performance bottlenecks prior to launch.

💡 After viewing this session, you’ll understand how to:

Enable and configure the Runtime Optimizer overlay in Unity
Interpret real-time metrics to distinguish between CPU and GPU bottlenecks
Apply actionable insights to reduce draw calls and texture overhead

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Introduction to the Performance Series & Runtime Optimizer
🕒 01:52 - The Optimization Profiling Funnel

🛠️ RUNTIME OPTIMIZER SETUP
🕒 05:31 - Core Concepts of the Runtime Optimizer

📉 ANALYZING METRICS
🕒 09:32 - Bottleneck Analysis: Finding What's Expensive
🕒 15:27 - What-If Analysis: Quantifying Performance Costs

🛠️ LIVE DEMO & WORKFLOW
🕒 20:51 - Live Demo: Putting the Tool into Practice
🕒 26:12 - Final Recap and Recommended Workflow

📖 OPTIMIZATION EXAMPLES REFERENCED IN THIS VIDEO
🔖 Meta Quest Runtime Optimizer Docs: https://developers.meta.com/horizon/documentation/unity/unity-quest-runtime-optimizer/
🔖 Unity Profiler Documentation: https://docs.unity3d.com/Manual/Profiler.html

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS.
Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Build Intuitive Experiences with Hands & Microgestures
You can build powerful immersive experiences with Hand Tracking using the Meta Interaction SDK. Watch as Meta Engineering Manager Jesse Keogh shows you the essential steps to go from a blank Unity project to a fully interactive prototype with UI and 3D object interactions. You’ll walk away with production-ready techniques for Quest-specific optimization and system-backed gestures like Pinch and Microgestures that deliver reliable input across all users.

💡 After viewing this session, you’ll understand how to:

Configure the Interaction SDK using Quick Actions
Implement system-backed gestures like Pinch
Streamline project creation with the Meta Quest Developer Hub
Test interactions using the XR Simulator

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Introduction to the Hands Workshop
🕒 02:41 - Overview of Hand Tracking System Features

⚙️ CORE CONCEPTS
🕒 05:35 - System-Backed Gestures & Interaction SDK
🕒 08:25 - Tour of Interaction SDK Capabilities
🕒 11:48 - Design Principles & Best Practices

🛠️ WORKFLOW & DEMO
🕒 14:23 - Project Setup with Meta Quest Developer Hub
🕒 20:39 - Live Demo: Building UI Interactions
🕒 34:02 - Live Demo: Adding Grabbable 3D Objects

✅ FINAL THOUGHTS
🕒 37:15 - Q&A: Advanced Use Cases & Debugging

📖 HAND TRACKING EXAMPLES REFERENCED IN THIS VIDEO
🔖 Interaction SDK Overview: https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/
🔖 Hand Tracking Design Guidelines: https://developers.meta.com/horizon/design/hands/

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development.
Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
How to Understand your Player and their Space: Context-Aware MR
Hey folks! Last week I ran a small workshop about Context-Aware MR on how to understand your player and their space. I wanted to drop a summary here, and I’m looking forward to your tips, ideas, or comments. 😊

But here is the big question: what are the tools we can use to really “understand” a player? Here are the four musketeers of Context-Aware MR: Player Input, MRUK, Depth API, and PCA.

1) Player Input

You might think player input is a bit trivial, but there’s actually a lot we can extract from it to understand our player's behaviour. Of course, we get the basics: the position and rotation of the head and controllers, plus input from buttons or triggers. But that’s just the beginning.

With features like hand tracking and voice recognition, we unlock much more. Hand tracking lets us detect custom hand poses or gestures. Voice recognition allows for voice commands and even detecting voice loudness, which can be used to animate a character’s mouth or trigger actions like blowing out a candle. By combining head and controller tracking, we can figure out not only where the player is looking but also estimate their overall body pose. These are just a few examples, but the creative possibilities are huge.

While these features also apply to standard VR, let’s now move to tools that are specific to Mixed Reality, starting with our next musketeer: MRUK!

2) MRUK

To understand MRUK, I first need to explain what Scene Data is. Scene Data is an approximation of the player's environment, set up outside of your app through the Meta system. It gives you access to either a full mesh or a simplified version of the room using labeled boxes that let you identify whether an element is a wall, door, floor, or piece of furniture.

The Mixed Reality Utility Kit (MRUK) is a powerful set of tools built on top of Scene Data. It helps you place, align, and make virtual content interact with the real world.
Here are some examples of what MRUK enables:

Smart spawn points on specific surfaces (like the floor or walls) while avoiding obstacles (like furniture)
Collision for your virtual content
Navmesh to move objects around the player’s space without bumping into real-world elements
Destructible scene mesh effects
Dynamic lighting effects on the real world
QR code and keyboard tracking
And more...

While MRUK is incredibly useful, keep in mind that Scene Data doesn’t update in real time. That’s where the Depth API comes in.

3) Depth API

The Depth API gives you real-time depth maps of what the user is currently seeing. This allows you to occlude virtual objects behind real-world elements, making them feel like a natural part of the environment and greatly increasing immersion. It also comes with a Depth Raycast Manager, which lets you detect collisions at runtime with real objects, perfect for dynamic content placement or interactions. It’s a great complement to the Scene Model, filling in the gaps that static scene data can’t cover. Despite its potential, it's still underused in many XR projects.

4) Passthrough Camera Access (PCA)

We’ve had the first three tools for a while now. But recently, a game-changing feature was introduced: access to the passthrough camera! With access to the camera, you can:

Read the live image as a texture to do color picking, light estimation, or apply visual effects like blur
Feed the image to AI models for computer vision tasks like object detection

It opens a direct bridge between the real world and AI, and that's huge for MR development. Good news: starting with version v83, new building blocks are available to help you set up PCA easily in your project.

To Conclude

Player Input, MRUK, Depth API, and Passthrough Camera Access form a powerful toolbox for building context-aware MR experiences. And now, with tools like PCA, creativity is more accessible than ever.
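As a small closing illustration of the color-picking idea from the PCA section, here is a hedged sketch that reads the live camera image as a texture using Unity's `WebCamTexture`. It assumes the passthrough feed is exposed as a webcam device once the camera permission is granted (the approach used in Meta's PCA Unity samples); device selection and permission handling are simplified:

```csharp
using UnityEngine;

// Sketch only: samples a pixel from the live passthrough feed, e.g. to tint
// virtual content or estimate ambient light. Device names and permission
// flow are platform-specific and not handled here.
public class PassthroughColorPicker : MonoBehaviour
{
    private WebCamTexture _cameraFeed;

    private void Start()
    {
        // With PCA enabled, the passthrough cameras appear in
        // WebCamTexture.devices once camera permission is granted.
        if (WebCamTexture.devices.Length == 0) return;
        _cameraFeed = new WebCamTexture(WebCamTexture.devices[0].name);
        _cameraFeed.Play();
    }

    // Sample the color at the center of the camera image.
    public Color SampleCenterColor()
    {
        if (_cameraFeed == null || !_cameraFeed.isPlaying) return Color.black;
        return _cameraFeed.GetPixel(_cameraFeed.width / 2, _cameraFeed.height / 2);
    }
}
```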
We can finally build apps and games that truly adapt to each user and their real space. I hope you enjoyed this little summary and that you learned something new along the way. Check out the different links provided in the post if you want to learn more about our four musketeers, and if you have tips on how you use these features in your app, share them down below! 😊

Useful links:

https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
https://developers.meta.com/horizon/documentation/unity/unity-mr-utility-kit-overview
https://developers.meta.com/horizon/documentation/unity/unity-depthapi-overview
https://developers.meta.com/horizon/documentation/unity/unity-pca-overview

Have a nice day! 👋

The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don’t just showcase design principles around specific features but also provide direct examples of how to use them, which is especially important right now for something like a hackathon.

For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let’s start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links:
https://github.com/oculus-samples/Unity-InteractionSDK-Samples
https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta’s XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity’s official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.
Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google's Tilt Brush. Lets users paint and sculpt in 3D space — an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity's official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.
Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

A World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes blending real and virtual spaces.

Phanto (MR Reference App)
Links: https://github.com/oculus-samples/Unity-Phanto
https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links: https://www.meta.com/en-gb/experiences/discover/7041851792509764/
https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR "motifs": small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room's layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest's Passthrough camera feed for effects, object detection, and image manipulation.
Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems — ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile — an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.

VR Optimization in Unity - A Complete List of Tools and Resources
Hey guys, I wanted to put together a list of tools and resources for optimizing a VR game in Unity. As the process can be tricky (profiling, debugging, etc.) and there are a lot of different tools, I hope this helps show the different use cases and workflows. This list is a work in progress, so if you want to add your favorites or ask questions, please do it here. :)

OVERVIEW OF OPTIMIZATION TOOLS

In-Editor

Stats Window
Documentation: https://docs.unity3d.com/Manual/RenderingStatistics.html
Shows real-time rendering statistics (triangles, draw calls, batches) in the Game View to quickly spot bottlenecks.

Profiler
Documentation: https://docs.unity3d.com/Manual/Profiler.html
Records CPU, GPU, memory, and other subsystem usage to identify hotspot frames or assets.

Project Validation and Project Setup Tool
Documentation: https://developers.meta.com/horizon/documentation/unity/unity-upst-overview/
Tests a registry of rules called Configuration Tasks (XR setup and optimization) and provides default rules to make your project Meta Quest ready.

Quest Runtime Optimizer
https://developers.meta.com/horizon/documentation/unity/unity-quest-runtime-optimizer/
Provides real-time analysis and actionable insights to optimize performance on device.

Unity Project Auditor
https://docs.unity3d.com/Packages/com.unity.project-auditor@1.0/manual/index.html
Audits your Unity project for performance and best-practice compliance.

Frame Debugger
Documentation: https://docs.unity3d.com/Manual/FrameDebugger.html
Step through draw calls and frame rendering to find overdraw or inefficient passes.

Auto VR Optimizer
https://assetstore.unity.com/packages/tools/utilities/auto-vr-optimizer-318687
An automated utility to apply common VR optimization settings to your Unity project.

Memory Profiler and Profile Analyzer
Documentation: https://unity.com/how-to/use-memory-profiling-unity
Capture memory snapshots, compare them, and analyze fragmentation or leaks.
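A small tip on getting readable captures out of the Profiler: you can wrap suspect code in named samples so it shows up as its own row in the CPU Usage view instead of being buried in a generic Update entry. Here's a minimal sketch (the component and method names are hypothetical, just for illustration; it assumes a standard Unity project with `UnityEngine.Profiling` available):

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Hypothetical example component: wraps an expensive per-frame routine
// in a named Profiler sample so it appears as its own labeled entry
// in the Profiler's CPU Usage timeline.
public class EnemyAIProfiling : MonoBehaviour
{
    void Update()
    {
        Profiler.BeginSample("EnemyAI.Pathfinding"); // label shown in the Profiler window
        RecomputePaths();                            // placeholder for the code you suspect
        Profiler.EndSample();                        // must pair with BeginSample
    }

    void RecomputePaths()
    {
        // ... expensive work you want to measure ...
    }
}
```

BeginSample/EndSample calls are conditionally compiled, so they only run in development builds and the Editor, meaning you can leave them in without worrying about release performance.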
Outside Editor

OVR Metrics Tool
https://developers.meta.com/horizon/downloads/package/ovr-metrics-tool/
Capture performance metrics on device, including CPU, GPU, and memory.

Meta Quest Developer Hub
https://developers.meta.com/horizon/downloads/package/oculus-developer-hub-win/
Deploy to device, profile, view logs, and measure thermal and throttling behavior.

RenderDoc
Documentation: https://developers.meta.com/horizon/documentation/unity/ts-renderdoc-for-oculus/
Graphics frame capture and inspection of draw calls, states, and textures.

Perfetto
Guide: https://developers.meta.com/horizon/documentation/unity/ts-perfettoguide
Open-source system tracing for CPU, GPU, and OS events.

Immersive Debugger
https://developers.meta.com/horizon/documentation/unity/immersivedebugger-overview/
Debug your app from inside the headset, inspecting values and state at runtime without removing it.

Resources

Profiling and debugging tools in Unity
https://unity.com/how-to/profiling-and-debugging-tools

Ultimate Guide to Profiling Unity Games (e-book)
https://unity.com/resources/ultimate-guide-to-profiling-unity-games-unity-6

Meta XR Performance Best Practices
https://developers.meta.com/horizon/documentation/unity/unity-best-practices-intro

Optimization for web, XR, and mobile games in Unity 6 (YouTube)
https://youtu.be/2J0kDtUGlrY?si=DfiogFWQronhdowQ

Unity Optimization E-book (Unity 6 edition)
https://unity.com/resources/mobile-xr-web-game-performance-optimization-unity-6

Valem Tutorials
How to Optimize VR Game Part 1: https://www.youtube.com/watch?v=BeB9Cx_msKA
How to Optimize VR Game Part 2: https://www.youtube.com/watch?v=Jgf3F--VoPg
How to Optimize VR Game Part 3: https://youtu.be/qm4-6zHkanM

Hope this helps. :) What do you think? Do you have other tools or resources that should be on this list?