Retention in Social VR: Why Most Indie VR Games Struggle (And How to Fix It)
So, you built your game. It is fun. Players show up every day, but most of them do not come back.

When we, indie VR developers, talk about growth, we usually talk about:

* TikTok / YouTube
* Store visibility
* Influencers
* Launch day spikes

But here's the uncomfortable truth:

Growth without retention is just expensive churn.

I had the same issue, and over time I realized something important: retention in social VR is not a metric. It is a campfire.

Retention Is a Campfire 🔥

Think about a campfire. People gather around it because:

* It is warm
* It is safe
* It is alive
* It gives them a reason to stay

If you stop feeding it, it dies. If there is no structure around it, it spreads and burns out. If it is too small, nobody gathers. Retention works the same way.

Social VR Is Not Just a Game — It's a Place

In traditional games, players chase wins. In social VR, players chase:

* Belonging
* Visibility
* Identity
* Status

They are not looking for "levels." They are looking for warmth. Retention in VR is driven by social capital, not just progression.

The Retention Triangle (How You Feed the Fire)

Over time, I simplified retention into three pillars. These are the logs you place into the fire.

1️⃣ Direction (Quests & Challenges)

When a player logs in, they should immediately know: "What do I do next?" If your spawn area says nothing, the fire weakens. Direction means:

* Clear daily goals
* Short-term progress
* Visible next steps

Daily challenges are not "gamification." They are fuel. Without small logs, the fire fades.

2️⃣ Identity (Badges & Visibility)

Badges should not just track progress. They should broadcast who the player is. In social VR:

* Cosmetic visibility > invisible XP
* Titles > hidden levels
* Social proof > private stats

When players can show who they are, they protect their place in the circle. Identity makes the fire meaningful. Without identity, it's just heat.

3️⃣ Rhythm (Live Ops & Weekly Anchors)

Retention dies when time feels flat. A strong campfire needs regular fuel.
Strong social VR games create rhythm:

* Daily resets
* Weekly rotations
* Limited-time cosmetics
* Community events

Rhythm keeps the fire alive. A game without rhythm feels abandoned — even if it isn't.

The Stone Circle (Systems & Moderation)

A campfire without stones spreads. A community without structure collapses. Some of the biggest mistakes indie VR developers make:

* Moderators added too late
* Events built too late
* Systems implemented after chaos

Unmoderated growth = chaos. Chaos = churn. Systems protect retention.

The Social Layer Multiplier

Here is where it gets powerful. When players see:

* Creator tags
* Event hosts
* Ranked titles
* Rare cosmetic holders

They don't just play. They aspire to sit closer to the fire. Aspiration strengthens the circle. And that is when retention becomes natural.

What I Changed in My Own Game

In my own social VR game, I realized that fun mechanics were not enough. So I focused on strengthening the fire:

* Visible daily challenge boards
* Public tech tree branches
* Social nameplate titles
* Weekly mode rotations
* Creator spotlight systems

Not more content, but more fuel.

Retention Is a Design Philosophy

You don't fix retention with a patch. You design your world like a place people want to gather around. You build:

* Direction
* Identity
* Rhythm

When those three align, players don't just visit your game. They sit down. They stay. They return.

Final Thought

When a player logs out today, does your campfire still feel warm? Or does it go dark? If the answer is not obvious, your retention system isn't strong enough yet.

— Tevfik Ufuk Demirbaş, VR Entrepreneur & Developer & Start Mentor

Building the Foundation of a Social VR Game — Movement & Multiplayer in Baby VR Episode 2
So, I’m building Baby VR — a social VR game that I’m developing with the community on YouTube. In the first episode, I talked about where the idea came from and what kind of social sandbox I want Baby VR to become.

But before we touch things like game modes, cosmetics, progression, or monetization, there’s something much more important we need to build first: the foundation. Movement and multiplayer. If these two systems aren’t solid, nothing else in a social VR game really matters.

So in this post, I’ll walk you through how I’m structuring Baby VR’s technical core — from modular architecture to Gorilla Tag–style locomotion and real-time networking using Photon Fusion. Let’s dive in.

WHY FOUNDATION COMES FIRST

In social VR, players don’t just control a character — they are the character. Every movement, every hand gesture, every head turn becomes part of how other players perceive you. That means two things need to feel perfect from day one:

* Locomotion — how you move through the world
* Networking — how other players see you move in real time

Everything else — game modes, progression, stores, and live events — is built on top of these systems.

THE MODULAR ARCHITECTURE (FLAMECORE)

Baby VR is built using a modular system I call FlameCore. Instead of one giant, tangled codebase, the game is split into independent modules, where each one does a single job. This makes the project:

* Easier to build
* Easier to maintain
* Easier to scale over time

Think of it like LEGO blocks. You can swap, upgrade, or rebuild parts without breaking the whole system.
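To make the LEGO-block idea concrete, here is a tiny hedged sketch of what such a module seam could look like. None of these names (IGameModule, ModuleHost) are from Baby VR's actual FlameCore code; they just illustrate the pattern of independent modules behind a common interface.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of a modular setup in the spirit of FlameCore.
public interface IGameModule
{
    void Initialize();
    void Shutdown();
}

public class ModuleHost
{
    private readonly List<IGameModule> modules = new List<IGameModule>();

    public void Register(IGameModule module) => modules.Add(module);

    public void Boot()
    {
        // Each module does a single job and can be swapped or rebuilt
        // without touching the others.
        foreach (var module in modules)
            module.Initialize();
    }
}
```

A networking or live-ops module would then be just another `IGameModule` implementation registered with the host.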
THE 5 CORE MODULES

Here’s the high-level structure:

* Core Module: Locomotion, VR interactions, player rigs
* Networking Module: Multiplayer sync, player authority, sessions
* Gameplay Module: Game modes, rules, sandbox logic
* Backend Module: Economy, store, platform services
* LiveOps Module: Challenges, analytics, moderation, live events

Social VR isn’t just a “game” — it’s a live service, which is why LiveOps is treated as a first-class system, not an afterthought.

LOCOMOTION — MOVING LIKE A PLAYER, NOT A CONTROLLER

Inside the Core Module, Baby VR uses Gorilla Tag–style arm locomotion. This system is based on an open-source project from the creator of Gorilla Tag, and it changes how players relate to the world:

* You don’t press a joystick to move
* You push against the environment
* You climb, swing, and launch yourself physically

This creates what I like to call somatic progression — players don’t “level up” their character. They level up their own coordination and skill. For a social sandbox, this kind of movement makes every interaction feel more personal and more expressive.

MULTIPLAYER — HOW SOCIAL VR ACTUALLY WORKS

Every player in Baby VR exists as two representations:

HardwareRig (Local Player)

This is the version of you that lives only on your computer. It reads directly from your VR hardware: headset position, controller positions, and hand movement. Its job is simple: read reality.

NetworkRig (Online Avatar)

This is the version of you that everyone else sees. It syncs your movement across the network and represents you inside the multiplayer session. Its job is also simple: show reality to others.

THE DATA FLOW

Here’s the full loop:

* You join a Photon Session (the game room)
* Your HardwareRig reads your VR hardware
* That data is packaged into custom network data
* Your NetworkRig sends it to the session
* Photon Cloud broadcasts it to all players
* Other clients update your avatar in real time

The key concept here is State Authority.
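The rig split and authority model above can be sketched in Photon Fusion terms. This is a hedged illustration, not Baby VR's actual code: the class and field names are hypothetical, and it assumes Fusion's `NetworkBehaviour`, `[Networked]` properties, and state-authority model.

```csharp
using Fusion;
using UnityEngine;

// Hypothetical sketch of the HardwareRig -> NetworkRig pipeline.
public class NetworkRigSketch : NetworkBehaviour
{
    [SerializeField] private Transform hardwareHead; // local HMD (HardwareRig)
    [SerializeField] private Transform avatarHead;   // visible avatar head (NetworkRig)

    // Replicated state: only the state authority writes these.
    [Networked] private Vector3 HeadPosition { get; set; }
    [Networked] private Quaternion HeadRotation { get; set; }

    public override void FixedUpdateNetwork()
    {
        if (Object.HasStateAuthority)
        {
            // Local player only: "read reality" from the hardware rig.
            HeadPosition = hardwareHead.position;
            HeadRotation = hardwareHead.rotation;
        }
        // Remote copies of this rig are read-only here: they never write.
    }

    public override void Render()
    {
        // Every client: "show reality" by applying the replicated state.
        avatarHead.SetPositionAndRotation(HeadPosition, HeadRotation);
    }
}
```

Hands and controllers would follow the same pattern — as the post says, smaller versions of the same pipeline.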
You only control your own NetworkRig. Other players’ NetworkRigs are read-only on your machine. This prevents conflicts, keeps the simulation stable, and ensures everyone stays in sync. Your head and hands follow the exact same logic — they’re just smaller versions of the same pipeline.

META STORE RELEASE CHANNEL

To make this truly community-driven, I’ve opened a Meta Store Release Channel. Join here. This lets you access early builds, test experimental features, and see Baby VR evolve step by step. If you join, you’re not just a player — you’re an OG tester helping shape the game from the ground up.

CLOSING THOUGHTS

This episode — and this post — is about one thing: understanding how social VR actually works under the hood. Movement and multiplayer aren’t just features. They’re the language players use to express themselves in virtual worlds.

In the next episode, I’ll start building on top of this foundation — VR interactions, sandbox systems, and game modes designed for controlled chaos. If you want your ideas to turn into features, leave a comment, join the builds, and be part of the process. Let’s build Baby VR — together. 👶🚀

-Tevfik

Building a Social VR Game From Scratch Part 1: Entitlement
So, I am building Baby VR, a social VR game that I will build with the community on YouTube. While planning the curriculum, I realized that before working on core things like networking, voice chat, or game mechanics, we first need to integrate Baby VR into the Meta Horizon Store. And that starts with entitlement. So, in this blog post, I will show you how I implemented the entitlement check for the Meta Horizon Store. Let's get started.

Introduction

If you're building a VR app for Meta Quest, you absolutely need to implement entitlement checking. There's no way around it. Without it, anyone could potentially access your app without actually purchasing it from the Meta Quest Store. Think of entitlement as your app's bouncer: it checks if someone actually paid to get in before letting them through the door.

According to Meta's official documentation, entitlement checks are required for apps published on their store, and it's really not optional if you want to protect your work and ensure users have legitimately obtained your application.

In this blog post, I'll walk you through a real-world implementation that handles the edge cases: retry logic, error handling, and proper user data retrieval. Let's dive in.

How It Works: The Complete Flow

Before we get into the code, here's the big picture of how the entitlement process flows. The system consists of a few key components working together:

* MetaStoreManager: The main orchestrator that kicks everything off
* EntitlementHandler: Does the heavy lifting of verification
* Event System: Notifies other parts of your game when entitlement completes
* MetaPlayerData: Stores the user info we retrieve

Step-by-Step Implementation

1. The MetaStoreManager: Your Entry Point

The `MetaStoreManager` is a Unity `MonoBehaviour` that orchestrates everything. It's simple: it initializes the entitlement handler and listens for when the entitlement completes. When you call `Initialize()`, it kicks off the entitlement process.
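The post's original code listings aren't reproduced here, so below is a hedged sketch of what kicking off such a check can look like. It assumes the Oculus Platform SDK (`Core`, `Entitlements` from `Oculus.Platform`) and the retry policy described in this post (3 attempts, 2-second delays); the class and event names are hypothetical, not the author's actual source.

```csharp
using System;
using System.Threading.Tasks;
using UnityEngine;
using Oculus.Platform;

// Illustrative sketch only, not the production implementation from this post.
public class EntitlementHandlerSketch : MonoBehaviour
{
    private const int MaxAttempts = 3;
    private const float RetryDelaySeconds = 2f;

    public event Action<bool> OnEntitlementCompleted; // true = verified

    public async void Initialize()
    {
        Core.AsyncInitialize(); // initialize the Oculus Platform SDK

        for (int attempt = 1; attempt <= MaxAttempts; attempt++)
        {
            if (await CheckUserEntitlement())
            {
                OnEntitlementCompleted?.Invoke(true);
                return;
            }
            await Task.Delay(TimeSpan.FromSeconds(RetryDelaySeconds));
        }

        // All attempts failed: treat the user as not entitled.
        OnEntitlementCompleted?.Invoke(false);
    }

    // Asks Meta's servers whether this user actually purchased the app.
    private Task<bool> CheckUserEntitlement()
    {
        var tcs = new TaskCompletionSource<bool>();
        Entitlements.IsUserEntitledToApplication()
            .OnComplete(message => tcs.SetResult(!message.IsError));
        return tcs.Task;
    }
}
```

Other systems can subscribe to `OnEntitlementCompleted` to gate access to the game once verification finishes.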
Once complete, it stores the player data for use throughout your game.

2. The EntitlementHandler: The Core Logic

This is where the real work happens. The handler performs a four-step verification process with automatic retry logic (up to 3 attempts with 2-second delays between retries). The `CheckEntitlement()` method runs four critical steps in sequence; if any step fails, the whole process fails and retries.

Step 2 is the critical one: `CheckUserEntitlement()` calls `Entitlements.IsUserEntitledToApplication()`, which queries Meta's servers to verify the user actually purchased your app. This is where the piracy protection happens. The other steps retrieve user data (ID, display name, Oculus ID) and generate a cryptographic proof (nonce) that you can use for server-side verification later.

3. The Data Structure

After successful entitlement, you get a `MetaPlayerData` object containing:

public class MetaPlayerData
{
    public string UserId;      // Unique user identifier
    public string UserName;    // Display name
    public string AliasName;   // Oculus ID
    public string OculusNonce; // Cryptographic proof for server verification
}

The `OculusNonce` is particularly important: it's a proof token you can send to your backend server to verify the user's identity securely.

Best Practices

When to check: Run entitlement as early as possible, ideally during your splash screen or initial loading. Don't let users access premium features until verification completes.

Error handling: The implementation includes automatic retry logic (3 attempts with 2-second delays), but you should also show user-friendly error messages and provide a manual retry option if all attempts fail.

Security: Never trust client-side verification alone. Always use the `OculusNonce` to verify user identity on your backend server for critical features. This prevents tampering and ensures real security.
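As a rough illustration of that server-side step, here is a hedged sketch of validating the nonce from a backend. The `user_nonce_validate` endpoint and `is_valid` response field follow Meta's platform documentation at the time of writing, but treat the exact URL, parameters, and response shape as assumptions and verify them against the current docs before relying on this.

```csharp
using System.Net.Http;
using System.Threading.Tasks;

// Server-side sketch: confirm a client-supplied nonce really belongs
// to this user. Runs on your backend, never on the headset.
public static class NonceValidatorSketch
{
    private static readonly HttpClient Http = new HttpClient();

    // appAccessToken has the form "OC|APP_ID|APP_SECRET" and must
    // stay secret on your server.
    public static async Task<bool> ValidateAsync(
        string nonce, string userId, string appAccessToken)
    {
        var response = await Http.PostAsync(
            "https://graph.oculus.com/user_nonce_validate" +
            $"?nonce={nonce}&user_id={userId}&access_token={appAccessToken}",
            content: null);

        string body = await response.Content.ReadAsStringAsync();
        // Meta returns a small JSON object; a real implementation should
        // parse it properly instead of string matching.
        return response.IsSuccessStatusCode && body.Contains("\"is_valid\":true");
    }
}
```

Note that nonces are single-use: generate a fresh one per session rather than caching the result.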
Performance: The async/await pattern keeps everything non-blocking, so your game stays responsive during the verification process.

Common Issues and Solutions

Entitlement always fails? Make sure your app is properly configured in the Meta Developer Dashboard, and test on a device that has actually purchased the app. Network issues can also cause failures.

Platform not initializing? Verify the Oculus Platform SDK is properly imported and check your AndroidManifest.xml for required permissions. Also ensure you're testing on actual Quest hardware.

User data not retrieved? The user needs to be logged into their Oculus account, and privacy settings might be blocking access. Check both the device settings and ensure you're using a compatible SDK version.

Quick Integration Example

Here's the basic pattern for using this in your game:

Conclusion

Meta Store entitlement isn't optional: it's a requirement for protecting your VR application. The implementation we've covered gives you:

- ✅ Robust verification with automatic retry logic
- ✅ Complete user data retrieval for personalization
- ✅ Event-based architecture that keeps your code clean
- ✅ Production-ready error handling

Remember to test on actual Quest hardware, verify your app configuration in the Meta Developer Dashboard, and always implement server-side verification using the `OculusNonce` for critical features. This system provides a solid foundation that protects your app while keeping the user experience smooth. The retry logic handles network hiccups, and the event system keeps everything decoupled and maintainable. Let me know if you need the source code.

Additional Resources

Meta's Official Entitlement Check Documentation

*This blog post is based on a production implementation. Always refer to the latest Meta documentation for the most up-to-date information and best practices.*

How to Polish your XR Title: Onboarding, Optimization and Game Feel
Hi guys, I recently ran a workshop for the Meta Horizon Start Competition focused on polishing an XR title. At that stage of the hackathon, with only a week left before submission, one question mattered more than any other: how do you make the biggest impact in the least amount of time?

This post is a written summary of that workshop, aiming to answer that question around three topics:

* Player onboarding
* Optimization
* Game feel

But why these three? Because when you are close to shipping, it becomes critical to look at your application through the eyes of a player, not a developer. For example:

* The build can behave very differently on device compared to the editor, which makes optimization and debugging essential.
* If players do not understand what to do, onboarding needs improvement.
* If it works but feels flat or unsatisfying, game feel needs polish.

Together, these areas tend to give the highest return on investment when time is limited. Let’s start with player onboarding, using the Meta All in One SDK.

Player Onboarding

Player onboarding is how you teach players to understand and interact with your game. A useful principle to keep in mind is: the best tutorial is no tutorial.

Ideally, players learn by doing. You introduce interactions one at a time, following the player’s natural progression through the experience. That said, refining this approach requires time and playtesting, which is often limited near the end of a project. When time is short, these faster onboarding tools can help communicate essential information clearly.

Tooltips: Short, spatialized text pointing at an object or interaction. Best used to highlight a short action.

UI Panels: The most basic solution, using Unity UI to display text, images, or short videos.

Ghost Hands: Very effective for hand tracking. You can visually demonstrate a hand pose or motion.
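A spatialized tooltip of the kind described above can be as simple as a world-space label that keeps facing the player. Here is a small hedged sketch (the class and field names are hypothetical; it only assumes standard Unity APIs):

```csharp
using UnityEngine;

// Hypothetical spatial tooltip: keeps a world-space label facing the player.
// Attach to the tooltip's root (e.g. a world-space Canvas near the object).
public class TooltipBillboardSketch : MonoBehaviour
{
    [SerializeField] private Transform playerHead; // e.g. the CenterEyeAnchor

    private void LateUpdate()
    {
        // Face the headset, but stay upright so the text doesn't tilt.
        Vector3 toTooltip = transform.position - playerHead.position;
        toTooltip.y = 0f;
        if (toTooltip.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(toTooltip);
    }
}
```

Pair it with a short fade-out once the highlighted action has been performed, so the hint disappears as soon as the player no longer needs it.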
Tip: With hand tracking, you can press Play, make a certain pose with your own hand, then drag the hand model from the Hierarchy to the Project window to save it as a prefab that you can reuse and that will directly hold the pose you want to show.

Voice Prompts: Recording short voice lines can be faster than writing UI and often feels more natural. Keep them short and contextual.

I provide a Unity package with a sample scene demonstrating each of these approaches here: https://drive.google.com/file/d/1n3IUzLMH6_60foStzgSdR5WignTkWUFh/view?usp=sharing

Optimization

Optimization is crucial. It is not only about meeting Meta VRC requirements so your app can ship, but also about player comfort and enjoyment. Below is a simple optimization workflow that works well even for beginners and usually resolves most common issues.

Step 1: Check your project settings

Many performance problems come from incorrect project setup.

* Use the Project Setup Tool from the Meta SDK
* Ensure Single Pass Instanced rendering is enabled
* Review URP settings and quality levels
* Bake your lights whenever possible

For a full setup checklist, see this video: https://youtu.be/BeB9Cx_msKA?si=AjfnZdoPxH3jPxk-

Project Validation Tool in Action

Step 2: Check your scene complexity

Triangle budgets vary depending on your content, but a good general target is around 150,000 triangles visible at once. Draw calls are just as important, and often more critical, so keep them as low as possible (under 80).

Ways to reduce complexity:

* Lower triangle count per model
* Use frustum and occlusion culling
* Add LODs to complex meshes
* Use batching techniques where possible

For more information about the target FPS, draw calls, and triangle count, check out this Meta documentation page: https://developers.meta.com/horizon/documentation/unity/unity-perf

Step 3: Use OVR Metrics

Install OVR Metrics and always test outside the Unity editor.
OVR Metrics gives you real performance data on device, with live graphs and indicators. Here are the three most important metrics to track:

* FPS
* GPU and CPU usage
* Render Scale (must stay above 0.85 to meet VRC requirements)

You can download it here: https://www.meta.com/en-gb/experiences/ovr-metrics-tool/2372625889463779/

Play your build and look for frame drops, reduced render scale, or sustained high GPU or CPU usage.

Step 4: Find and remove your bottleneck

The previous step tells you when and where performance drops occur. Now you need to understand why. Use the right tools:

* Unity Profiler: Identify CPU bottlenecks, scripts taking too long, physics spikes, or garbage collection issues.
* Frame Debugger: Analyze draw calls and rendering passes to understand what is actually being rendered each frame.
* Meta Runtime Optimizer: Helps identify XR-specific performance issues related to the runtime and rendering pipeline.

Once you know the bottleneck, you can make targeted changes instead of guessing.

Step 5: Last-resort performance boosts

When time is very limited, two features can significantly improve performance with minimal effort:

* Dynamic Resolution
* Fixed Foveated Rendering

Both offer configurable levels to balance performance and visual quality. Be careful when lowering values too aggressively: visual quality can degrade quickly. Always remember that Render Scale must stay above 85 percent to pass VRC.

Fixed Foveated Rendering Applied to Eye Texture

Game Feel

Game feel is often underestimated, but small improvements here can dramatically improve how polished your XR experience feels. Responsiveness also helps guide the player to understand your game, which supports onboarding.

Haptic feedback

Haptics add physical feedback to interactions and are extremely effective in XR. Triggering haptics is often just a single line of code. If you want more advanced effects, you can build layered patterns using a haptic tool or studio.
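For controllers, that "single line of code" is typically OVRInput's vibration call from the Meta XR SDK. Here is a small sketch; the coroutine wrapper and class name are illustrative, only `SetControllerVibration` is SDK API.

```csharp
using System.Collections;
using UnityEngine;

// Minimal haptic pulse using the Meta XR SDK's OVRInput API.
public class HapticPulseSketch : MonoBehaviour
{
    // frequency and amplitude are both in the 0..1 range on Quest controllers.
    public void Pulse(OVRInput.Controller controller,
                      float frequency = 0.5f,
                      float amplitude = 0.8f,
                      float duration = 0.1f)
    {
        StartCoroutine(PulseRoutine(controller, frequency, amplitude, duration));
    }

    private IEnumerator PulseRoutine(OVRInput.Controller controller,
                                     float frequency, float amplitude,
                                     float duration)
    {
        // The one-liner mentioned above:
        OVRInput.SetControllerVibration(frequency, amplitude, controller);
        yield return new WaitForSeconds(duration);
        // Vibration keeps running until explicitly stopped, so turn it off.
        OVRInput.SetControllerVibration(0f, 0f, controller);
    }
}
```

A typical use would be `Pulse(OVRInput.Controller.RTouch)` on a successful grab, with higher amplitude for more intense actions.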
Tutorial: https://youtu.be/RUUwWMkXFt0?si=1L92-NwIL9xy4CfJ

Grab poses

Hand grab poses let you enforce a custom hand pose when grabbing an object. The pose can vary depending on how or where the object is grabbed. For example, a cup grabbed by the body uses a different pose than a cup grabbed by the handle. This small detail greatly improves realism and comfort.

Documentation: https://developers.meta.com/horizon/documentation/unity/unity-isdk-creating-handgrab-poses/

Sound design

Sound is often forgotten, yet incredibly important.

* Add sounds to every interaction
* Use pitch variation for natural randomness
* Adjust volume based on the intensity of the action

The Meta All in One SDK already includes UI-focused audio, but Meta also provides a free audio pack with more general sounds: https://developers.meta.com/horizon/downloads/package/oculus-audio-pack-1/

Outro

So here it is guys, these are the most impactful steps you can take to polish an XR title when time is limited. I hope this breakdown is helpful. Feel free to share your own tips and tricks below on how you polish your XR titles.

Build VR Apps Fast with Immersive Web SDK
Discover the Meta Immersive Web SDK, a powerful “batteries-included” framework designed to make immersive development as approachable as traditional web development. In this session, Meta Software Engineers Felix Zhang and Jianmin Zheng are joined by the founder of Drawcall.ai, Bela Bohlender, to deliver a deep dive into the SDK’s features and developer-first workflows. By the end of the session, you will know how to build and test a complete immersive application right in your desktop browser, then carry that same project into a headset when you are ready.

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Intro To Immersive Web SDK And Three Pillars

📈 SETUP AND WORKFLOW
🕒 02:00 - One Command Setup And Project Structure
🕒 04:07 - Headset Free Development With Built In Emulation
🕒 06:52 - Scene Composition And Asset Optimization

📊 CORE ARCHITECTURE AND INTERACTION SYSTEMS
🕒 08:15 - Entity Component System Overview
🕒 14:15 - Input Handling And Locomotion System
🕒 18:50 - Grab Interaction System
🕒 23:24 - Physics System With Havok

💡 UI AND Q&A
🕒 30:31 - Building UI Panels With UI Kit
🕒 38:29 - Audience Q&A And Hackathon Notes

📚 RESOURCES & DOCUMENTATION
➡️ Community Resources: https://communityforums.atmeta.com/category/horizon-developer-forum/discussions/Community_Resources
➡️ Developer Docs: https://developers.meta.com/horizon/develop/
➡️ Developer Forum: https://communityforums.atmeta.com/category/horizon-developer-forum

🤝 STAY CONNECTED
➡️ News & Announcements: https://communityforums.atmeta.com/category/horizon-developer-forum/discussions/News_and_Announcements
➡️ Developer’s Blog: https://developers.meta.com/horizon/blog

🚀 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS.
Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Fix Performance Bottlenecks with the Meta Quest Runtime Optimizer | Performance Series
In this technical deep dive, Meta Software Engineers Jay Hsia and Nico Lopez walk you through the complete workflow for identifying and resolving latency issues in real time using the Meta Quest Runtime Optimizer. If you’re looking to optimize your build, this session is an essential resource for fixing performance bottlenecks prior to launch.

💡 After viewing this session, you’ll understand how to:

* Enable and configure the Runtime Optimizer overlay in Unity
* Interpret real-time metrics to distinguish between CPU and GPU bottlenecks
* Apply actionable insights to reduce draw calls and texture overhead

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Introduction to the Performance Series & Runtime Optimizer
🕒 01:52 - The Optimization Profiling Funnel

🛠️ RUNTIME OPTIMIZER SETUP
🕒 05:31 - Core Concepts of the Runtime Optimizer

📉 ANALYZING METRICS
🕒 09:32 - Bottleneck Analysis: Finding What's Expensive
🕒 15:27 - What-If Analysis: Quantifying Performance Costs

🛠️ LIVE DEMO & WORKFLOW
🕒 20:51 - Live Demo: Putting the Tool into Practice
🕒 26:12 - Final Recap and Recommended Workflow

📖 OPTIMIZATION EXAMPLES REFERENCED IN THIS VIDEO
🔖 Meta Quest Runtime Optimizer Docs: https://developers.meta.com/horizon/documentation/unity/unity-quest-runtime-optimizer/
🔖 Unity Profiler Documentation: https://docs.unity3d.com/Manual/Profiler.html

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS.
Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
Build Intuitive Experiences with Hands & Microgestures
You can build powerful immersive experiences with Hand Tracking using the Meta Interaction SDK. Watch as Meta Engineering Manager Jesse Keogh shows you the essential steps to go from a blank Unity project to a fully interactive prototype with UI and 3D object interactions. You’ll walk away with production-ready techniques for Quest-specific optimization and system-backed gestures like Pinch and Microgestures that deliver reliable input across all users.

💡 After viewing this session, you’ll understand how to:

* Configure the Interaction SDK using Quick Actions
* Implement system-backed gestures like Pinch
* Streamline project creation with the Meta Quest Developer Hub
* Test interactions using the XR Simulator

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Introduction to the Hands Workshop
🕒 02:41 - Overview of Hand Tracking System Features

⚙️ CORE CONCEPTS
🕒 05:35 - System-Backed Gestures & Interaction SDK
🕒 08:25 - Tour of Interaction SDK Capabilities
🕒 11:48 - Design Principles & Best Practices

🛠️ WORKFLOW & DEMO
🕒 14:23 - Project Setup with Meta Quest Developer Hub
🕒 20:39 - Live Demo: Building UI Interactions
🕒 34:02 - Live Demo: Adding Grabbable 3D Objects

✅ FINAL THOUGHTS
🕒 37:15 - Q&A: Advanced Use Cases & Debugging

📖 HAND TRACKING EXAMPLES REFERENCED IN THIS VIDEO
🔖 Interaction SDK Overview: https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/
🔖 Hand Tracking Design Guidelines: https://developers.meta.com/horizon/design/hands/

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development.
Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
How to Understand your Player and their Space: Context-Aware MR
Hey folks! Last week I ran a small workshop about context-aware MR: how to understand your player and their space. I wanted to drop a summary here, and I’m looking forward to your tips, ideas, or comments. 😊

But here is the big question: what are the tools we can use to really “understand” a player? Here are the four musketeers of context-aware MR: Player Input, MRUK, Depth API, PCA.

1) Player Input

You might think player input is a bit trivial, but there’s actually a lot we can extract from it to understand our player’s behaviour. Of course, we get the basics: the position and rotation of the head and controllers, plus input from buttons or triggers. But that’s just the beginning.

With features like hand tracking and voice recognition, we unlock much more. Hand tracking lets us detect custom hand poses or gestures. Voice recognition allows for voice commands and even detecting voice loudness, which can be used to animate a character’s mouth or trigger actions like blowing out a candle. By combining head and controller tracking, we can figure out not only where the player is looking but also estimate their overall body pose. These are just a few examples, but the creative possibilities are huge.

While these features also apply to standard VR, let’s now move to tools that are specific to Mixed Reality, starting with our next musketeer: MRUK!

2) MRUK

To understand MRUK, I first need to explain what Scene Data is. Scene Data is an approximation of the player's environment, set up outside of your app through the Meta system. It gives you access to either a full mesh or a simplified version of the room using labeled boxes that let you identify whether an element is a wall, door, floor, or piece of furniture.

The Mixed Reality Utility Kit (MRUK) is a powerful set of tools built on top of Scene Data. It helps you place, align, and make virtual content interact with the real world.
Here are some examples of what MRUK enables:

* Smart spawn points on specific surfaces (like the floor or walls) while avoiding obstacles (like furniture)
* Collision for your virtual content
* Navmesh to move objects around the player’s space without bumping into real-world elements
* Destructible scene mesh effects
* Dynamic lighting effects on the real world
* QR code and keyboard tracking
* And more...

While MRUK is incredibly useful, keep in mind that Scene Data doesn’t update in real time. That’s where the Depth API comes in.

3) Depth API

The Depth API gives you real-time depth maps of what the user is currently seeing. This allows you to occlude virtual objects behind real-world elements, making them feel like a natural part of the environment and greatly increasing immersion. It also comes with a Depth Raycast Manager, which lets you detect collisions at runtime with real objects, perfect for dynamic content placement or interactions. It’s a great complement to the Scene Model, filling in the gaps that static scene data can’t cover. Despite its potential, it's still underused in many XR projects.

4) Passthrough Camera Access (PCA)

We’ve had the first three tools for a while now. But recently, a game-changing feature was introduced: access to the passthrough camera! With access to the camera, you can:

* Read the live image as a texture to do color picking, light estimation, or apply visual effects like blur
* Feed the image to AI models for computer vision tasks like object detection

It opens a direct bridge between the real world and AI, and that's huge for MR development. Good news: starting with version v83, new building blocks are available to help you set up PCA easily in your project.

To Conclude

Player Input, MRUK, Depth API, and Passthrough Camera Access form a powerful toolbox for building context-aware MR experiences. And now, with tools like PCA, creativity is more accessible than ever.
We can finally build apps and games that truly adapt to each user and their real space. I hope you enjoyed this little summary and that you learned something new along the way. Go check out the different links provided in the post if you want to learn more about our four musketeers, and if you have a tip on how you use these features in your app, share it down below! 😊

Useful links:
https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
https://developers.meta.com/horizon/documentation/unity/unity-mr-utility-kit-overview
https://developers.meta.com/horizon/documentation/unity/unity-depthapi-overview
https://developers.meta.com/horizon/documentation/unity/unity-pca-overview

Have a nice day! 👋

The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don't just showcase design principles around specific features but also provide direct examples of how to use them, which is especially important right now for something like a hackathon.

For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let's start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links:
- https://github.com/oculus-samples/Unity-InteractionSDK-Samples
- https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta's XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity's official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up the XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.
Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google's Tilt Brush. Lets users paint and sculpt in 3D space, making it an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity's official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.
Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

A World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes blending real and virtual spaces.

Phanto (MR Reference App)
Links:
- https://github.com/oculus-samples/Unity-Phanto
- https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links:
- https://www.meta.com/en-gb/experiences/discover/7041851792509764/
- https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR "motifs": small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room's layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest's Passthrough camera feed for effects, object detection, and image manipulation.
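To give a feel for what reading the passthrough feed involves, here is a minimal hedged sketch using Unity's standard WebCamTexture API. It assumes the headset camera permission has already been requested and granted, and that the passthrough cameras are exposed as regular webcam devices (as in the Passthrough Camera API Samples above; device names and availability vary by OS and SDK version):

```csharp
using UnityEngine;

// Hedged sketch: once camera permission is granted, read the passthrough
// feed through Unity's standard WebCamTexture API.
public class PassthroughFeedSketch : MonoBehaviour
{
    private WebCamTexture feed;

    private void Start()
    {
        // No devices usually means the permission is missing or unsupported.
        if (WebCamTexture.devices.Length == 0) return;

        feed = new WebCamTexture(WebCamTexture.devices[0].name);
        feed.Play();
    }

    private void Update()
    {
        if (feed == null || !feed.didUpdateThisFrame) return;

        // Read pixels for color picking or light estimation, or hand the
        // texture to an on-device vision model for object detection.
        Color32[] pixels = feed.GetPixels32();
        Debug.Log($"Frame {feed.width}x{feed.height}, first pixel: {pixels[0]}");
    }
}
```

Copying pixels every frame is expensive; a real app would sample at a lower rate or process the texture on the GPU.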
Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems, ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile, an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.

Getting Help, Filing Bugs, and Tracking Your Feedback
Building with Meta's suite of SDKs and tools opens up a world of creative possibilities, but sometimes even the most experienced developers run into questions and challenges along the way. We know how much these hurdles can slow down your workflow, so we're always working on providing new ways to support your development and keep you in the loop as we act on your feedback. In this post, we'll guide you through our support channels, show you how to file bugs effectively, and introduce an exciting new addition to our existing feedback tool: everything you need to get quick answers and keep creating.

Meta Horizon Dev Blog | How to Get Help, File Bugs, and Track Fixes when Building with Meta Horizon