Building a Social VR Game From Scratch Part 1: Entitlement
So, I am building Baby VR, a social VR game that I will build with the community on YouTube. While planning the curriculum, I realized that before working on core features like networking, voice chat, or game mechanics, we first need to integrate Baby VR into the Meta Horizon Store. And that starts with entitlement. So, in this blog post, I will show you how I implemented entitlement for the Meta Horizon Store. Let's get started.

Introduction

If you're building a VR app for Meta Quest, you absolutely need to implement entitlement checking. There's no way around it. Without it, anyone could potentially access your app without actually purchasing it from the Meta Quest Store. Think of entitlement as your app's bouncer - it checks if someone actually paid to get in before letting them through the door. According to Meta's official documentation, entitlement checks are required for apps published on their store, and they're really not optional if you want to protect your work and ensure users have legitimately obtained your application.

In this blog post, I'll walk you through a real-world implementation that handles all the edge cases - retry logic, error handling, and proper user data retrieval. Let's dive in.

How It Works: The Complete Flow

Before we get into the code, here's the big picture of how the entitlement process flows. The system consists of a few key components working together:

MetaStoreManager - The main orchestrator that kicks everything off
EntitlementHandler - Does the heavy lifting of verification
Event System - Notifies other parts of your game when entitlement completes
MetaPlayerData - Stores the user info we retrieve

Step-by-Step Implementation

1. The MetaStoreManager: Your Entry Point

The `MetaStoreManager` is a Unity `MonoBehaviour` that orchestrates everything. It's simple - it initializes the entitlement handler and listens for when the entitlement completes. When you call `Initialize()`, it kicks off the entitlement process.
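The original listing for this manager did not survive; here is a minimal sketch of what such a class might look like. The `EntitlementHandler` type and the `OnEntitlementComplete` event name are assumptions for illustration, not the author's exact code:

```csharp
using UnityEngine;

// Hypothetical sketch of the MetaStoreManager described above.
public class MetaStoreManager : MonoBehaviour
{
    public static MetaPlayerData PlayerData { get; private set; }

    private EntitlementHandler _entitlementHandler;

    public void Initialize()
    {
        _entitlementHandler = new EntitlementHandler();
        // Listen for completion, then kick off verification.
        _entitlementHandler.OnEntitlementComplete += HandleEntitlementComplete;
        _entitlementHandler.CheckEntitlement();
    }

    private void HandleEntitlementComplete(bool success, MetaPlayerData data)
    {
        if (!success)
        {
            Debug.LogError("Entitlement check failed - exiting.");
            Application.Quit();
            return;
        }
        PlayerData = data; // Store for use throughout the game.
    }
}
```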
Once complete, it stores the player data for use throughout your game.

2. The EntitlementHandler: The Core Logic

This is where the real work happens. The handler performs a four-step verification process with automatic retry logic (up to 3 attempts with 2-second delays between retries). The `CheckEntitlement()` method runs four critical steps in sequence - if any step fails, the whole process fails and retries.

Step 2 is the critical one: `CheckUserEntitlement()` calls `Entitlements.IsUserEntitledToApplication()`, which queries Meta's servers to verify the user actually purchased your app. This is where the piracy protection happens. The other steps retrieve user data (ID, display name, Oculus ID) and generate a cryptographic proof (nonce) that you can use for server-side verification later.

3. The Data Structure

After successful entitlement, you get a `MetaPlayerData` object containing:

```csharp
public class MetaPlayerData
{
    public string UserId;      // Unique user identifier
    public string UserName;    // Display name
    public string AliasName;   // Oculus ID
    public string OculusNonce; // Cryptographic proof for server verification
}
```

The `OculusNonce` is particularly important - it's a proof token you can send to your backend server to verify the user's identity securely.

Best Practices

When to check: Run entitlement as early as possible - ideally during your splash screen or initial loading. Don't let users access premium features until verification completes.

Error handling: The implementation includes automatic retry logic (3 attempts with 2-second delays), but you should also show user-friendly error messages and provide a manual retry option if all attempts fail.

Security: Never trust client-side verification alone. Always use the `OculusNonce` to verify user identity on your backend server for critical features. This prevents tampering and ensures real security.
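The retry behaviour described above (up to 3 attempts with 2-second delays, driven by async/await) could be sketched roughly like this. `RunVerificationSteps()` is a hypothetical stand-in for the four steps the post describes, not a real SDK call:

```csharp
using System.Threading.Tasks;
using UnityEngine;

public partial class EntitlementHandler
{
    private const int MaxAttempts = 3;
    private const int RetryDelayMs = 2000;

    // Runs the verification sequence, retrying the whole thing on failure.
    public async Task<bool> CheckEntitlementWithRetry()
    {
        for (int attempt = 1; attempt <= MaxAttempts; attempt++)
        {
            // Stand-in for the four steps (init, entitlement check,
            // user data retrieval, nonce generation).
            if (await RunVerificationSteps())
                return true;

            Debug.LogWarning($"Entitlement attempt {attempt} failed.");
            if (attempt < MaxAttempts)
                await Task.Delay(RetryDelayMs); // non-blocking 2-second wait
        }
        return false;
    }
}
```

Using `Task.Delay` rather than a blocking sleep is what keeps the game responsive during verification.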
Performance: The async/await pattern keeps everything non-blocking, so your game stays responsive during the verification process.

Common Issues and Solutions

Entitlement always fails? Make sure your app is properly configured in the Meta Developer Dashboard, and test on a device that has actually purchased the app. Network issues can also cause failures.

Platform not initializing? Verify the Oculus Platform SDK is properly imported and check your AndroidManifest.xml for required permissions. Also ensure you're testing on actual Quest hardware.

User data not retrieved? The user needs to be logged into their Oculus account, and privacy settings might be blocking access. Check both the device settings and ensure you're using a compatible SDK version.

Quick Integration Example

The basic pattern for using this in your game: add the `MetaStoreManager` to your bootstrap scene, call `Initialize()` as early as possible, and gate gameplay behind the entitlement-complete event.

Conclusion

Meta Store entitlement isn't optional - it's a requirement for protecting your VR application. The implementation we've covered gives you:

- ✅ Robust verification with automatic retry logic
- ✅ Complete user data retrieval for personalization
- ✅ Event-based architecture that keeps your code clean
- ✅ Production-ready error handling

Remember to test on actual Quest hardware, verify your app configuration in the Meta Developer Dashboard, and always implement server-side verification using the `OculusNonce` for critical features. This system provides a solid foundation that protects your app while keeping the user experience smooth. The retry logic handles network hiccups, and the event system keeps everything decoupled and maintainable. Let me know if you need the source code.

Additional Resources

Meta's Official Entitlement Check Documentation

*This blog post is based on a production implementation. Always refer to the latest Meta documentation for the most up-to-date information and best practices.*

Increase Path Security with Attestation API | Mentor Workshop
App security is an integral part of your live-ops reality if you ship on Quest. In this workshop, Start Mentor Shane Nilsson breaks down a practical approach to protecting your build from tampering and unauthorized distribution, with a focus on entitlement checks and the Meta Quest Attestation API. By the end, you will understand what Attestation is designed to verify and how to handle failed checks tactfully.

💡 After watching this session, you'll be able to:

Define a basic app security posture that combines entitlements with integrity checks
Implement a server-mediated Attestation flow that resists spoofed responses and replay attempts
Choose a response strategy for failed verification that fits your game and your player experience goals

Recorded in August 2025 as part of the Meta Horizon Start program.

🎬 CHAPTERS
👋 INTRODUCTION
🕒 00:00 - Welcome & why app security matters
🧠 THE PIRACY LANDSCAPE
🕒 04:07 - Understanding common vulnerabilities
🕒 08:32 - Practical, non-technical ways to reduce abuse
🕒 17:21 - How to detect suspicious patterns and decide on responses
🛡️ SECURITY TOOLS & IMPLEMENTATION
🕒 19:24 - Entitlements & the Attestation API overview
🕒 27:29 - Demo: verification outcomes and handling options
🕒 31:31 - Technical Deep Dive: API Implementation
✅ FINAL THOUGHTS
🕒 31:31 - Implementation walkthrough and required components
🕒 37:37 - Q&A & Best Practices

📚 RESOURCES
🔖 Meta Quest Attestation API: https://developers.meta.com/horizon/documentation/native/ps-attestation-api
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app
development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
How to Polish Your XR Title: Onboarding, Optimization and Game Feel
Hi guys, I recently ran a workshop for the Meta Horizon Start Competition focused on polishing an XR title. At that stage of the hackathon, with only a week left before submission, one question mattered more than any other: how do you make the biggest impact in the least amount of time?

This post is a written summary of that workshop, aiming to answer this question around three topics:

Player onboarding
Optimization
Game feel

But why these three? Because when you are close to shipping, it becomes critical to look at your application through the eyes of a player, not a developer. For example:

The build can behave very differently on device compared to the editor, which makes optimization and debugging essential.
If players do not understand what to do, onboarding needs improvement.
If it works but feels flat or unsatisfying, game feel needs polish.

Together, these areas tend to give the highest return on investment when time is limited. Let's start with player onboarding, using the Meta All in One SDK.

Player Onboarding

Player onboarding is how you teach players to understand and interact with your game. A useful principle to keep in mind is: the best tutorial is no tutorial. Ideally, players learn by doing. You introduce interactions one at a time, following the player's natural progression through the experience. That said, refining this approach requires time and playtesting, which is often limited near the end of a project. When time is short, these faster onboarding tools can help communicate essential information clearly.

Tooltips
Short, spatialized text pointing at an object or interaction. Best used to highlight a short action.

UI Panels
The most basic solution, using Unity UI to display text, images, or short videos.

Ghost Hands
Very effective for hand tracking. You can visually demonstrate a hand pose or motion.
Tip: With hand tracking, you can click Play, make a certain pose with your own hand, then drag the hand model from the Hierarchy to the Project window to save it as a prefab that you can reuse and that will directly hold the pose you want to show.

Voice Prompts
Recording short voice lines can be faster than writing UI and often feels more natural. Keep them short and contextual.

I provide a Unity package with a sample scene demonstrating each of these approaches here: https://drive.google.com/file/d/1n3IUzLMH6_60foStzgSdR5WignTkWUFh/view?usp=sharing

Here are three more guidelines about onboarding:

Optimization

Optimization is crucial. It is not only about meeting Meta VRC requirements so your app can ship, but also about player comfort and enjoyment. Below is a simple optimization workflow that works well even for beginners and usually resolves most common issues.

Step 1: Check your project settings
Many performance problems come from incorrect project setup.

Use the Project Setup Tool from the Meta SDK
Ensure Single Pass Instanced is enabled
Review URP settings and quality levels
Bake your lights whenever possible

For a full setup checklist, see this video: https://youtu.be/BeB9Cx_msKA?si=AjfnZdoPxH3jPxk-

Project Validation Tool in Action

Step 2: Check your scene complexity
Triangle budgets vary depending on your content, but a good general target is around 150,000 triangles visible at once. Draw calls are just as important, and often more critical, so keep them as low as possible (under 80).

Ways to reduce complexity:

Lower triangle count per model
Use frustum and occlusion culling
Add LODs to complex meshes
Use batching techniques where possible

For more information about target FPS, draw calls, and triangle count, check out this Meta documentation page: https://developers.meta.com/horizon/documentation/unity/unity-perf

Step 3: Use OVR Metrics
Install OVR Metrics and always test outside the Unity editor.
OVR Metrics gives you real performance data on device, with live graphs and indicators. Here are the three most important metrics to track:

FPS
GPU and CPU usage
Render Scale (must stay above 0.85 to meet VRC requirements)

You can download it here: https://www.meta.com/en-gb/experiences/ovr-metrics-tool/2372625889463779/

Play your build and look for frame drops, reduced render scale, or sustained high GPU or CPU usage.

Step 4: Find and remove your bottleneck
The previous step tells you when and where performance drops occur. Now you need to understand why. Use the right tools:

Unity Profiler - Identify CPU bottlenecks, scripts taking too long, physics spikes, or garbage collection issues.
Frame Debugger - Analyze draw calls and rendering passes to understand what is actually being rendered each frame.
Meta Runtime Optimizer - Helps identify XR-specific performance issues related to the runtime and rendering pipeline.

Once you know the bottleneck, you can make targeted changes instead of guessing.

Step 5: Last-resort performance boosts
When time is very limited, two features can significantly improve performance with minimal effort:

Dynamic Resolution
Fixed Foveated Rendering

Both offer configurable levels to balance performance and visual quality. Be careful when lowering values too aggressively; visual quality can degrade quickly. Always remember that Render Scale must stay above 85 percent to pass VRC.

Fixed Foveated Rendering Applied to Eye Texture

Game Feel

Game feel is often underestimated, but small improvements here can dramatically improve how polished your XR experience feels. Moreover, responsiveness also helps guide the player to understand your game and supports onboarding.

Haptic feedback
Haptics add physical feedback to interactions and are extremely effective in XR. Triggering haptics is often just a single line of code. If you want more advanced effects, you can build layered patterns using a haptic tool or studio.
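As an illustration of that "single line of code", here is a minimal sketch using the Oculus Integration's `OVRInput` haptics call (one common approach; newer SDK versions may expose haptics through a different API):

```csharp
using System.Collections;
using UnityEngine;

public class HapticPulse : MonoBehaviour
{
    // Plays a short vibration pulse on the right controller.
    public void Pulse()
    {
        StartCoroutine(PulseRoutine(0.1f));
    }

    private IEnumerator PulseRoutine(float duration)
    {
        // Frequency and amplitude are both in the 0..1 range.
        OVRInput.SetControllerVibration(0.5f, 0.8f, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(duration);
        // Vibration continues until explicitly stopped.
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

Call `Pulse()` from a grab or collision handler; varying the amplitude with impact strength makes interactions feel noticeably more physical.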
Tutorial: https://youtu.be/RUUwWMkXFt0?si=1L92-NwIL9xy4CfJ

Grab poses
Hand grab poses let you enforce a custom hand pose when grabbing an object. The pose can vary depending on how or where the object is grabbed. For example, a cup grabbed by the body uses a different pose than grabbing it by the handle. This small detail greatly improves realism and comfort.
Documentation: https://developers.meta.com/horizon/documentation/unity/unity-isdk-creating-handgrab-poses/

Sound design
Sound is often forgotten, yet incredibly important.

Add sounds to every interaction
Use pitch variation for natural randomness
Adjust volume based on the intensity of the action

The Meta All in One SDK already includes UI-focused audio, but Meta also provides a free audio pack with more general sounds: https://developers.meta.com/horizon/downloads/package/oculus-audio-pack-1/

Outro

So here it is guys, these are the most impactful steps you can take to polish an XR title when time is limited. I hope this breakdown is helpful. Feel free to share your own tips and tricks below on how you polish your XR titles.

🎉Welcome new Start Mentor @Degly / Degly Pava🎉
Please join us in welcoming Degly, our newest Start Mentor and Founder of ColombiaXR. Degly is an experienced AR and VR Unity developer who has organized hackathons, workshops, and mentoring programs across Colombia and Latin America for hundreds of developers. He's currently based in Tokyo, and is excited to help you use Meta's toolkit to lower the barrier to entry for creating world-class experiences with cutting-edge technologies for use across the world.

Degly is an expert in:

XR App Visibility & Growth Strategy
Unity & Spatial App Development
XR Community Building and Social Presence
Career Paths & Navigating the XR Industry

What's Trending in VR Development? | Tech, Content & Community
What does it take to succeed as a VR developer today? Join renowned YouTube creators and expert developers Dilmer Valecillos, Quentin Valembois, and Roberto Coviello in a roundtable discussion hosted by Darby Blaney. They explore the latest tech trends, such as the Passthrough Camera Access API and essential optimization tools, and share proven strategies for accelerating your workflow, creating content, and building an engaged community.

💡 In this session, you'll learn:

Insights into the latest VR technologies, including the creative potential of the Passthrough Camera Access API and underutilized tools like the Immersive Debugger.
Best practices for optimizing performance and accelerating your development workflow using tools like Building Blocks and open-source projects.
Proven strategies for building a personal brand, creating effective content, and fostering an engaged community around your work.

Recorded in October 2025 as part of the Meta Horizon Start program.

🎙️ FEATURING:
Darby Blaney, Metaverse Program Manager, Start Lead
Roberto Coviello, Software Engineer
Quentin Valembois, Start Mentor
Dilmer Valecillos, Developer Advocate

🎬 CHAPTERS
👋 INTRODUCTION
🕒 00:00 - Welcome & Host Introduction
🚀 PART 1: THE DEVELOPER JOURNEY
🕒 02:13 - Panelist Origin Stories: How They Got Started in VR
⚙️ PART 2: TECH TRENDS & TOOLS
🕒 08:22 - Tech Trend: The Passthrough Camera API & AI
🕒 13:12 - Underutilized Tools: Immersive Debugger & Runtime Optimizer
🕒 17:12 - Best Practices for Performance Optimization
🕒 19:41 - Accelerating Development with Building Blocks & Open-Source Samples
📈 PART 3: CONTENT & COMMUNITY GROWTH
🕒 23:20 - Strategies for Content Creation & Building a Personal Brand
🕒 28:18 - The Role of Community Feedback in Development
🕒 32:47 - Inside the Meta Start Mentor Program
✅ CONCLUSION
🕒 35:06 - Final Thoughts & Thank You

📚 RESOURCES
➡️ Dilmer's YouTube channel: https://www.youtube.com/@dilmerv
➡️ Valem's YouTube channel:
https://www.youtube.com/@ValemTutorials
➡️ Roberto's YouTube channel: https://www.youtube.com/@xrdevrob
➡️ Spatial Scanner project available on GitHub: https://github.com/meta-quest/Meta-Spatial-SDK-Samples
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
How to Understand Your Player and Their Space: Context-Aware MR
Hey folks! Last week I ran a small workshop about Context-Aware MR, on how to understand your player and their space. I wanted to drop a summary here, and I'm looking forward to your tips, ideas, or comments. 😊

But here is the big question: what are the tools we can use to really "understand" a player? Here are the four musketeers of Context-Aware MR: Player Input, MRUK, Depth API, PCA.

1) Player Input

You might think player input is a bit trivial, but there's actually a lot we can extract from it to understand our player's behaviour. Of course, we get the basics: the position and rotation of the head and controllers, plus input from buttons or triggers. But that's just the beginning. With features like hand tracking and voice recognition, we unlock much more. Hand tracking lets us detect custom hand poses or gestures. Voice recognition allows for voice commands and even detecting voice loudness, which can be used to animate a character's mouth or trigger actions like blowing out a candle. By combining head and controller tracking, we can figure out not only where the player is looking but also estimate their overall body pose. These are just a few examples, but the creative possibilities are huge. While these features also apply to standard VR, let's now move to tools that are specific to Mixed Reality, starting with our next musketeer: MRUK!

2) MRUK

To understand MRUK, I need to first explain what Scene Data is. Scene Data is an approximation of the player's environment, set up outside of your app through the Meta system. It gives you access to either a full mesh or a simplified version of the room using labeled boxes that let you identify whether an element is a wall, door, floor, or piece of furniture. The Mixed Reality Utility Kit (MRUK) is a powerful set of tools built on top of Scene Data. It helps you place, align, and make virtual content interact with the real world.
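As a tiny illustration of the kind of query MRUK enables, here is a sketch that spawns a prefab on the room's floor. It assumes the MRUK Unity package's `MRUK.Instance` singleton and `MRUKRoom` API; exact names can differ between package versions:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class FloorSpawner : MonoBehaviour
{
    public GameObject prefab;

    private void Start()
    {
        // Wait until MRUK has finished loading the Scene Data.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        if (room == null) return;

        // Place the prefab at the center of the floor anchor, if one exists.
        if (room.FloorAnchor != null)
        {
            Instantiate(prefab, room.FloorAnchor.transform.position, Quaternion.identity);
        }
    }
}
```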
Here are some examples of what MRUK enables:

Smart spawn points on specific surfaces (like the floor or walls) while avoiding obstacles (like furniture)
Collision for your virtual content
Navmesh to move objects around the player's space without bumping into real-world elements
Destructible scene mesh effects
Dynamic lighting effects on the real world
QR code and keyboard tracking
And more...

While MRUK is incredibly useful, keep in mind that Scene Data doesn't update in real time. That's where the Depth API comes in.

3) Depth API

The Depth API gives you real-time depth maps of what the user is currently seeing. This allows you to occlude virtual objects behind real-world elements, making them feel like a natural part of the environment and greatly increasing immersion. It also comes with a Depth Raycast Manager, which lets you detect collisions at runtime with real objects, perfect for dynamic content placement or interactions. It's a great complement to the Scene Model, filling in the gaps that static scene data can't cover. Despite its potential, it's still underused in many XR projects.

4) Passthrough Camera Access (PCA)

We've had the first three tools for a while now. But recently, a game-changing feature was introduced: access to the passthrough camera! With access to the camera, you can:

Read the live image as a texture to do color picking, light estimation, or apply visual effects like blur
Feed the image to AI models for computer vision tasks like object detection

It opens a direct bridge between the real world and AI, and that's huge for MR development. Good news: starting with version v83, new building blocks are available to help you set up PCA easily in your project.

To Conclude

Player Input, MRUK, Depth API, and Passthrough Camera Access form a powerful toolbox for building context-aware MR experiences. And now, with tools like PCA, creativity is more accessible than ever.
We can finally build apps and games that truly adapt to each user and their real space. Hope you enjoyed this little summary and that you learned something new along the way. Go check out the different links provided in the post if you want to learn more about our four musketeers, and if you have a tip on how you use these features in your app, share it down below! 😊

Useful links:
https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
https://developers.meta.com/horizon/documentation/unity/unity-mr-utility-kit-overview
https://developers.meta.com/horizon/documentation/unity/unity-depthapi-overview
https://developers.meta.com/horizon/documentation/unity/unity-pca-overview

Have a nice day! 👋

Fast XR Dev Lessons Part 2: Add Multiplayer & High-Value Interactions!
Welcome to Part 2 of our XR development series! In this part, we're diving into one of the most powerful ways to boost replayability, retention, and organic growth in your VR/MR projects: Multiplayer + High-Value Interactions. If you're building anything social, competitive, collaborative, or chaotic… this one is for you.

Why Social VR Is a Game Changer

36 of the top 50 games on the Meta Store are multiplayer. This is HUGE. Social VR is exploding - and for a good reason. Shared presence creates moments that stick:

Instant emotional connection
Longer play sessions
More organic sharing
Natural viral growth

When players laugh, emote, race, or mess around together, your game grows naturally. Multiplayer isn't just a feature. It's a retention engine.

Start With a Core Social Loop

Don't "add multiplayer." Add a social purpose. Ask yourself: what's the 10-second moment so fun that someone would show it to a friend? That moment becomes the anchor for your multiplayer design. Examples:

A chaotic mechanic players can trigger on each other
A funny avatar reaction or voice filter
A toy that's satisfying to throw
A mini-event everyone experiences together

Simple, reactive loops beat giant maps every time.

High-Value Multiplayer Interactions

Players don't remember complex systems. They remember moments. High-value interactions are simple but highly reactive, such as:

Throwables that bounce, squeak, or explode harmlessly
Avatars that laugh, wobble, or flinch when hit
Shared countdowns, race starts, or victory poses
Objects that can be grabbed, tossed, or used together
Funny sound effects or physics surprises

Tiny touches → big delight.

🎥 Design for Viral Moments

Think like TikTok. What would someone WANT to record?

Comedic physics
Funny voice modifiers
Dramatic win/lose screens
End-of-round celebrations
Goofy gadgets or interactions

Build "clip-able" moments, and your players become your marketers.
Networking Is Hard (So Scope Carefully)

Multiplayer multiplies everything: more systems, more complexity, more edge cases, more testing time. And VR multiplayer is even harder to test than flatscreen. So: plan smaller, give yourself time, and playtest constantly. Keep your core loop tiny until it works perfectly with two players.

Networking Frameworks (Choose What Fits You)

Different frameworks suit different projects and teams:

Photon (Fusion / PUN) - Flexible topologies, large sample library, highly scalable, great for action-heavy multiplayer
Unity Netcode for GameObjects - First-party Unity, VR Multiplayer Template available, ideal for Unity-native developers
Normcore - VR-first design, built-in spatial VoIP, fastest to set up, great for social sandbox worlds

Your choice should match your timeline, complexity, and especially your skillset.

Multiplayer Testing Without Losing Your Mind

My favorite free tool: ➡️ ParrelSync. Run multiple Unity editor instances - no builds required. Massive time-saver. Huge sanity-saver. Perfect for early multiplayer iteration.

Multiplayer Building Blocks (A Huge Head Start)

Meta's Multiplayer Building Blocks include:

Matchmaking
Friends-only rooms
Player name tags
Networked avatars
Networked grabbable objects
Voice chat
Shared spatial anchors

If you're new: start here. If you're experienced, this still accelerates development.

Final Thoughts

Immersive social experiences will only continue to grow. So learning how to build them is one of the most valuable skills in XR. You don't need massive environments or complex systems - just meaningful shared interactions. Start small. Prototype fast. Build for memorable moments. Let the players create the magic.

Part 3 is coming soon - stay tuned!

– Tevfik

Fast XR Dev Lessons Part 1: Setup Your Project & Plan an Achievable Scope!
Welcome to Part 1 of our XR development journey! In this post, I'll be sharing key takeaways from my recent session, "Fast Essentials: Setup Your Project & Plan an Achievable Scope" - a practical guide for turning an XR idea into something you can actually finish and publish.

Mixed Reality (MR) and Virtual Reality (VR) development can be overwhelming… new tools, spatial computing concepts, scene understanding, multiplayer, locomotion, input systems - and before we know it, scope explodes. This guide helps you avoid that trap and build smarter. If you've ever started a VR project that never shipped, this is for you.

Why This Matters: The Core Goal

The mission is simple: go from idea → plan → playable MVP → something you can publish. Instead of feature soup, we focus on a small, testable, end-to-end experience that can grow after launch. This approach is used across top Meta Horizon titles and is especially critical for small indie teams and solo developers.

💡 Step 1 – IDEATION: Start With a Spark

Great XR projects begin with concepts that are:

✨ Fun
✨ Clever
✨ Emotionally strong
✨ Easy to explain in 30s

Before building:

✔ Check similar apps on the Meta Horizon Store
✔ Understand what works
✔ Spot what you can do better or differently

If you see a gap or an improvement opportunity, that is your first green light. Tip: I generate quick concept visuals using ChatGPT + Gemini/Banana to imagine tone and characters.

Step 2 – Ask: Can this idea grow forever?

A successful XR concept must be Forever Updatable: can you add later without redesigning the whole game? Examples of expandable updates:

new maps/rooms/environments
new challenges or props
cosmetics, skins, power-ups
community-driven content

Players don't play your game. They play your updates. If you cannot imagine new content in 3 months, rethink the idea.
Step 3 – Scope the MVP: Build What You Can Actually Ship

Your MVP should be:

✔ Playable end-to-end
✔ Testable weekly
✔ Explainable in 30 seconds

The MVP Formula:

One Map
One Tool/Ability
One Obstacle/Challenge
Complete loop: Load → Play → End → Restart

If you can't prototype it in a weekend, the scope is still too big. Cut features. Then cut again. ✂️

Step 4 – Create a Lightweight GDD (Game Design Document)

Keep it simple and editable. A living document. Recommended sections:

| GDD Element | Purpose |
| --- | --- |
| High Concept | One-sentence pitch |
| Core Loop | What players do repeatedly |
| Player Character | Who are they in this world? |
| Chaos Layer | The unpredictable fun/VR magic |
| Progression | What keeps players coming back |
| Social Hub | (Optional) if XR social features |
| Art & Style | Visual identity |
| Tech Specs | Quest, PCVR, MR features |
| Monetization | Cosmetic? DLC? Battle pass? |

Change it fast. Don't over-engineer it.

Final Thoughts

Whether you're building VR, MR, or full spatial apps: start small, test fast, publish sooner, grow forever.

💬 Join the Discussion!

Reply with your thoughts:

What XR idea would YOU scope into a small MVP?
What part of scoping do you struggle with most?
Want a downloadable Notion GDD template?

I'd love to see your concepts.

— Tevfik
Meta Horizon Start Mentor

The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines; they don't just showcase design principles around specific features but also provide direct examples of how to use them, which is especially important right now for something like a hackathon. For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let's start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links:
https://github.com/oculus-samples/Unity-InteractionSDK-Samples
https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta's XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity's official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.
Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google’s Tilt Brush. Lets users paint and sculpt in 3D space, making it an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity’s official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.
Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

A World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes blending real and virtual spaces.

Phanto (MR Reference App)
Links:
https://github.com/oculus-samples/Unity-Phanto
https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links:
https://www.meta.com/en-gb/experiences/discover/7041851792509764/
https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR “motifs”: small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room’s layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest’s Passthrough camera feed for effects, object detection, and image manipulation.
Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems, ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile, an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.
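The technique the Shader Prewarmer sample addresses can be sketched with Unity's built-in `ShaderVariantCollection` API. This `ShaderPrewarm` component is a minimal illustration of the idea, not code taken from the sample itself:

```csharp
using UnityEngine;

// Illustrative sketch: warm up a ShaderVariantCollection at startup so
// shader variants compile before gameplay, instead of causing a hitch
// the first time a material using them is drawn.
public class ShaderPrewarm : MonoBehaviour
{
    // Assign a collection recorded via Project Settings > Graphics
    // ("Save to asset..." under Shader Loading) in the inspector.
    [SerializeField] private ShaderVariantCollection variants;

    private void Start()
    {
        if (variants != null && !variants.isWarmedUp)
            variants.WarmUp(); // Compiles every variant in the collection now.
    }
}
```

Placing this component in a loading scene moves the compilation cost to a moment when a frame hitch is invisible, which is exactly the kind of optimization the sample explores in more depth.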