The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don’t just showcase design principles around specific features but also provide direct examples of how to use them, which is especially important right now for something like a hackathon. For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let’s start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links: https://github.com/oculus-samples/Unity-InteractionSDK-Samples and https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo from Meta’s XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity’s official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.
Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google’s Tilt Brush. Lets users paint and sculpt in 3D space — an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity’s official multiplayer VR template featuring a prebuilt scene, avatars, and networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases. Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.
Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

A World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes blending real and virtual spaces.

Phanto (MR Reference App)
Links: https://github.com/oculus-samples/Unity-Phanto and https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links: https://www.meta.com/en-gb/experiences/discover/7041851792509764/ and https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR “motifs”: small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room’s layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process Quest’s Passthrough camera feed for effects, object detection, and image manipulation.
Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems — ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile — an important optimization for smooth VR performance.

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.

Building a Social VR Game From Scratch Part 1: Entitlement
So, I am building Baby VR, a social VR game that I will build with the community on YouTube. While planning the curriculum, I realized that before working on core things like networking, voice chat, or game mechanics, we first need to integrate Baby VR into the Meta Horizon Store. And that starts with entitlement. So, in this blog post, I will show you how I implemented the entitlement check for the Meta Horizon Store. Let's get started.

Introduction

If you're building a VR app for Meta Quest, you absolutely need to implement entitlement checking. There's no way around it. Without it, anyone could potentially access your app without actually purchasing it from the Meta Quest Store. Think of entitlement as your app's bouncer: it checks if someone actually paid to get in before letting them through the door. According to Meta's official documentation, entitlement checks are required for apps published on their store, and they're really not optional if you want to protect your work and ensure users have legitimately obtained your application.

In this blog post, I'll walk you through a real-world implementation that handles all the edge cases: retry logic, error handling, and proper user data retrieval. Let's dive in.

How It Works: The Complete Flow

Before we get into the code, here's the big picture of how the entitlement process flows. The system consists of a few key components working together:

MetaStoreManager - The main orchestrator that kicks everything off
EntitlementHandler - Does the heavy lifting of verification
Event System - Notifies other parts of your game when entitlement completes
MetaPlayerData - Stores the user info we retrieve

Step-by-Step Implementation

1. The MetaStoreManager: Your Entry Point

The `MetaStoreManager` is a Unity `MonoBehaviour` that orchestrates everything. It's simple: it initializes the entitlement handler and listens for when the entitlement completes. When you call `Initialize()`, it kicks off the entitlement process.
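A minimal sketch of what that orchestration can look like is below. `MetaStoreManager`, `EntitlementHandler`, and `MetaPlayerData` are the post's own types, but every signature shown here is illustrative, not Meta API:

```csharp
using System.Threading.Tasks;
using UnityEngine;

// Sketch of the flow described above. EntitlementHandler's internals
// (the four verification steps and the retry loop) are collapsed here;
// the event and property names are illustrative assumptions.
public class MetaStoreManager : MonoBehaviour
{
    public static MetaPlayerData PlayerData { get; private set; }

    // Other systems subscribe to learn when verification finishes.
    public static event System.Action<bool> EntitlementCompleted;

    private EntitlementHandler handler;

    public async void Initialize()
    {
        handler = new EntitlementHandler();

        // Runs platform init, the entitlement check, user data retrieval,
        // and nonce generation, retrying on failure as described below.
        bool success = await handler.CheckEntitlement();

        if (success)
            PlayerData = handler.PlayerData;

        EntitlementCompleted?.Invoke(success);
    }
}
```

The static event keeps the rest of the game decoupled: a loading screen or main menu can subscribe without holding a reference to the manager.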
Once complete, it stores the player data for use throughout your game.

2. The EntitlementHandler: The Core Logic

This is where the real work happens. The handler performs a four-step verification process with automatic retry logic (up to 3 attempts with 2-second delays between retries). The `CheckEntitlement()` method runs four critical steps in sequence - if any step fails, the whole process fails and retries. Step 2 is the critical one: `CheckUserEntitlement()` calls `Entitlements.IsUserEntitledToApplication()`, which queries Meta's servers to verify the user actually purchased your app. This is where the piracy protection happens. The other steps retrieve user data (ID, display name, Oculus ID) and generate a cryptographic proof (nonce) that you can use for server-side verification later.

3. The Data Structure

After successful entitlement, you get a `MetaPlayerData` object containing:

```csharp
public class MetaPlayerData
{
    public string UserId;      // Unique user identifier
    public string UserName;    // Display name
    public string AliasName;   // Oculus ID
    public string OculusNonce; // Cryptographic proof for server verification
}
```

The `OculusNonce` is particularly important - it's a proof token you can send to your backend server to verify the user's identity securely.

Best Practices

When to check: Run entitlement as early as possible - ideally during your splash screen or initial loading. Don't let users access premium features until verification completes.

Error handling: The implementation includes automatic retry logic (3 attempts with 2-second delays), but you should also show user-friendly error messages and provide a manual retry option if all attempts fail.

Security: Never trust client-side verification alone. Always use the `OculusNonce` to verify user identity on your backend server for critical features. This prevents tampering and ensures real security.
Performance: The async/await pattern keeps everything non-blocking, so your game stays responsive during the verification process.

Common Issues and Solutions

Entitlement always fails? Make sure your app is properly configured in the Meta Developer Dashboard, and test on a device that has actually purchased the app. Network issues can also cause failures.

Platform not initializing? Verify the Oculus Platform SDK is properly imported and check your AndroidManifest.xml for required permissions. Also ensure you're testing on actual Quest hardware.

User data not retrieved? The user needs to be logged into their Oculus account, and privacy settings might be blocking access. Check both the device settings and ensure you're using a compatible SDK version.

Quick Integration Example

The basic pattern for using this in your game is to call `Initialize()` on the `MetaStoreManager` during startup, subscribe to the entitlement-complete event, and only load your main scene once verification succeeds.

Conclusion

Meta Store entitlement isn't optional - it's a requirement for protecting your VR application. The implementation we've covered gives you:

- ✅ Robust verification with automatic retry logic
- ✅ Complete user data retrieval for personalization
- ✅ Event-based architecture that keeps your code clean
- ✅ Production-ready error handling

Remember to test on actual Quest hardware, verify your app configuration in the Meta Developer Dashboard, and always implement server-side verification using the `OculusNonce` for critical features. This system provides a solid foundation that protects your app while keeping the user experience smooth. The retry logic handles network hiccups, and the event system keeps everything decoupled and maintainable. Let me know if you need the source code.

Additional Resources

Meta's Official Entitlement Check Documentation

*This blog post is based on a production implementation. Always refer to the latest Meta documentation for the most up-to-date information and best practices.*

VR Optimization in Unity - A Complete List of Tools and Resources
Hey guys, I wanted to put together a list of tools and resources for optimizing a VR game in Unity. As the process can be tricky (profiling, debugging, etc.) and there are a lot of different tools, I hope this helps show the different use cases and workflows. This list is a work in progress, so if you want to add your favorites or ask questions, please do it here. :)

OVERVIEW OF OPTIMIZATION TOOLS

In-Editor

Stats Window
Documentation: https://docs.unity3d.com/Manual/RenderingStatistics.html
Shows real-time rendering statistics (triangles, draw calls, batches) in the Game View to quickly spot bottlenecks.

Profiler
Documentation: https://docs.unity3d.com/Manual/Profiler.html
Records CPU, GPU, memory, and other subsystem usage to identify hotspot frames or assets.

Project Validation and Project Setup Tool
Documentation: https://developers.meta.com/horizon/documentation/unity/unity-upst-overview/
Tests a registry of rules called Configuration Tasks (XR setup and optimization) and provides default rules to make your project Meta Quest ready.

Quest Runtime Optimizer
https://developers.meta.com/horizon/documentation/unity/unity-quest-runtime-optimizer/
Provides real-time analysis and actionable insights to optimize performance on device.

Unity Project Auditor
https://docs.unity3d.com/Packages/com.unity.project-auditor@1.0/manual/index.html
Audits your Unity project for performance and best-practice compliance.

Frame Debugger
Documentation: https://docs.unity3d.com/Manual/FrameDebugger.html
Step through draw calls and frame rendering to find overdraw or inefficient passes.

Auto VR Optimizer
https://assetstore.unity.com/packages/tools/utilities/auto-vr-optimizer-318687
Automated utility to apply common VR optimization settings to your Unity project.

Memory Profiler and Profile Analyzer
Documentation: https://unity.com/how-to/use-memory-profiling-unity
Capture memory snapshots, compare them, and analyze fragmentation or leaks.
Outside the Editor

OVR Metrics Tool
https://developers.meta.com/horizon/downloads/package/ovr-metrics-tool/
Capture performance metrics on device, including CPU, GPU, and memory.

Meta Quest Developer Hub
https://developers.meta.com/horizon/downloads/package/oculus-developer-hub-win/
Deploy to device, profile, view logs, and measure thermal and throttling behavior.

RenderDoc
Documentation: https://developers.meta.com/horizon/documentation/unity/ts-renderdoc-for-oculus/
Graphics frame capture and inspection of draw calls, states, and textures.

Perfetto
Guide: https://developers.meta.com/horizon/documentation/unity/ts-perfettoguide
Open-source system tracing for CPU, GPU, and OS events.

Immersive Debugger
https://developers.meta.com/horizon/documentation/unity/immersivedebugger-overview/

Resources

Profiling and debugging tools in Unity
https://unity.com/how-to/profiling-and-debugging-tools

Ultimate Guide to Profiling Unity Games (e-book)
https://unity.com/resources/ultimate-guide-to-profiling-unity-games-unity-6

Meta XR Performance Best Practices
https://developers.meta.com/horizon/documentation/unity/unity-best-practices-intro

Optimization for web, XR, and mobile games in Unity 6 (YouTube)
https://youtu.be/2J0kDtUGlrY?si=DfiogFWQronhdowQ

Unity Optimization E-book (Unity 6 edition)
https://unity.com/resources/mobile-xr-web-game-performance-optimization-unity-6

Valem Tutorials
How to Optimize VR Game Part 1: https://www.youtube.com/watch?v=BeB9Cx_msKA
How to Optimize VR Game Part 2: https://www.youtube.com/watch?v=Jgf3F--VoPg
How to Optimize VR Game Part 3: https://youtu.be/qm4-6zHkanM

Hope this helps. :) What do you think? Do you have other tools or resources that should be on this list?

VR 103: Preparing Your App for the Meta Horizon Store
Join our team of experts in this Meta Connect 2025 series as they guide you through the fundamentals of VR development. You’ll learn how to start your project in Unity using Meta’s SDKs, explore essential building blocks for creating immersive experiences, and follow step-by-step instructions to deploy your first VR app. Whether you’re new to VR or looking to expand your skills, this series offers practical insights and hands-on demonstrations to help you succeed in building for Meta’s VR platforms.

In the final session of this series, Jake Steinerman guides you through all of the steps of getting your app ready to publish on the Meta Horizon Store, and shares tips on finding and building your community. Join him in walking your app through the steps of passing VRCs, beta testing, building out a Product Detail Page, and finally, publishing your app.

Your VR app is built—now let’s get it ready for the Meta Horizon Store. This launch session will guide you through everything from passing Virtual Reality Checks (VRCs) and effective beta testing to building a strong community and optimizing your product detail page. Learn proven strategies for a smooth submission, launch, and post-launch, featuring real-world examples and actionable tips to help your app thrive from day one.
How to Polish your XR Title: Onboarding, Optimization and Game Feel
Hi guys, I recently ran a workshop for the Meta Horizon Start Competition focused on polishing an XR title. At that stage of the hackathon, with only a week left before submission, one question mattered more than any other: how do you make the biggest impact in the least amount of time? This post is a written summary of that workshop, aiming to answer this question around three topics:

Player onboarding
Optimization
Game feel

But why these three? Because when you are close to shipping, it becomes critical to look at your application through the eyes of a player, not a developer. For example:

The build can behave very differently on device compared to the editor, which makes optimization and debugging essential.
If players do not understand what to do, onboarding needs improvement.
If it works but feels flat or unsatisfying, game feel needs polish.

Together, these areas tend to give the highest return on investment when time is limited. Let’s start with player onboarding, using the Meta All-in-One SDK.

Player Onboarding

Player onboarding is how you teach players to understand and interact with your game. A useful principle to keep in mind is: the best tutorial is no tutorial. Ideally, players learn by doing. You introduce interactions one at a time, following the player’s natural progression through the experience. That said, refining this approach requires time and playtesting, which is often limited near the end of a project. When time is short, these faster onboarding tools can help communicate essential information clearly.

Tooltips
Short, spatialized text pointing at an object or interaction. Best used to highlight a short action.

UI Panels
The most basic solution, using Unity UI to display text, images, or short videos.

Ghost Hands
Very effective for hand tracking. You can visually demonstrate a hand pose or motion.
Tip: With hand tracking, you can click on Play, make a certain pose with your own hand, then drag the hand model from the Hierarchy to the Project window to save it as a prefab that you can reuse and that will directly have the pose you want to show.

Voice Prompts
Recording short voice lines can be faster than writing UI and often feels more natural. Keep them short and contextual.

I provide a Unity package with a sample scene demonstrating each of these approaches here: https://drive.google.com/file/d/1n3IUzLMH6_60foStzgSdR5WignTkWUFh/view?usp=sharing

Optimization

Optimization is crucial. It is not only about meeting Meta VRC requirements so your app can ship, but also about player comfort and enjoyment. Below is a simple optimization workflow that works well even for beginners and usually resolves most common issues.

Step 1: Check your project settings

Many performance problems come from incorrect project setup.

Use the Project Setup Tool from the Meta SDK
Ensure Single Pass Instanced is enabled
Review URP settings and quality levels
Bake your lights whenever possible

For a full setup checklist, see this video: https://youtu.be/BeB9Cx_msKA?si=AjfnZdoPxH3jPxk-

Project Validation Tool in Action

Step 2: Check your scene complexity

Triangle budgets vary depending on your content, but a good general target is around 150,000 triangles visible at once. Draw calls are just as important, and often more critical, so keep them as low as possible (under 80).

Ways to reduce complexity:

Lower triangle count per model
Use frustum and occlusion culling
Add LODs to complex meshes
Use batching techniques where possible

For more information about the target FPS, draw calls, and triangle count, check out this Meta documentation page: https://developers.meta.com/horizon/documentation/unity/unity-perf

Step 3: Use OVR Metrics

Install OVR Metrics and always test outside the Unity editor.
OVR Metrics gives you real performance data on device, with live graphs and indicators. Here are the three most important data points to track:

FPS
GPU and CPU usage
Render Scale (must stay above 0.85 to meet VRC requirements)

You can download it here: https://www.meta.com/en-gb/experiences/ovr-metrics-tool/2372625889463779/

Play your build and look for frame drops, reduced render scale, or sustained high GPU or CPU usage.

Step 4: Find and remove your bottleneck

The previous step tells you when and where performance drops occur. Now you need to understand why. Use the right tools:

Unity Profiler
Identify CPU bottlenecks, scripts taking too long, physics spikes, or garbage collection issues.

Frame Debugger
Analyze draw calls and rendering passes to understand what is actually being rendered each frame.

Meta Runtime Optimizer
Helps identify XR-specific performance issues related to the runtime and rendering pipeline.

Once you know the bottleneck, you can make targeted changes instead of guessing.

Step 5: Last-resort performance boosts

When time is very limited, two features can significantly improve performance with minimal effort:

Dynamic Resolution
Fixed Foveated Rendering

Both offer configurable levels to balance performance and visual quality. Be careful when lowering values too aggressively; visual quality can degrade quickly. Always remember that Render Scale must stay above 85 percent to pass VRC.

Fixed Foveated Rendering Applied to Eye Texture

Game Feel

Game feel is often underestimated, but small improvements here can dramatically improve how polished your XR experience feels. Moreover, responsiveness also helps guide the player to understand your game and supports onboarding.

Haptic feedback

Haptics add physical feedback to interactions and are extremely effective in XR. Triggering haptics is often just a single line of code. If you want more advanced effects, you can build layered patterns using a haptic tool or studio.
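For reference, a basic controller pulse really is just one call. Here is a minimal sketch assuming the Meta XR Core SDK's `OVRInput` API; the `HapticPulse` wrapper class itself is my own, not part of the SDK:

```csharp
using System.Collections;
using UnityEngine;

// Minimal haptic pulse using OVRInput from the Meta XR Core SDK.
// Attach to any GameObject and call Pulse() from a grab or hit event.
public class HapticPulse : MonoBehaviour
{
    public void Pulse(float amplitude = 0.7f, float duration = 0.1f)
    {
        StartCoroutine(PulseRoutine(amplitude, duration));
    }

    private IEnumerator PulseRoutine(float amplitude, float duration)
    {
        // Frequency and amplitude are both normalized to the 0..1 range.
        OVRInput.SetControllerVibration(1f, amplitude, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(duration);
        // Setting both values back to 0 stops the vibration.
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

Varying the amplitude with the intensity of the action (a light tap versus a hard hit) is a cheap way to make interactions feel more physical.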
Tutorial: https://youtu.be/RUUwWMkXFt0?si=1L92-NwIL9xy4CfJ

Grab poses

Hand grab poses let you enforce a custom hand pose when grabbing an object. The pose can vary depending on how or where the object is grabbed. For example, a cup grabbed by the body uses a different pose than one grabbed by the handle. This small detail greatly improves realism and comfort.

Documentation: https://developers.meta.com/horizon/documentation/unity/unity-isdk-creating-handgrab-poses/

Sound design

Sound is often forgotten, yet incredibly important.

Add sounds to every interaction
Use pitch variation for natural randomness
Adjust volume based on the intensity of the action

The Meta All-in-One SDK already includes UI-focused audio, but Meta also provides a free audio pack with more general sounds: https://developers.meta.com/horizon/downloads/package/oculus-audio-pack-1/

Outro

So here it is guys, these are the most impactful steps you can take to polish an XR title when time is limited. I hope this breakdown is helpful. Feel free to share your own tips and tricks below on how you polish your XR titles.

How to Understand your Player and their Space: Context-Aware MR
Hey folks! Last week I ran a small workshop about Context-Aware MR: how to understand your player and their space. I wanted to drop a summary here, and I’m looking forward to your tips, ideas, or comments. 😊 But here is the big question: what are the tools we can use to really “understand” a player? Here are the four musketeers of Context-Aware MR: Player Input, MRUK, the Depth API, and PCA.

1) Player Input

You might think player input is a bit trivial, but there’s actually a lot we can extract from it to understand our player’s behaviour. Of course, we get the basics: the position and rotation of the head and controllers, plus input from buttons or triggers. But that’s just the beginning. With features like hand tracking and voice recognition, we unlock much more. Hand tracking lets us detect custom hand poses or gestures. Voice recognition allows for voice commands and even detecting voice loudness, which can be used to animate a character’s mouth or trigger actions like blowing out a candle. By combining head and controller tracking, we can figure out not only where the player is looking but also estimate their overall body pose. These are just a few examples, but the creative possibilities are huge. While these features also apply to standard VR, let’s now move to tools that are specific to Mixed Reality, starting with our next musketeer: MRUK!

2) MRUK

To understand MRUK, I first need to explain what Scene Data is. Scene Data is an approximation of the player's environment, set up outside of your app through the Meta system. It gives you access to either a full mesh or a simplified version of the room using labeled boxes that let you identify whether an element is a wall, door, floor, or piece of furniture. The Mixed Reality Utility Kit (MRUK) is a powerful set of tools built on top of Scene Data. It helps you place, align, and make virtual content interact with the real world.
Here are some examples of what MRUK enables:

Smart spawn points on specific surfaces (like the floor or walls) while avoiding obstacles (like furniture)
Collision for your virtual content
Navmesh to move objects around the player’s space without bumping into real-world elements
Destructible scene mesh effects
Dynamic lighting effects on the real world
QR code and keyboard tracking
And more...

While MRUK is incredibly useful, keep in mind that Scene Data doesn’t update in real time. That’s where the Depth API comes in.

3) Depth API

The Depth API gives you real-time depth maps of what the user is currently seeing. This allows you to occlude virtual objects behind real-world elements, making them feel like a natural part of the environment and greatly increasing immersion. It also comes with a Depth Raycast Manager, which lets you detect collisions with real objects at runtime, perfect for dynamic content placement or interactions. It’s a great complement to the Scene Model, filling in the gaps that static scene data can’t cover. Despite its potential, it's still underused in many XR projects.

4) Passthrough Camera Access (PCA)

We’ve had the first three tools for a while now. But recently, a game-changing feature was introduced: access to the passthrough camera! With access to the camera, you can:

Read the live image as a texture to do color picking, light estimation, or apply visual effects like blur
Feed the image to AI models for computer vision tasks like object detection

It opens a direct bridge between the real world and AI, and that's huge for MR development. Good news: starting with version v83, new building blocks are available to help you set up PCA easily in your project.

To Conclude

Player Input, MRUK, the Depth API, and Passthrough Camera Access form a powerful toolbox for building context-aware MR experiences. And now, with tools like PCA, creativity is more accessible than ever.
We can finally build apps and games that truly adapt to each user and their real space. I hope you enjoyed this little summary and that you learned something new along the way. Go check out the different links provided in the post if you want to learn more about our four musketeers, and if you have a tip on how you use these features in your app, share it down below! 😊

Useful links:
https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview
https://developers.meta.com/horizon/documentation/unity/unity-mr-utility-kit-overview
https://developers.meta.com/horizon/documentation/unity/unity-depthapi-overview
https://developers.meta.com/horizon/documentation/unity/unity-pca-overview

Have a nice day! 👋

VR 102: Beyond the Basics
Join our team of experts in this Meta Connect 2025 series as they guide you through the fundamentals of VR development. You’ll learn how to start your project in Unity using Meta’s SDKs, explore essential building blocks for creating immersive experiences, and follow step-by-step instructions to deploy your first VR app. Whether you’re new to VR or looking to expand your skills, this series offers practical insights and hands-on demonstrations to help you succeed in building for Meta’s VR platforms.

This second session follows Robi Coviello as he walks you through bringing your project to life with Meta’s simulation and optimization tools. Join him in taking Blossom Buddy to the next level by adding audio cues, using the Meta XR Simulator, and working with Meta’s new AI Building Blocks to make the game even more immersive.

Your first VR app is playable, so now it’s time to iterate and optimize. This hands-on session shows you how to elevate your app from prototype to polished product. You’ll master optimization tools like Meta Quest Developer Hub, XR Simulator, Building Blocks, and Immersive Debugger to enhance performance, streamline your debugging process, and add the professional touches that can make your app stand out.
VR 101: Getting Started with VR
Join our team of experts in this Meta Connect 2025 series as they guide you through the fundamentals of VR development. You’ll learn how to start your project in Unity using Meta’s SDKs, explore essential building blocks for creating immersive experiences, and follow step-by-step instructions to deploy your first VR app. Whether you’re new to VR or looking to expand your skills, this series offers practical insights and hands-on demonstrations to help you succeed in building for Meta’s VR platforms.

This first session follows Dilmer Valecillos as he shares expert tips and tricks to simplify your project organization and set up your first Unity project using Meta’s SDKs. Join him in building Blossom Buddy—a relaxing MR experience where players can place flowers in their physical space and meet a cute robot buddy.

Ready to turn your VR interest into your first creation? This session is your launchpad into VR development, covering what you need to transform your ideas into immersive experiences. We'll walk you through the must-have tools, streamline your setup process and share insider tips that can save you hours of trial and error. Whether you're dreaming of an engaging game or a groundbreaking app, this beginner-friendly session gives you the foundation to start building with confidence.