AI building blocks in Unity
Hello! In one of Meta's QuickStart tutorials, they talk about AI building blocks like object detection, TTS, STT, etc. But in Unity, I can't seem to find them under the Meta XR tools. Can someone help me with this? The video: https://www.youtube.com/watch?v=V7VGbt55VBE Thanks,

Meta XR Simulator does not appear
Pulling my hair out here: I have the V81 simulator installed on Windows 11, with Unity 6.2.10f1. I enable the simulator, but the window does not appear. Any suggestions on what to check? My goal is to use the headset in the editor and the simulator in Multiplayer Play Mode. When using XRIT and the Unity XR simulator in another project, this type of setup works quite well: I can have three virtual players using the simulator while I use the actual headset over PC Link in the editor. I want to replicate this setup using the Meta XR Simulator instead.

Cool MR Projects Part 1: Back to the Mixed Reality!
Welcome to the first part of our series, Cool MR Projects! In this post, I will be reviewing my own Mixed Reality (MR) project: "Back to the Mixed Reality," an incredible fan-made experience that demonstrates the potential of MR. This project lets users virtually drive the iconic DeLorean Time Machine and experience time travel right on their own floor. Mixed Reality blends the physical and digital worlds, creating new environments where physical and digital objects co-exist and interact in real time. If you've ever wondered why I built this project and what challenges we faced, read on!

The Motivation: Why Build a Mixed Reality Time Machine?

The inspiration for "Back to the Mixed Reality" stems directly from childhood dreams. I have always been fascinated by the concept of time travel, specifically from the Back to the Future movie series. As a child, I imagined time-traveling on my bicycle, believing that if I reached a certain speed, I would travel to the time I was dreaming of. Now, using advancements in Mixed Reality, I've been able to turn this childhood fantasy into a real experience.

Beyond personal passion, the project serves several key goals:

- Demonstrating MR potential: The project showcases the incredible potential of Mixed Reality in bringing childhood dreams to life.
- Inspiring the community: It is a passion project born from a love of the Back to the Future series and a fascination with MR, designed to inspire, entertain, and educate enthusiasts and developers about the magic of MR technology.
- Driving attention to spatial computing: My biggest goal with this fan-made project is to draw more attention to Mixed Reality, Virtual Reality, and spatial computing in general.

The experience itself involves spawning the DeLorean Time Machine on your floor, setting a destination time (e.g., 20 seconds into the future), driving the vehicle to reach the required time-travel speed, and then waiting for it to return from the future.

The Challenges: Navigating the New Frontier of MR Development

The core mechanics required detailed visual effects (VFX) built with Unity Timelines, Particles, and Shaders to replicate the time-travel sequence from the first movie. We also used FMOD to create adaptive car sound effects that change with the speed and RPM of the car's engine, integrating them into Unity with the FMOD Unity plugin.
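To give a concrete flavor, driving an adaptive engine event from the FMOD Unity plugin boils down to setting parameters on an event instance each frame. Here is a minimal sketch (using the 2.02-style EventReference API; the event reference and the "RPM"/"Speed" parameter names are illustrative, not our exact FMOD Studio setup):

```csharp
using FMOD.Studio;
using FMODUnity;
using UnityEngine;

// Minimal sketch: driving an adaptive FMOD engine event from the car's state.
// The event reference and the parameter names ("RPM", "Speed") are illustrative.
public class EngineAudio : MonoBehaviour
{
    [SerializeField] private EventReference engineEvent; // assigned in the Inspector
    private EventInstance engineInstance;

    private void Start()
    {
        engineInstance = RuntimeManager.CreateInstance(engineEvent);
        engineInstance.start();
    }

    // Call each frame with the simulated engine state; FMOD Studio maps these
    // parameter values onto the pitch shifts and layer crossfades authored there.
    public void UpdateEngine(float rpm, float speed)
    {
        engineInstance.setParameterByName("RPM", rpm);
        engineInstance.setParameterByName("Speed", speed);
    }

    private void OnDestroy()
    {
        engineInstance.stop(STOP_MODE.ALLOWFADEOUT);
        engineInstance.release();
    }
}
```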
However, integrating Mixed Reality functionality introduced several unique hurdles:

1. Model Optimization

After importing the detailed 3D models of the iconic DeLorean Time Machine and the remote controller (inspired by Doc's remote in the first movie) into Unity, we had to perform heavy optimization, because the models were insanely detailed.

2. Testing and Iteration Time

Integrating MR functionality brought new challenges and design considerations. Testing and iterating with Room Setup was very time-consuming, because every change had to be tested inside the headset. Fortunately, this was mitigated by discovering the Meta XR Simulator. This tool saved us a great deal of time, letting us test the XR project without constantly wearing the headset by simulating headset movement and touch controller input with a keyboard, mouse, or game controller.

3. Ensuring Realistic Collisions and Boundaries

A crucial challenge was keeping the virtual car within the room's boundaries and making it collide realistically with the physical floor and real-life objects. To achieve this, we relied on Scene Understanding, a technology that provides a comprehensive Scene Model: a geometric and semantic representation of the physical space. We had to (see the sketch after this list):

- Identify room elements (Floor, Ceiling, Wall Face, etc.) using the OVRSemanticClassification class.
- Add colliders to the room's walls and floors, which required careful adjustment and testing.
- Tag the corresponding colliders based on the type of room element, so the car collided properly with real-world objects.
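Here is a minimal sketch of that tagging step, assuming the classic OVRSceneManager workflow where each spawned room element carries an OVRSemanticClassification and already has its collider. The tag names are illustrative, not our actual ones:

```csharp
using UnityEngine;

// Minimal sketch: route collision responses by semantic label.
// Assumes this component sits on a scene-anchor prefab that already has a
// collider, and that the "RoomFloor"/"RoomWall" tags (illustrative names)
// are defined in the project's Tag Manager.
public class RoomElementTagger : MonoBehaviour
{
    private void Start()
    {
        var classification = GetComponent<OVRSemanticClassification>();
        if (classification == null) return;

        if (classification.Contains(OVRSceneManager.Classification.Floor))
            gameObject.tag = "RoomFloor";
        else if (classification.Contains(OVRSceneManager.Classification.WallFace))
            gameObject.tag = "RoomWall";
    }
}
```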
4. Managing the App Flow and Room Setup

For Mixed Reality experiences on Meta Quest 3, the user must complete some initial setup before the experience can load; it will not work unless the user has performed a Room Setup. We created a Lobby Scene to manage these MR procedures:

- The user must first grant permission to access spatial data, which the app needs in order to use Scene Understanding.
- If the user has not completed Room Setup, they are prompted to do so before the game scene is loaded.

We built the general app flow using knowledge gleaned from Project Phanto, a Unity-based Mixed Reality reference app from Meta that demonstrates critical features like the scene mesh and overall app flow. This included showing the user the Scene Mesh visualization once setup is complete.

Join the Adventure!

With all these mechanics, we successfully built a working Time Machine. I hope this project drives more attention to the magic of Mixed Reality. If you want to play the experience on your own Quest 3, you can get the build for FREE! I encourage you to share your gameplay on social media to help spread the word about MR. Let's drive into the future, together!

– Tevfik (Creator of Back to the Mixed Reality)

Can I use the OVR Manager alongside Unity's XR Origin Rig?

Hello! I'm a student making a game for Meta Quest 3. I'm leveraging Unity's XR Interaction Toolkit for most of the game, but I want to enable some Meta Quest native features. Notably, I wanted to use OVROverlayCanvases, but this implies using the OVR Manager as well. I have desperately tried to find any information about using Unity's built-in XR systems together with the ones provided by the Meta XR Core SDK, to no avail so far. Is dealing with both simultaneously a no-go, or is there some critical configuration that makes it work? Does Unity's XR Origin or the OVR Manager take precedence over the other (specifically for the tracking origin mode)? Thanks a lot for your help and your time!

Cannot set Meta Quest Link as active OpenXR runtime normally
Hello! We are developing a PC VR app in Unity with Meta Quest 3 at our company. We use the OpenXR plugin and build the code around the OpenXR standard in order to support more platforms in the future. However, the button to set Meta Quest Link as the active OpenXR runtime never works on our test PCs. We've tested multiple laptops and PCs in the company, and none of them works without us manually setting the path in the registry. We've tried starting the Link app as an admin, installing it as an admin, and so on. In one case, the registry entry didn't even get added automatically after installing the Meta Quest Link app, and we had to install "OpenXR for Windows Mixed Reality" to create it, short of writing a script to create the group entry ("HKLM\Software\Khronos\OpenXR\1") in the registry ourselves. Could anyone help clarify what is going on here and how we can resolve it? Otherwise, the setup process is basically impossible for our potential customers without in-depth technical support... Thank you!

References: https://community.khronos.org/t/openxr-directory-not-existing/111184
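For reference, the manual workaround we currently rely on boils down to writing a single registry value. A minimal sketch as a small elevated C# tool (the runtime JSON path assumes a default Meta Quest Link install location; adjust if it is installed elsewhere):

```csharp
using Microsoft.Win32;

// Minimal sketch of the manual registry fix described above. Must run elevated,
// since it writes under HKLM. The JSON path assumes a default Meta Quest Link
// install; verify it on the target machine before relying on this.
class SetOpenXrRuntime
{
    static void Main()
    {
        Registry.SetValue(
            @"HKEY_LOCAL_MACHINE\SOFTWARE\Khronos\OpenXR\1",
            "ActiveRuntime",
            @"C:\Program Files\Oculus\Support\oculus-runtime\oculus_openxr_64.json");
    }
}
```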
Is it possible to use hand tracking with just one hand?

Hi everyone! I've been experimenting with ways to give players a more super-real experience. After attending a gesture recognition session yesterday, I started thinking about removing the hand controllers entirely, but there's a problem: without them, I can't let players move using hand tracking alone. My game includes boss fights that require precise movement to dodge attacks. So I'm wondering: is it currently possible to have one hand use gesture tracking while the other hand uses a controller joystick for movement?

The Complete List of Sample Unity VR Projects
Hey guys, I wanted to put together a list of my favorite sample projects that you can grab and learn from. In my opinion, these projects are pure goldmines: they don't just showcase design principles around specific features but also provide direct examples of how to use them, which is especially valuable right now for something like a hackathon. For an even larger collection of Meta samples, see the GitHub list of all Meta sample repos here: https://github.com/orgs/oculus-samples/repositories?type=all

Let's start with our first category, the interaction samples.

Interaction Samples

Meta XR All-In-One (Interaction SDK) Sample
Links:
https://github.com/oculus-samples/Unity-InteractionSDK-Samples
https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657
Description: A comprehensive demo of Meta's XR Interaction SDK featuring core VR interactions like poking, grabbing, raycasting, UI, and locomotion, all working together. Perfect for understanding how to integrate both hands and controllers in one system.

First Hand
Link: https://github.com/oculus-samples/Unity-FirstHand
Description: A full VR game demo focused on hand-tracked interactions. It showcases a complete Unity experience using the Interaction SDK, with hand tracking as the main input and controller fallback.

XR Interaction Toolkit Examples
Link: https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
Description: Unity's official XR Interaction Toolkit samples showing how to implement locomotion, selection, grabbing, and UI interactions. A solid starting point for setting up the XR Origin and interactor/interactable components.

Move Fast
Link: https://github.com/oculus-samples/Unity-MoveFast
Description: A fast-paced VR fitness demo using hand tracking and the Interaction SDK. The sample shows how to build an energetic workout game with responsive, punch-based interactions.

Whisperer
Link: https://github.com/oculus-samples/voicesdk-samples-whisperer
Description: A voice-controlled VR experience demonstrating the Meta Voice SDK. Use voice commands as part of gameplay to learn how to integrate real-time voice recognition into your own projects.

Tilt Brush (Open Brush)
Link: https://github.com/icosa-foundation/open-brush
Description: An open-source continuation of Google's Tilt Brush. Lets users paint and sculpt in 3D space, making it an excellent reference for creative VR tools and spatial drawing.

Multiplayer & Social Samples

VR Multiplayer Sample (Unity XRI)
Link: https://docs.unity3d.com/Packages/com.unity.template.vr-multiplayer@2.0/manual/index.html
Description: Unity's official multiplayer VR template featuring a prebuilt scene, avatars, and a networking setup using Netcode for GameObjects. Great for learning multi-user interactions in VR.

Mixed Reality Multiplayer (XR Multiplayer) Sample
Link: https://docs.unity3d.com/Packages/com.unity.template.mr-multiplayer@1.0/manual/index.html
Description: A tabletop MR multiplayer demo that includes avatars, voice chat, and shared AR/VR spaces. Features games like balloon slingshot and chess while teaching MR networking and colocation concepts.

Tiny Golf
Link: https://github.com/Meta-Horizon-Start-Program/Tiny-Golf
Description: A free-to-play multiplayer mini-golf VR game created for the Meta Start program. Demonstrates basic physics, scoring, and networked multiplayer.

Ultimate Glove Ball
Link: https://github.com/oculus-samples/Unity-UltimateGloveBall
Description: A VR e-sport showcase demonstrating multiplayer, avatars, voice, and in-app purchases.
Integrates Photon networking and Oculus social APIs, making it a great reference for social competitive games.

Spirit Sling
Link: https://github.com/oculus-samples/Unity-SpiritSling
Description: A social MR tabletop game letting players place a shared game board in real space and invite friends to join. Highlights the Avatars SDK and MR colocated play.

Decommissioned
Link: https://github.com/oculus-samples/Unity-Decommissioned
Description: A social-deduction VR game inspired by titles like Among Us. Shows how to handle multiplayer lobbies, Oculus invites, and social APIs in a networked Unity project.

Mixed Reality (MR) Samples

The World Beyond (Presence Platform Demo)
Link: https://github.com/oculus-samples/Unity-TheWorldBeyond
Description: A full MR showcase combining Scene Understanding, Passthrough, hand tracking, voice input, and spatial audio. A must-see for developers building immersive MR scenes that blend real and virtual spaces.

Phanto (MR Reference App)
Links:
https://github.com/oculus-samples/Unity-Phanto
https://developers.meta.com/horizon/blog/phanto-unreal-showcase/
Description: An MR reference app focused on environmental awareness. Uses the Scene Mesh and MR APIs to blend gameplay with real-world geometry.

Unity Discover (featuring Drone Rage and others)
Links:
https://www.meta.com/en-gb/experiences/discover/7041851792509764/
https://github.com/oculus-samples/Unity-Discover
Description: A collection of MR showcase mini-experiences like Drone Rage. Demonstrates MR features including Passthrough, Spatial Anchors, and Shared Anchors in various game prototypes.

MR Motifs
Link: https://github.com/oculus-samples/Unity-MRMotifs
Description: A library of MR "motifs": small, reusable templates showcasing mechanics such as passthrough transitions, colocated multiplayer, and instant content placement.

Cryptic Cabinet
Link: https://github.com/oculus-samples/Unity-CrypticCabinet
Description: A short MR escape-room experience that adapts to your room's layout. Demonstrates interactive storytelling in mixed reality using environmental awareness.

Passthrough Camera API Samples
Link: https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
Description: A sample project demonstrating how to access and process the Quest's Passthrough camera feed for effects, object detection, and image manipulation.

Tools and Utilities

Asset Streaming
Link: https://github.com/oculus-samples/Unity-AssetStreaming
Description: An open-world streaming sample that shows how to dynamically load content using Addressables and LOD systems, ideal for maintaining performance in large VR environments.

Shader Prewarmer
Link: https://github.com/oculus-samples/Unity-ShaderPrewarmer
Description: A utility sample that preloads shader variants at startup to eliminate hitching or stutters when shaders first compile, an important optimization for smooth VR performance. (See the sketch at the end of this post for the core idea.)

Complete Game Showcase

Northstar
Link: https://github.com/oculus-samples/Unity-NorthStar
Description: A complete VR game showcasing advanced interaction and visual techniques for VR, featuring rope physics, narrative storytelling, lip sync, and more.
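Bonus: the core idea behind the Shader Prewarmer sample can be sketched with Unity's built-in ShaderVariantCollection API. This assumes you have already recorded a variant collection asset (e.g., saved from Project Settings > Graphics); the actual sample automates and generalizes this, so treat the snippet as a rough illustration:

```csharp
using UnityEngine;

// Minimal sketch: compile known shader variants up front so they don't
// hitch the first time they appear. Assumes a pre-recorded
// ShaderVariantCollection asset is assigned in the Inspector.
public class ShaderWarmup : MonoBehaviour
{
    [SerializeField] private ShaderVariantCollection variants;

    private void Start()
    {
        if (variants != null && !variants.isWarmedUp)
        {
            variants.WarmUp(); // synchronously compiles every variant in the collection
        }
    }
}
```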
Material Read-Only Issue & Shader Error (76 vs 66 keywords) Fix

Hey everyone, I ran into a recurring shader issue while working with Meta materials in Unity, especially when adding objects like a Cube from the Meta Building Blocks. The material was read-only, so I couldn't change its shader manually. Because of that, I kept getting a shader keyword mismatch between Meta/Lit and Universal Render Pipeline/Lit.

Here's how I fixed it ✅ I created a new material, then copied the material properties from the Meta material (so it had the same properties and color) and pasted them into the new one. Then I assigned the new material as the parent in a Material Variant by clicking the three dots next to the shader material and selecting "Create Variant for Renderer." Once I did that, the shader automatically switched to Universal Render Pipeline/Lit, and the error disappeared.

It worked perfectly, but I'm curious: is this considered good practice, or just a workaround? Sharing this here in case someone else faces the same problem while working with Meta Building Blocks and URP. Hope this helps someone!
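P.S. If you would rather do the same conversion from a script instead of the Inspector, a minimal (untested) sketch could look like this; it assumes the two shaders share enough property names for the copy to carry over:

```csharp
using UnityEngine;

// Minimal sketch of a scripted alternative to the manual steps above: clone a
// read-only Meta material into a writable URP/Lit material. Only properties
// with matching names (colors, textures, floats) are carried over.
public static class MaterialCloner
{
    public static Material CloneToUrpLit(Material readOnlyMetaMaterial)
    {
        var urpLit = new Material(Shader.Find("Universal Render Pipeline/Lit"));
        urpLit.CopyPropertiesFromMaterial(readOnlyMetaMaterial);
        return urpLit;
    }
}
```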
Lasertag! - an experiment in live scene understanding using DepthAPI

Hello, new dev forum :) I'm working on a project that uses the Depth API to map your space in real time (instead of relying on room setup) to decrease setup friction, lower time-to-fun, and increase the playspace area. Because the game scans as you play, it responds to opening and closing doors, moving furniture, and other changes to the environment. I'm also using depth to draw light against the environment; it looks really nice in dimly lit areas. I'm currently working on meshing so I can use it with Unity's NPC pathfinding. I'll be posting updates in this thread. You can learn more and download the game at https://anagly.ph
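For anyone who wants to try a similar live-depth setup, here is a minimal occlusion-only sketch using the Depth API from the Meta XR Core SDK. The class and member names (Meta.XR.EnvironmentDepth.EnvironmentDepthManager and friends) follow recent SDK versions and should be verified against yours; this covers only occlusion, not the meshing or lighting effects described above:

```csharp
using Meta.XR.EnvironmentDepth;
using UnityEngine;

// Minimal sketch: enable real-time environment depth so virtual objects are
// occluded by the physical room without a prior room scan. Member names
// follow recent Meta XR Core SDK versions and may differ in yours.
public class LiveDepthSetup : MonoBehaviour
{
    [SerializeField] private EnvironmentDepthManager depthManager;

    private void Start()
    {
        if (!EnvironmentDepthManager.IsSupported)
        {
            Debug.LogWarning("Environment depth is not supported on this device.");
            return;
        }
        // Soft occlusion feathers the edges where real geometry hides virtual objects.
        depthManager.OcclusionShadersMode = OcclusionShadersMode.SoftOcclusion;
        depthManager.enabled = true;
    }
}
```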