Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned Unity XR/MR along the way. I will graduate with a master's degree in Art and Technology in May 2026. For my final project I will be building a Mixed Reality interaction for dyslexic learners that uses hand tracking, and I will be applying for the smart glasses grant for accessibility. I've been in education for the past 19 years, teaching students with dyslexia for the past ten. This video shows my first test; link and image below.
Mixed Reality Test, Quest 3: Mixed Reality Test, Unity and Meta SDK by Tina Wheeler
Building NE9: A Runtime and App for AI-Driven Interactive 3D and XR Worlds
Hey everyone, I wanted to share what I’m currently building and open it up for discussion.

I’m developing NE9, also known as NastyEngine 9. It’s a modular, real-time runtime designed to integrate AI systems, 3D environments, and interactive applications into a single live pipeline. Alongside NE9, I’m building a companion app that interfaces directly with the runtime. The goal is to use it as a control and integration layer where scene logic, agents, and interaction can be composed and updated live instead of being locked into a traditional editor workflow.

The core idea is to treat AI, rendering, networking, and interaction as runtime-orchestrated systems rather than isolated tools (a purely hypothetical sketch of that idea follows at the end of this post). This approach makes it easier to experiment, iterate, and eventually extend into XR and VR environments. This is an active build and the architecture is evolving quickly. I’ll be sharing progress, experiments, and lessons learned as things continue to come together.

The screenshot shows where we are right now in the development process. This was our first full session using a Meta Quest 3 connected to our desktop via USB, running the Meta desktop app as our development workspace. We were viewing and working directly with our existing tools inside the headset to get a real sense of scale, comfort, and workflow. It was our first serious hands-on development session this way, and getting our feet wet was a lot of fun. Even just working from the desktop inside the headset made it clear that this is a platform we’re excited to build for. We’re looking forward to transitioning from desktop-based development into deeper Horizon and native XR workflows as NE9 continues to evolve.

If you’d like to connect, feel free to check out my LinkedIn. Thanks for stopping by, and I’m excited to see what we can build together.
https://www.linkedin.com/in/daniel-harris-0745b8374/
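For readers wondering what "runtime-orchestrated systems" could look like in practice, here is a purely hypothetical C# sketch. NE9's actual API is not public, so every name below (IRuntimeSystem, Orchestrator) is invented for illustration: subsystems such as AI, rendering, and networking register with a central orchestrator and can be added or replaced while the runtime is live.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical illustration only; these types are NOT NE9's real API.
public interface IRuntimeSystem
{
    string Name { get; }
    void Tick(float deltaTime); // called once per frame by the orchestrator
}

public sealed class Orchestrator
{
    private readonly Dictionary<string, IRuntimeSystem> _systems = new();

    // Systems can be registered, replaced, or removed while the app is live,
    // instead of being wired up once in an editor.
    public void Register(IRuntimeSystem system) => _systems[system.Name] = system;
    public void Unregister(string name) => _systems.Remove(name);

    public void Tick(float deltaTime)
    {
        foreach (var system in _systems.Values)
            system.Tick(deltaTime);
    }
}
```

The appeal of this shape is that composing a scene becomes a registration call at runtime rather than an edit-compile-deploy cycle.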
Cool MR Projects Part 1: Back to the Mixed Reality!
Welcome to the first part of our series, Cool MR Projects! In this post, I will review my own Mixed Reality (MR) project, "Back to the Mixed Reality," a fan-made experience that demonstrates the potential of MR. This project allows users to virtually drive the iconic DeLorean Time Machine and experience time travel right on their floor. Mixed Reality (MR) blends the physical and digital worlds, creating new environments where physical and digital objects co-exist and interact in real time. If you've ever wondered why I built this project and what challenges we faced, read on!

The Motivation: Why Build a Mixed Reality Time Machine?
The inspiration for "Back to the Mixed Reality" stems directly from childhood dreams. I have always been fascinated by the concept of time travel, specifically from the Back to the Future movie series. As a child, I imagined time-traveling on my bicycle, believing that if I reached a certain speed, I would travel to the time I was dreaming of. Now, using advancements in Mixed Reality with Meta Quest 3, I've been able to turn this childhood fantasy into a real experience.

Beyond personal passion, the project serves several key goals:
- Demonstrating MR Potential: This project showcases the incredible potential of Mixed Reality to bring childhood dreams to life.
- Inspiring the Community: It is a passion project born from a love for the Back to the Future series and a fascination with MR's potential, designed to inspire, entertain, and educate enthusiasts and developers about the magic of MR technology.
- Driving Attention to Spatial Computing: My biggest intention in building this fan-made project is to draw more attention to Mixed Reality, Virtual Reality, and spatial computing.

The experience involves spawning the DeLorean Time Machine on your floor, setting a destination time (e.g., 20 seconds into the future), driving the vehicle to reach the required time travel speed, and then waiting for it to return from the future.

The Challenges: Navigating the New Frontier of MR Development
The core mechanics required detailed visual effects (VFX) built with Unity Timelines, Particles, and Shaders to replicate the time travel sequence from the first movie. We also used FMOD to create adaptive car sound effects that change with the speed and RPM of the car engine, integrating them into Unity through the FMOD Unity plugin. However, integrating Mixed Reality functionality introduced several unique hurdles:

1. Model Optimization
After importing the detailed 3D models of the iconic DeLorean Time Machine and the remote controller (inspired by Doc's remote in the first movie) into Unity, we had to perform heavy optimizations because the models were "insanely detailed".

2. Testing and Iteration Time
Integrating MR functionality brought new challenges and design considerations. Testing and iterating with Room Setup was very time-consuming because every change required testing within the headset. Fortunately, this was mitigated by discovering the Meta XR Simulator. This tool saved hours of time, allowing me to test the XR project without constantly wearing the headset by simulating headset movement and touch controller input with a keyboard, mouse, or game controller.

3. Ensuring Realistic Collisions and Boundaries
A crucial challenge was ensuring the virtual car remained within the room's boundaries and collided realistically with the physical floor and real-life objects. To achieve this, we relied on Scene Understanding, a technology providing a comprehensive Scene Model: a geometric and semantic representation of the physical space. We had to:
- Identify room elements (Floor, Ceiling, Wall Face, etc.) using the OVRSemanticClassification class.
- Add colliders to the room's walls and floors, which required careful adjustment and testing.
- Tag the corresponding colliders based on the type of room element to ensure the car properly collided with real-world objects.
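As a rough illustration of that classification-and-tagging step, here is a minimal sketch assuming the classic OVRSceneManager workflow; it is not the project's actual source, and tags such as "Floor" and "Wall" would need to already exist in Unity's Tag Manager:

```csharp
using UnityEngine;

// Minimal sketch assuming the OVRSceneManager-era Scene API; not the
// project's actual source. "Floor" and "Wall" must be defined as tags
// in Unity's Tag Manager before this runs.
public class SceneColliderTagger : MonoBehaviour
{
    // Call this after the scene model has finished loading.
    public void TagLoadedAnchors()
    {
        foreach (var classification in FindObjectsOfType<OVRSemanticClassification>())
        {
            var go = classification.gameObject;

            if (classification.Contains(OVRSceneManager.Classification.Floor))
                go.tag = "Floor";
            else if (classification.Contains(OVRSceneManager.Classification.WallFace))
                go.tag = "Wall";
            else
                continue; // this sketch ignores ceilings, furniture, etc.

            // Give the anchor a physics surface the car can collide with,
            // sized roughly to the scene plane when one is available.
            var box = go.GetComponent<BoxCollider>();
            if (box == null) box = go.AddComponent<BoxCollider>();
            var plane = go.GetComponent<OVRScenePlane>();
            if (plane != null)
                box.size = new Vector3(plane.Dimensions.x, plane.Dimensions.y, 0.01f);
        }
    }
}
```

The car's collision handlers can then branch on collider.tag to treat the floor and the walls differently.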
4. Managing the App Flow and Room Setup
For Mixed Reality experiences using Meta Quest 3, the user must complete some initial settings before the experience loads; it will not work without a Room Setup. We created a Lobby Scene to manage these MR procedures:
- The user must first give permission to access spatial data, which is necessary for the app to utilize Scene Understanding.
- If the user has not completed the Room Setup, they are prompted to do so before the game scene is loaded.
We utilized knowledge gleaned from Project Phanto (a Unity-based Mixed Reality reference app from Meta demonstrating critical features like scene mesh and general app flow) to build the general app flow. This included informing the user about Scene Mesh visualization once the setup is complete.
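Here is a minimal sketch of what such a lobby flow can look like with the OVRSceneManager-era API; the "Game" scene name and the exact wiring are illustrative guesses, not the project's actual code:

```csharp
using UnityEngine;
using UnityEngine.Android;
using UnityEngine.SceneManagement;

// Minimal lobby-scene sketch assuming the OVRSceneManager-era API.
// The "Game" scene name and this exact flow are illustrative only.
public class LobbyFlow : MonoBehaviour
{
    private const string ScenePermission = "com.oculus.permission.USE_SCENE";

    [SerializeField] private OVRSceneManager sceneManager;

    private void Awake()
    {
        // 1. Ask for spatial-data permission so Scene Understanding can work.
        if (!Permission.HasUserAuthorizedPermission(ScenePermission))
            Permission.RequestUserPermission(ScenePermission);

        // 2. React to the scene model load that OVRSceneManager attempts on
        //    startup: loaded -> enter the game; nothing to load -> Room Setup.
        sceneManager.SceneModelLoadedSuccessfully += () => SceneManager.LoadScene("Game");
        sceneManager.NoSceneModelToLoad += () => sceneManager.RequestSceneCapture();
    }
}
```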
Join the Adventure!
With all these mechanics, I successfully built a working Time Machine. I hope this project drives more attention to the magic of Mixed Reality. If you want to play the experience on your own Quest 3, you can get the build for FREE! You can also watch the deep dive YouTube video here! Let's drive into the future, together!
– Tevfik (Creator of Back to the Mixed Reality)

Recreating Meta’s new AR glasses on my Quest
Hey guys, here’s a cool project I did last week that I wanted to share: recreating Meta’s new AR glasses on my Quest. 😎

This project reproduces the new wristband, using microgestures to navigate through the UI. I also built my own hand tracking implementation for the pinch and twist mechanism, which controls the volume of the audio player and the zoom of the camera, just like in the keynote. But my favourite addition is definitely the contextual AI that lets me send whatever I’m looking at to an AI and instantly get more information. (A rough sketch of the pinch-and-twist idea follows after the links below.)

This was pretty fun to do, but it also helped me think about how future experiences could be designed for this new device!
https://www.linkedin.com/feed/update/urn:li:activity:7377329573287391232/
https://www.linkedin.com/feed/update/urn:li:activity:7376702804146429953/
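For anyone curious how a pinch-and-twist control can be approximated with the Meta XR SDK's hand tracking, here is a rough sketch; it is a reconstruction of the idea rather than the author's implementation, and "volume" stands in for whatever value the gesture drives:

```csharp
using UnityEngine;

// Rough sketch of a pinch-and-twist dial; a reconstruction of the idea,
// not the author's implementation. Assumes the Meta XR SDK's OVRHand.
public class PinchTwistDial : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private float degreesPerFullRange = 180f;

    private float volume = 0.5f; // 0..1, the value the gesture drives
    private Quaternion lastRotation;
    private bool wasPinching;

    private void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching && wasPinching)
        {
            // Rotation of the hand since the previous frame.
            Quaternion delta = hand.transform.rotation * Quaternion.Inverse(lastRotation);
            delta.ToAngleAxis(out float angle, out Vector3 axis);
            if (angle > 180f) angle -= 360f; // map to [-180, 180]

            // Keep only the twist component about the hand's forward axis.
            float twist = angle * Vector3.Dot(axis, hand.transform.forward);
            volume = Mathf.Clamp01(volume + twist / degreesPerFullRange);
        }

        lastRotation = hand.transform.rotation;
        wasPinching = pinching;
    }
}
```

The same signed twist value could just as easily drive the camera zoom instead of the volume.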