Stringscape: Turning Hand Distance into Pitch
I’m currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here. The concept is simple: you stretch a glowing “string” between your two hands, and the world-space distance between them controls the pitch.

Closer hands → higher pitch
Farther apart → lower pitch

The experience is designed to be more of a creative playground than a structured music tool. I’d love to hear your thoughts. It’s currently in Early Access on Quest if anyone is curious to try it. Thanks!
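For anyone curious how that mapping might look in Unity, here is a minimal sketch. The hand transforms, the looping AudioSource, and the distance/pitch tuning values are assumptions for illustration, not Stringscape's actual setup.

```csharp
using UnityEngine;

// Maps the world-space distance between two hands to audio pitch:
// closer hands -> higher pitch, farther apart -> lower pitch.
public class StringPitchController : MonoBehaviour
{
    [SerializeField] private Transform leftHand;   // e.g. a hand anchor on the camera rig
    [SerializeField] private Transform rightHand;
    [SerializeField] private AudioSource tone;     // looping AudioSource playing the string sound

    [SerializeField] private float minDistance = 0.1f;  // hands nearly touching (meters)
    [SerializeField] private float maxDistance = 1.2f;  // arms stretched wide (meters)
    [SerializeField] private float minPitch = 0.5f;     // pitch when hands are far apart
    [SerializeField] private float maxPitch = 2.0f;     // pitch when hands are close

    private void Update()
    {
        float distance = Vector3.Distance(leftHand.position, rightHand.position);

        // Normalize the distance to 0..1, then invert it so that a smaller
        // distance produces a higher pitch.
        float t = Mathf.InverseLerp(minDistance, maxDistance, distance);
        tone.pitch = Mathf.Lerp(maxPitch, minPitch, t);
    }
}
```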
Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned UnityXR/MR. I will graduate with a master’s degree in Art and Technology in May 2026. For my final project, I will be working on a Mixed Reality interaction for dyslexic learners with hand tracking, and I will be applying for the smart glasses grant for accessibility. I’ve been in education for the past 19 years, teaching students with dyslexia for the past ten. This video shows my first test; link and image below.

Mixed Reality Test, Quest 3: Mixed Reality Test, Unity and Meta SDK by Tina Wheeler
Lasertag! - an experiment in live scene understanding using DepthAPI
Hello new dev forum :) I'm working on a project that uses the DepthAPI to map your space in real time (instead of relying on Room Setup) to decrease setup friction, lower time-to-fun, and increase playspace area. Because the game scans as you play, it responds to opening and closing doors, moving furniture, and other changes to the environment. I'm also using depth to draw light against the environment, which looks really nice in dimly lit areas. I'm currently working on meshing so I can use the scan with Unity's NPC pathfinding (see the sketch below). I'll be posting updates in this thread. You can learn more and download the game at https://anagly.ph
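Not the author's code, but here is roughly what the meshing-to-pathfinding step could look like, assuming the scanned environment geometry ends up parented under a NavMeshSurface from Unity's AI Navigation package (com.unity.ai.navigation); the rebuild interval is an arbitrary tuning value.

```csharp
using Unity.AI.Navigation;  // NavMeshSurface, from com.unity.ai.navigation
using UnityEngine;

// Periodically rebuilds the NavMesh over the live-scanned environment mesh
// so NPC pathfinding keeps up with doors opening and furniture moving.
public class ScanNavMeshUpdater : MonoBehaviour
{
    [SerializeField] private NavMeshSurface surface;       // set to collect child mesh geometry
    [SerializeField] private float rebuildInterval = 2f;   // seconds between rebuilds

    private float timer;

    private void Update()
    {
        timer += Time.deltaTime;
        if (timer < rebuildInterval) return;
        timer = 0f;

        // Synchronous rebuild; if this causes frame hitches, the async
        // surface.UpdateNavMesh(surface.navMeshData) variant can be used instead.
        surface.BuildNavMesh();
    }
}
```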
Cool MR Projects Part 1: Back to the Mixed Reality!
Welcome to the first part of our series, Cool MR Projects! In this post, I will review my own Mixed Reality (MR) project, "Back to the Mixed Reality," a fan-made experience that demonstrates the potential of MR. This project allows users to virtually drive the iconic DeLorean Time Machine and experience time travel right on their floor. Mixed Reality blends the physical and digital worlds, creating new environments where physical and digital objects co-exist and interact in real time. If you've ever wondered why I built this project and what challenges we faced, read on!

The Motivation: Why Build a Mixed Reality Time Machine?
The inspiration for "Back to the Mixed Reality" stems directly from childhood dreams. I have always been fascinated by the concept of time travel, specifically from the Back to the Future movie series. As a child, I imagined time-traveling on my bicycle, believing that if I reached a certain speed, I would travel to the time I was dreaming of. Now, using advancements in Mixed Reality on Meta Quest 3, I’ve been able to turn this childhood fantasy into a real experience. Beyond personal passion, the project serves several key goals:

- Demonstrating MR potential: The project showcases the incredible potential of Mixed Reality to bring childhood dreams to life.
- Inspiring the community: It is a passion project born from a love for the Back to the Future series and a fascination with MR, designed to inspire, entertain, and educate enthusiasts and developers about the magic of MR technology.
- Driving attention to spatial computing: My biggest intention in building this fan-made project is to draw more attention to Mixed Reality, Virtual Reality, and spatial computing.

The experience involves spawning the DeLorean Time Machine on your floor, setting a destination time (e.g., 20 seconds into the future), driving the vehicle to reach the required time-travel speed, and then waiting for it to return from the future.

The Challenges: Navigating the New Frontier of MR Development
The core mechanics required detailed visual effects (VFX) built with Unity Timelines, Particles, and Shaders to replicate the time-travel sequence from the first movie. We also used FMOD to create adaptive car sound effects that change with the speed and RPM of the car engine, integrating them into Unity using the FMOD Unity plugin.
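As a hedged illustration of that kind of parameter hookup (the project's actual event and parameter names are not public, so "RPM" and "Speed" here are assumptions, and FMOD for Unity 2.02+ is assumed for the EventReference type):

```csharp
using FMOD.Studio;
using FMODUnity;
using UnityEngine;

// Drives an FMOD Studio engine event whose parameters control the sound;
// the actual pitch/mix curves live in the FMOD Studio project.
public class EngineAudio : MonoBehaviour
{
    [SerializeField] private EventReference engineEvent;  // assumed event with "RPM" and "Speed" parameters

    private EventInstance engineInstance;

    private void Start()
    {
        engineInstance = RuntimeManager.CreateInstance(engineEvent);
        engineInstance.set3DAttributes(RuntimeUtils.To3DAttributes(transform.position));
        engineInstance.start();
    }

    // Called by the car controller each frame with the current engine state.
    public void UpdateEngine(float rpm, float speed)
    {
        engineInstance.setParameterByName("RPM", rpm);
        engineInstance.setParameterByName("Speed", speed);
        engineInstance.set3DAttributes(RuntimeUtils.To3DAttributes(transform.position));
    }

    private void OnDestroy()
    {
        engineInstance.stop(STOP_MODE.ALLOWFADEOUT);
        engineInstance.release();
    }
}
```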
However, integrating Mixed Reality functionality introduced several unique hurdles:

1. Model Optimization
After importing the detailed 3D models of the iconic DeLorean Time Machine and the remote controller (inspired by Doc’s remote in the first movie) into Unity, we had to perform heavy optimizations because the models were "insanely detailed."

2. Testing and Iteration Time
Integrating MR functionality brought new challenges and design considerations. Testing and iterating with Room Setup was very time-consuming because every change had to be tested inside the headset. Fortunately, this challenge was mitigated by discovering the Meta XR Simulator. This tool literally saved hours, allowing me to test the XR project without constantly wearing the headset by simulating headset movement and touch controller input with a keyboard, mouse, or game controller.

3. Ensuring Realistic Collisions and Boundaries
A crucial challenge was ensuring the virtual car remained within the room's boundaries and collided realistically with the physical floor and real-life objects. To achieve this, we relied on Scene Understanding, a cutting-edge technology that provides a comprehensive Scene Model: a geometric and semantic representation of the physical space. We had to:

- Identify room elements (like Floor, Ceiling, and Wall Face) using the OVRSemanticClassification class.
- Add colliders to the room's walls and floors, which required careful adjustment and testing.
- Tag the corresponding colliders based on the type of room element so the car properly collided with real-world objects (see the sketch below).
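A minimal sketch of that classification-and-tagging step, assuming the classic OVRSceneManager workflow where each scene element carries an OVRSemanticClassification component (the tag names are hypothetical and must exist in the project's Tag Manager):

```csharp
using UnityEngine;

// Walks every classified scene element and gives floors and walls a physics
// collider plus a tag so gameplay code can tell them apart.
public class SceneColliderTagger : MonoBehaviour
{
    private void Start()
    {
        foreach (var classification in FindObjectsOfType<OVRSemanticClassification>())
        {
            if (classification.Contains(OVRSceneManager.Classification.Floor))
                Apply(classification.gameObject, "Floor");
            else if (classification.Contains(OVRSceneManager.Classification.WallFace))
                Apply(classification.gameObject, "Wall");
        }
    }

    private static void Apply(GameObject sceneElement, string tag)
    {
        // Add a collider only if the scene element doesn't already have one.
        if (!sceneElement.TryGetComponent<Collider>(out _))
            sceneElement.AddComponent<BoxCollider>();
        sceneElement.tag = tag;  // hypothetical tags: "Floor", "Wall"
    }
}
```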
4. Managing the App Flow and Room Setup
For Mixed Reality experiences on Meta Quest 3, the user must complete some initial settings before the experience loads; it will not work unless the user has performed a Room Setup. We created a Lobby Scene to manage these MR procedures:

- The user must first grant permission to access spatial data, which the app needs in order to use Scene Understanding.
- If the user has not completed the Room Setup, they are prompted to do so before the game scene is loaded.

We built the general app flow using knowledge gleaned from Project Phanto (a Unity-based Mixed Reality reference app from Meta demonstrating critical features like the scene mesh and overall app flow). This included informing the user about Scene Mesh visualization once the setup is complete.

Join the Adventure!
With all these mechanics, I successfully built a working Time Machine. I hope this project drives more attention to the magic of Mixed Reality. If you want to play the experience on your own Quest 3, you can get the build for FREE! You can also watch the deep-dive YouTube video here! Let's drive into the future, together!

– Tevfik (Creator of Back to the Mixed Reality)

Wandtag (Lasertag fork) for a university student orientation event
Every year, Purdue University accepts proposals for experiences for incoming freshmen as part of its orientation program (the Fusion Studio BGR entertainment challenge). My submission this year was a fork of my laser tag project, swapping the laser guns for voice-activated magic wands. Watch the video with sound on! 🔊

Players yell one of three spells − fireball, lightning, or shield − and compete to score the most eliminations against the opposing team in two minutes. This version brings all the tech from the base game and adds voice-activated spells using Vosk, an on-device speech recognition model, plus a spatial localization system that uses the camera API to track multiple AprilTags around the room (instead of spatial anchors). I can't release it publicly in its current state, but I'll likely merge the wand weapon back into the base game in the future.

more details here >>>
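Not the project's actual code, but here is one way the spell keywords could be spotted with Vosk in Unity, assuming the Vosk C# bindings and a small English model unpacked into StreamingAssets; the grammar string restricts recognition to the three spells plus [unk] for everything else.

```csharp
using UnityEngine;
using Vosk;  // C# bindings from the Vosk speech recognition toolkit

// Grammar-constrained keyword spotting for the three spell words.
public class SpellRecognizer : MonoBehaviour
{
    private const int SampleRate = 16000;  // small Vosk models expect 16 kHz audio

    private Model model;
    private VoskRecognizer recognizer;
    private AudioClip micClip;
    private int lastSample;

    private void Start()
    {
        // Assumed model path; any compatible Vosk model directory works.
        model = new Model(Application.streamingAssetsPath + "/vosk-model-small-en-us");
        recognizer = new VoskRecognizer(model, SampleRate,
            "[\"fireball\", \"lightning\", \"shield\", \"[unk]\"]");
        micClip = Microphone.Start(null, true, 10, SampleRate);
    }

    private void Update()
    {
        int pos = Microphone.GetPosition(null);
        int count = pos - lastSample;
        if (count <= 0) { lastSample = pos; return; }  // ring-buffer wrap; handling simplified

        // Pull the newest mono samples and convert Unity's -1..1 floats to 16-bit PCM.
        var floats = new float[count];
        micClip.GetData(floats, lastSample);
        lastSample = pos;

        var pcm = new short[count];
        for (int i = 0; i < count; i++)
            pcm[i] = (short)(Mathf.Clamp(floats[i], -1f, 1f) * short.MaxValue);

        // AcceptWaveform returns true once an utterance ends;
        // Result() is JSON like {"text": "fireball"}.
        if (recognizer.AcceptWaveform(pcm, pcm.Length))
            Debug.Log("Spell heard: " + recognizer.Result());
    }
}
```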
Recreating Meta’s new AR glasses on my Quest
Hey guys, here’s a cool project I did last week that I wanted to share: recreating Meta’s new AR glasses on my Quest. 😎 This project reproduces the new wristband, using microgestures to navigate the UI. I also built my own hand-tracking implementation for the pinch-and-twist mechanism, which controls the volume of the audio player and the zoom of the camera, just like in the keynote. But my favourite addition is definitely the contextual AI that lets me send whatever I’m looking at to an AI and instantly get more information. This was pretty fun to do, and it also helped me think about how future experiences could be designed for this new device!

https://www.linkedin.com/feed/update/urn:li:activity:7377329573287391232/
https://www.linkedin.com/feed/update/urn:li:activity:7376702804146429953/
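For anyone wanting to experiment with something similar, here is a rough sketch of a pinch-and-twist volume control, assuming an OVRHand from the Meta XR Core SDK; the roll computation is a simple approximation of forearm twist, not the author's actual implementation.

```csharp
using UnityEngine;

// While the index pinch is held, twisting the hand around its forward axis
// adjusts the volume, like turning a knob.
public class PinchTwistVolume : MonoBehaviour
{
    [SerializeField] private OVRHand hand;                      // e.g. the right OVRHand
    [SerializeField] private AudioSource audioPlayer;
    [SerializeField] private float degreesForFullRange = 180f;  // twist needed for 0 -> 1 volume

    private bool twisting;
    private Quaternion lastRotation;

    private void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching && !twisting)
        {
            twisting = true;                          // gesture starts on pinch
            lastRotation = hand.transform.rotation;
        }
        else if (!pinching)
        {
            twisting = false;                         // gesture ends on release
            return;
        }

        // Signed per-frame roll: project the rotation delta onto the hand's
        // forward axis as an approximation of twist around the forearm.
        Quaternion delta = hand.transform.rotation * Quaternion.Inverse(lastRotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        float roll = angle * Vector3.Dot(axis, hand.transform.forward);

        audioPlayer.volume = Mathf.Clamp01(audioPlayer.volume + roll / degreesForFullRange);
        lastRotation = hand.transform.rotation;
    }
}
```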