VR Music Trainer
Hello! I wanted to share my project: Vonias/VR-Music-Trainer. I have built a small application that displays sheet music in the Quest headset for a musician to read and perform. The app has a heads-up display that shows a tuner and a dynamics indicator (how loudly you're playing). I'm really happy with what I have so far, but it lacks appeal; it's still very basic. Check it out!

Unified Social Sharing Between Facebook and Instagram
Dear Meta Product Team,

I hope you are doing well. My name is Veeranna Angadi, and I would like to share a product idea that could significantly improve the user experience across your platforms. Users actively engage with both Facebook and Instagram, yet the two are still treated as separate ecosystems when it comes to social sharing. For example, if a user enjoys a reel on Facebook and wants to share it with their Instagram friends, the only available option is sharing via a link. This creates friction and reduces seamless engagement.

Proposed Idea: Unified Social Graph & Cross-Platform Sharing
I propose introducing a feature that allows:
- Integration of Facebook friends and Instagram followers into a unified social layer (with user consent and privacy controls)
- Direct sharing of content (reels, posts, stories) across both platforms without needing external links
- An optional "Cross-Platform Audience" selection when sharing content
- Smart suggestions (e.g., "Share this reel with your Instagram close friends")

Benefits:
- Increased content engagement and retention within Meta platforms
- Reduced friction in content sharing
- Stronger ecosystem integration between Facebook and Instagram
- Improved user satisfaction and time spent in the apps

Additional Enhancement: A recommendation engine could identify the platform on which a user's friends are more active and suggest sharing there for better reach.

I believe this idea aligns with Meta's vision of building connected experiences across platforms. If it is considered valuable, I would appreciate recognition or the opportunity to collaborate further. Thank you for your time and consideration; I would be happy to discuss the idea in more detail if required.

Warm regards,
Veeranna Angadi
Bangalore, India

Model Everything, I Guess?
Alright, Meta AI, hear me out. A little background: I'm a small-scale web developer, videographer, and photographer with about 10 years of experience. Over the past year, I took something of a gap year to dive into AI and research the technology. This led me to a project that's in line with my expertise: real estate and architectural photography, which is basically about capturing spaces.

My idea is to train a model on a combination of Instagram and Facebook images (with their tagged locations, of course) and 3D terrain models built from multi-date satellite imagery (and, in the future, property listing images). By combining these, we could create a virtual, living, breathing digital twin of the entire real world. The model would map both the outside world and most public indoor spaces, likely requiring a staggered rollout.

I know this would require a metric ton of servers. I'm currently tinkering with a small-scale prototype, but I don't have the resources to scale this monster. The biggest limiting factor is being from South Africa (born in '98). Our government isn't exactly tech-friendly; instead of encouraging new tech, it creates roadblocks and regulations that stifle innovation, all to protect corrupt, archaic monopolies. It's infuriating.

Meta already has all the data needed for this model. So, if anyone at Meta AI sees this, please build this thing. And hey, maybe hook a man up with a tiny royalty? 😉

P.S. I would love to build this with you. Seriously, hit me up.

Kind Regards,
T
South Africa

Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned Unity XR/MR. I will graduate with a master's degree in Art and Technology in May 2026. For my final project I will be building a Mixed Reality interaction for dyslexic learners using hand tracking, and I will be applying for the smart glasses grant for accessibility. I've been in education for the past 19 years, teaching students with dyslexia for the past ten. This video shows my first test; link and image below.

Mixed Reality Test, Quest 3: Mixed Reality Test, Unity and Meta SDK by Tina Wheeler

Starting a New Series: Building a Social VR Game From Scratch (Baby VR)
I just launched Episode 1 of a new video series where I'm building a Social VR game from scratch: live, in public, and together with the community. The project is called Baby VR: a chaotic, social VR playground where players embody babies with Gorilla-Tag-style locomotion, tiny legs, and big personalities.

In this first episode, I break down the 4-step process I use to start any VR project:
1. Ideation: finding a spark worth building
2. Validation: checking the market before writing code
3. Forever-Updatable Test: designing for longevity
4. Realistic Scope: defining an MVP you can actually ship
This same process applies whether you're building your first VR prototype or scaling a social experience.

What makes this series different: 👉 It's community-driven. Viewers help decide the game modes, mechanics, and even the first map, and their ideas may ship in the real game. If you're working in VR/XR/game development, or curious how social VR titles are actually planned and scoped, I think you'll find this useful.

🎥 Episode 1 is live: https://www.youtube.com/watch?v=kIjpDFuGScE

I'd also love to hear your thoughts: what's the most important thing you consider when starting a multiplayer or social product?

#SocialVR #VRDevelopment #GameDevelopment #Unity #PhotonFusion #MetaQuest #IndieDev #XR #Startups

Cool MR Projects Part 1: Back to the Mixed Reality!
Welcome to the first part of our series, Cool MR Projects! In this post, I will be reviewing my own Mixed Reality (MR) project, "Back to the Mixed Reality," a fan-made experience that demonstrates the potential of MR. This project allows users to virtually drive the iconic DeLorean Time Machine and experience time travel right on their floor. Mixed Reality blends the physical and digital worlds, creating new environments where physical and digital objects co-exist and interact in real time. If you've ever wondered why I built this project and what challenges we faced, read on!

The Motivation: Why Build a Mixed Reality Time Machine?

The inspiration for "Back to the Mixed Reality" stems directly from childhood dreams. I have always been fascinated by the concept of time travel, specifically from the Back to the Future movie series. As a child, I imagined time-traveling on my bicycle, believing that if I reached a certain speed, I would travel to the time I was dreaming of. Now, using advancements in Mixed Reality, I've been able to turn this childhood fantasy into a real experience.

Beyond personal passion, the project serves several key goals:
- Demonstrating MR Potential: The project showcases what Mixed Reality can do to bring childhood dreams to life.
- Inspiring the Community: It is a passion project born from a love for the Back to the Future series, designed to inspire, entertain, and educate enthusiasts and developers about the magic of MR technology.
- Driving Attention to Spatial Computing: My biggest intention in building this fan-made project is to draw more attention to Mixed Reality, Virtual Reality, and spatial computing.
The experience itself involves spawning the DeLorean Time Machine on your floor, setting a destination time (e.g., 20 seconds into the future), driving the vehicle until it reaches the required time-travel speed, and then waiting for it to return from the future.

The Challenges: Navigating the New Frontier of MR Development

The core mechanics required detailed visual effects (VFX) built with Unity Timelines, Particles, and Shaders to replicate the time-travel sequence from the first movie. We also used FMOD to create adaptive car sound effects that change with the speed and RPM of the car engine, integrating them into Unity via the FMOD Unity plugin. However, integrating Mixed Reality functionality introduced several unique hurdles:

1. Model Optimization
After importing the insanely detailed 3D models of the iconic DeLorean Time Machine and the remote controller (inspired by Doc's remote in the first movie) into Unity, we had to perform heavy optimizations to keep the frame rate acceptable.

2. Testing and Iteration Time
Testing and iterating with Room Setup was very time-consuming, because every change required testing inside the headset. Fortunately, this was mitigated by discovering the Meta XR Simulator. This tool saved real time, letting us test the XR project without constantly wearing the headset by simulating headset movement and Touch controller input with a keyboard, mouse, or game controller.

3. Ensuring Realistic Collisions and Boundaries
A crucial challenge was keeping the virtual car within the room's boundaries and giving it realistic collisions with the physical floor and real-life objects. To achieve this, we relied on Scene Understanding, which provides a comprehensive Scene Model: a geometric and semantic representation of the physical space.
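To make the Scene Understanding step above concrete, here is a rough, hypothetical Unity C# sketch, not the project's actual code. It assumes the Meta XR Core SDK's OVRSceneAnchor, OVRSemanticClassification, and OVRSceneManager.Classification APIs; the tag names and the TagRoomElements entry point are our own illustrative conventions.

```csharp
using UnityEngine;

// Hedged sketch: give each scene anchor a collider and a tag based on its
// semantic label, so the car's collision handler can tell floors from walls.
public class RoomColliderTagger : MonoBehaviour
{
    // Intended to run once the scene model has loaded (e.g., from the
    // OVRSceneManager scene-loaded callback).
    public void TagRoomElements()
    {
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            if (!anchor.TryGetComponent(out OVRSemanticClassification classification))
                continue;

            if (classification.Contains(OVRSceneManager.Classification.Floor))
                AddCollider(anchor.gameObject, "RoomFloor");
            else if (classification.Contains(OVRSceneManager.Classification.WallFace))
                AddCollider(anchor.gameObject, "RoomWall");
            else if (classification.Contains(OVRSceneManager.Classification.Ceiling))
                AddCollider(anchor.gameObject, "RoomCeiling");
        }
    }

    private static void AddCollider(GameObject go, string tag)
    {
        // A thin box collider approximates the planar room element.
        if (!go.TryGetComponent<BoxCollider>(out _))
            go.AddComponent<BoxCollider>();
        go.tag = tag; // tags must also be declared in the project's Tag Manager
    }
}
```

In practice the tags ("RoomFloor", etc.) would need to exist in Unity's Tag Manager, and the collider sizes would still need the careful per-room adjustment described here.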
We had to:
- Identify room elements (Floor, Ceiling, Wall Face, etc.) using the OVRSemanticClassification class.
- Add colliders to the room's walls and floors, which required careful adjustment and testing.
- Tag each collider according to the type of room element, so the car would collide correctly with real-world objects.

4. Managing the App Flow and Room Setup
On Meta Quest 3, the user must complete some initial setup before a Mixed Reality experience can load; the experience will not work without a completed Room Setup. We created a Lobby Scene to manage these MR procedures:
- The user must first grant permission to access spatial data, which the app needs in order to use Scene Understanding.
- If the user has not completed Room Setup, they are prompted to do so before the game scene is loaded.
We built the general app flow using knowledge gleaned from Project Phanto, a Unity-based Mixed Reality reference app from Meta that demonstrates critical features like the scene mesh and overall app flow. This included showing the user the Scene Mesh visualization once setup is complete.

Join the Adventure!

With all these mechanics, we successfully built a working Time Machine. I hope this project draws more attention to the magic of Mixed Reality. If you want to play the experience on your own Quest 3, you can get the build for FREE! I encourage you to share your gameplay on social media to help spread the word about MR. Let's drive into the future, together!

– Tevfik (Creator of Back to the Mixed Reality)
Personal VR/AR Device
So, I have seen the Apple and Android devices that have recently come out. If Meta is planning to develop its own, it could consider making the bulky front section able to split in half, with the two halves swinging back over the ears to act as speakers while staying firmly attached to the head strap. This would free up the wearer's vision without taking off the headset, and let them re-immerse by snapping the halves back together. I would recommend using magnets to hold the headset together, and comparing the ergonomics of different magnet strengths.