Unified Social Sharing Between Facebook and Instagram
Dear Meta Product Team,

I hope you are doing well. My name is Veeranna Angadi, and I would like to share a product idea that could significantly improve the user experience across your platforms.

Users actively engage with both Facebook and Instagram, yet the two are still treated as separate ecosystems when it comes to social sharing. For example, if a user enjoys a reel on Facebook and wants to share it with their Instagram friends, the only available option is sharing via a link. This creates friction and reduces seamless engagement.

Proposed Idea: Unified Social Graph & Cross-Platform Sharing

I propose introducing a feature that allows:
- Integration of Facebook friends and Instagram followers into a unified social layer (with user consent and privacy controls)
- Direct sharing of content (reels, posts, stories) across both platforms without needing external links
- An optional “Cross-Platform Audience” selection while sharing content
- Smart suggestions (e.g., “Share this reel with your Instagram close friends”)

Benefits:
- Increased content engagement and retention within Meta platforms
- Reduced friction in content sharing
- Stronger ecosystem integration between Facebook and Instagram
- Improved user satisfaction and time spent in the apps

Additional Enhancement: A recommendation engine could identify which platform a user’s friends are more active on and suggest optimized sharing for better reach.

I believe this idea aligns with Meta’s vision of building connected experiences across platforms. If it is considered valuable, I would appreciate recognition or the opportunity to collaborate further.

Thank you for your time and consideration. I would be happy to discuss this idea in more detail if required.

Warm regards,
Veeranna Angadi
Bangalore, India

Building a VR Burr Puzzler: Interlocked
Hi everyone! For a while now I've been working on a burr-puzzle mechanic in VR, specifically for Meta Quest. I finally have a trailer and a playable build, and I want to share some initial thoughts.

The game is called Interlocked: Puzzle Islands, and it's based on a mechanic from a Flash(!) game and a mobile game I made a while back. The new game revisits this concept in VR with new visuals, a little story arc, and 30 licensed puzzles by burr puzzle designers from around the world. I was lucky enough to ship the game to the Meta Quest Store alongside the VR Games Showcase; it was awesome seeing it during the live stream.

The main struggle at the moment is building a community around the game. I had to put future updates (hand tracking, puzzle packs, etc.) on hold until I figure this out. I've been getting positive feedback overall, including a review from UploadVR, but it's hard to get traction in this niche for a single-player game. I'm well aware discoverability is always a challenge, but I'd be super interested to hear how other devs manage building a community around single-player Meta Quest games. What are some best practices for a Meta Quest Store game?

Interlocked: Puzzle Islands - https://www.meta.com/en-gb/experiences/interlocked-puzzle-islands/7115743118544777/

Thanks!
Stringscape: Turning Hand Distance into Pitch
I’m currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here.

The concept is simple: you stretch a glowing “string” between your two hands, and the world-space distance between them controls the pitch (rough sketch of the mapping below).

- Closer hands → higher pitch
- Farther apart → lower pitch
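For anyone curious, here's a minimal sketch of how a mapping like that can be wired up in Unity. This is my own illustration, not Stringscape's actual code; the hand transforms, tone source, and pitch range are all placeholders:

```csharp
using UnityEngine;

// Illustrative only: maps the distance between two tracked hands to audio pitch.
// "leftHand"/"rightHand" stand in for whatever tracked transforms your rig exposes.
public class HandDistancePitch : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;
    public AudioSource tone;            // a looping, sustained tone

    public float minDistance = 0.1f;    // meters; hands nearly touching
    public float maxDistance = 1.2f;    // meters; arms spread wide
    public float highPitch = 2.0f;      // closer hands -> higher pitch
    public float lowPitch = 0.5f;       // farther apart -> lower pitch

    void Update()
    {
        float d = Vector3.Distance(leftHand.position, rightHand.position);
        float t = Mathf.InverseLerp(minDistance, maxDistance, d);
        // Inverted mapping, matching the post: small distance = high pitch.
        tone.pitch = Mathf.Lerp(highPitch, lowPitch, t);
    }
}
```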
The experience is designed to be more of a creative playground than a structured music tool. I’d love to hear your thoughts. It’s currently in Early Access on Quest as well if anyone is curious to try it. Thanks!

Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned Unity XR/MR along the way. I will graduate with a master's degree in Art and Technology in May 2026. For my final project I will be working on a Mixed Reality interaction for dyslexic learners with hand tracking, and I will be applying for the smart glasses grant for accessibility. I've been in education for the past 19 years, teaching students with dyslexia for the past ten. This video shows my first test; link and image below.

Mixed Reality Test, Quest 3: Mixed Reality Test, Unity and Meta SDK by Tina Wheeler

Immersive Exposure - a VR-native photography playground
Hello devs 👋 my name is Corey Reese.

App Trailer

I’m building Immersive Exposure, a VR-native creative sandbox that turns photography into an interactive, game-like experience inside VR. Instead of just watching tutorials or shooting static scenes, users step inside environments; control virtual cameras, lighting, and lenses; and photograph dynamic subjects in real time. The core idea is simple: photography as play, exploration, and mastery, not passive learning.

Current focus:
- Refining spatial interaction and camera feel so it’s intuitive for both creators and casual VR users.
- Adding mixed reality support, allowing users to practice with virtual models and lighting in their real spaces.
- Expanding NPC systems with AI so characters can respond, pose, and interact via voice commands.
- Designing repeatable engagement loops (photo challenges, exploration goals, unlockable environments).

Right now we are fine-tuning the camera UX so the controls feel good and respond as close to a real camera as possible, with depth of field and so on; a rough sketch of what that mapping can look like is below.
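For illustration, here's a minimal sketch of driving a depth-of-field effect from virtual camera dials, assuming Unity's URP Volume system with a Depth of Field override in Bokeh mode. This is my own simplified example, not the app's actual code:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Illustrative only: pushes "real camera" dial values into URP's DepthOfField.
// Assumes a Volume in the scene with a Depth of Field override (Bokeh mode).
public class VirtualCameraDOF : MonoBehaviour
{
    public Volume volume;
    public float focusDistanceMeters = 1.5f;  // where the subject is
    public float apertureFStop = 2.8f;        // lower f-stop = shallower focus
    public float focalLengthMm = 50f;

    private DepthOfField dof;

    void Start()
    {
        // Grab the Depth of Field override from the volume profile, if present.
        volume.profile.TryGet(out dof);
    }

    void Update()
    {
        if (dof == null) return;
        dof.focusDistance.value = focusDistanceMeters;
        dof.aperture.value = apertureFStop;
        dof.focalLength.value = focalLengthMm;
    }
}
```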
The project started as an immersive training tool featuring real photographers: 30+ shoots across fashion, boudoir, food, commercial, and more, filmed in 8K VR180 so users can go behind the scenes on real shoots. It has since evolved into a creative playground for photographers who enjoy exploration, expression, and practicing anytime. I feel we have the same opportunity as Golf+, who doubled down on real golf users, but with the photography community.

I’ll be posting development updates, experiments, and lessons learned here as things progress, and I'm looking forward to learning from others who are building. If you're interested in testing the alpha build, I can send the link so I can get feedback. I've already identified a lot of things that need to be corrected, and it's good to get some new eyes as well.

Starting a New Series: Building a Social VR Game From Scratch (Baby VR)

I just launched Episode 1 of a new video series where I’m building a social VR game from scratch: live, in public, and together with the community. The project is called Baby VR: a chaotic, social VR playground where players embody babies with Gorilla-Tag-style locomotion, tiny legs, and big personalities.

In this first episode, I break down the 4-step process I use to start any VR project:
1. Ideation: finding a spark worth building
2. Validation: checking the market before writing code
3. Forever-Updatable Test: designing for longevity
4. Realistic Scope: defining an MVP you can actually ship

This same process applies whether you’re building your first VR prototype or scaling a social experience.

What makes this series different: 👉 it’s community-driven. Viewers help decide the game modes, mechanics, and even the first map, and their ideas may ship into the real game.

If you’re working in VR / XR / game development, or curious how social VR titles are actually planned and scoped, I think you’ll find this useful.

🎥 Episode 1 is live: https://www.youtube.com/watch?v=kIjpDFuGScE

I’d also love to hear your thoughts: what’s the most important thing you consider when starting a multiplayer or social product?

#SocialVR #VRDevelopment #GameDevelopment #Unity #PhotonFusion #MetaQuest #IndieDev #XR #Startups
XR.Movement SDK Errors (Unity)(Photon Fusion2)

When I build a [Block] networked character retargeter, like below: [screenshot]
1. Select the model .fbx as a custom avatar, like below: [screenshot]
3. Save it as a prefab, like below: [screenshot]
4. And like below: [screenshot]
5. OVRCamera setup, like below: [screenshot]
6. Photon Fusion 2 setup, like below: [screenshot]
7. Run the app and encounter errors like below: [screenshot]

Lasertag! - an experiment in live scene understanding using DepthAPI
Hello new dev forum :) I'm working on a project that uses the DepthAPI to map your space in real time (instead of relying on Room Setup) to decrease setup friction, lower time-to-fun, and increase playspace area. Because the game scans as you play, it responds to opening/closing doors, moving furniture, and other changes to the environment.

I'm also using depth for drawing light against the environment; it looks really nice in dimly lit areas. A rough sketch of how environment depth gets enabled is below.
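For anyone who hasn't tried it, here's a minimal sketch of turning on environment depth occlusion with the Meta XR Core SDK's Depth API. The component and enum names are from the SDK as I remember them, so treat them as assumptions and check against your SDK version:

```csharp
using UnityEngine;
using Meta.XR.EnvironmentDepth;

// Illustrative only: enables real-time environment depth so virtual content
// can be occluded by the physical room without relying on Room Setup data.
public class DepthBootstrap : MonoBehaviour
{
    void Start()
    {
        // EnvironmentDepthManager is the SDK entry point for per-frame depth.
        if (!EnvironmentDepthManager.IsSupported)
        {
            Debug.LogWarning("Environment depth not supported on this device.");
            return;
        }

        var manager = gameObject.AddComponent<EnvironmentDepthManager>();
        // Soft occlusion blends edges where virtual objects meet real geometry.
        manager.OcclusionShadersMode = OcclusionShadersMode.SoftOcclusion;
    }
}
```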
I'm currently working on meshing so I can use it with Unity's NPC pathfinding. I'll be posting updates in this thread. You can learn more and download the game at https://anagly.ph

Cool MR Projects Part 1: Back to the Mixed Reality!

Welcome to the first part of our series, Cool MR Projects! In this post, I will be reviewing my own Mixed Reality (MR) project: "Back to the Mixed Reality," a fan-made experience that demonstrates the potential of MR. This project allows users to virtually drive the iconic DeLorean Time Machine and experience time travel right on their floor. Mixed Reality involves the blending of physical and digital worlds, creating new environments where physical and digital objects co-exist and interact in real time. If you've ever wondered why I built this project and what challenges we faced, read on!

The Motivation: Why Build a Mixed Reality Time Machine?

The inspiration for "Back to the Mixed Reality" stems directly from childhood dreams. I have always been fascinated by the concept of time travel, specifically from the Back to the Future movie series. As a child, I imagined time-traveling on my bicycle, believing that if I reached a certain speed, I would travel to the time I was dreaming of. Now, using advancements in Mixed Reality, I've been able to turn this childhood fantasy into a real experience.

Beyond personal passion, the project serves several key goals:
- Demonstrating MR potential: this project showcases the incredible potential of Mixed Reality in bringing those childhood dreams to life.
- Inspiring the community: it is a passion project born from a love for the Back to the Future series, designed to inspire, entertain, and educate enthusiasts and developers about the magic of MR technology.
- Driving attention to spatial computing: my biggest intention in building this fan-made project is to draw more attention to Mixed Reality, Virtual Reality, and spatial computing in general.

The experience itself involves spawning the DeLorean Time Machine on your floor, setting a destination time (e.g., 20 seconds into the future), driving the vehicle to reach the required time-travel speed, and then waiting for it to return from the future.

The Challenges: Navigating the New Frontier of MR Development

The core mechanics required detailed visual effects (VFX) using Unity Timelines, Particles, and Shaders to replicate the time-travel sequence from the first movie. We also used FMOD to create adaptive car sound effects that change according to the speed and RPM of the car engine, integrating them into Unity using the FMOD Unity plugin; a sketch of what that parameter hookup can look like follows.
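In case it helps other developers, here is a minimal example of driving FMOD parameters from gameplay values with the FMOD Unity plugin. This is a simplified sketch rather than the project's actual code; the "Speed" and "RPM" parameter names are placeholders you would author in FMOD Studio:

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Illustrative only: feeds car speed and engine RPM into an FMOD event
// so the engine sound adapts in real time.
public class EngineAudio : MonoBehaviour
{
    [SerializeField] private EventReference engineEvent; // assigned in the Inspector
    private EventInstance engineInstance;

    void Start()
    {
        engineInstance = RuntimeManager.CreateInstance(engineEvent);
        // Keep the 3D sound positioned on the car as it drives around the room.
        RuntimeManager.AttachInstanceToGameObject(engineInstance, transform);
        engineInstance.start();
    }

    // Call from the vehicle controller each frame.
    public void UpdateEngine(float speedKmh, float rpm)
    {
        // Parameter names must match the parameters authored in FMOD Studio.
        engineInstance.setParameterByName("Speed", speedKmh);
        engineInstance.setParameterByName("RPM", rpm);
    }

    void OnDestroy()
    {
        engineInstance.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        engineInstance.release();
    }
}
```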
However, integrating Mixed Reality functionality introduced several unique hurdles:

1. Model Optimization

After importing the detailed 3D models of the iconic DeLorean Time Machine and the remote controller (inspired by Doc's remote in the first movie) into Unity, we had to perform heavy optimizations because the models were insanely detailed.

2. Testing and Iteration Time

Integrating MR functionality brought new challenges and design considerations. Testing and iterating with Room Setup was very time-consuming because every change required testing inside the headset. Fortunately, this challenge was mitigated by discovering the Meta XR Simulator. This tool literally saved us time, allowing us to test the XR project without constantly wearing the headset by simulating headset movement and touch controller input with a keyboard, mouse, or game controller.

3. Ensuring Realistic Collisions and Boundaries

A crucial challenge was ensuring the virtual car remained within the room's boundaries and had realistic collisions with the physical floor and real-life objects. To achieve this, we relied on Scene Understanding, a technology providing a comprehensive Scene Model for geometric and semantic representation of the physical space. We had to:
- Identify room elements (Floor, Ceiling, Wall Face, etc.) using the OVRSemanticClassification class.
- Add colliders to the room's walls and floors, which required careful adjustment and testing.
- Tag the corresponding colliders based on the type of room element to ensure the car properly collided with real-world objects (a sketch of this step is at the end of the post).

4. Managing the App Flow and Room Setup

For Mixed Reality experiences on Meta Quest 3, the user must complete some initial settings before loading the experience; it will not work without a Room Setup. We created a Lobby Scene to manage these MR procedures:
- The user must first grant permission to access spatial data, which the app needs in order to use Scene Understanding.
- If the user has not completed Room Setup, they are prompted to do so before the game scene is loaded.

We utilized knowledge gleaned from Project Phanto (a Unity-based Mixed Reality reference app from Meta demonstrating critical features like the scene mesh and general app flow) to build our app flow, including informing the user about the Scene Mesh visualization once setup is complete.

Join the Adventure!

With all these mechanics, we successfully built a working Time Machine. I hope this project drives more attention to the magic of Mixed Reality. If you want to play the experience on your own Quest 3, you can get the build for FREE! I encourage you to share your gameplay on social media to help spread the word about MR. Let's drive into the future, together!

– Tevfik (Creator of Back to the Mixed Reality)
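P.S. for fellow devs: here is a rough sketch of the semantic-tagging step described above, using OVRSemanticClassification from the Meta XR SDK. It is reconstructed for illustration rather than taken from the project, and the classification label strings and tag names may vary by SDK version and project setup:

```csharp
using UnityEngine;

// Illustrative only: walks the scene anchors spawned after Room Setup and
// tags their colliders by semantic label so gameplay code (e.g., car physics)
// can tell floors and walls apart.
public class SceneColliderTagger : MonoBehaviour
{
    public void TagSceneColliders()
    {
        // Assumes scene anchors have already been instantiated by OVRSceneManager.
        foreach (var classification in FindObjectsOfType<OVRSemanticClassification>())
        {
            var col = classification.GetComponentInChildren<Collider>();
            if (col == null) continue;

            // "Floor"/"Wall" must exist in the project's Tag Manager.
            if (classification.Contains(OVRSceneManager.Classification.Floor))
                col.gameObject.tag = "Floor";
            else if (classification.Contains(OVRSceneManager.Classification.WallFace))
                col.gameObject.tag = "Wall";
            // ...and so on for Ceiling, Couch, and other room elements.
        }
    }
}
```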