Recent Activity
Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned Unity XR/MR. I will graduate with a master's degree in Art and Technology in May 2026. For my final project I will be working on a mixed reality interaction for dyslexic learners with hand tracking, and I will be applying for the smart glasses grant for accessibility. I've been in education for the past 19 years, teaching students with dyslexia for the past ten. This video shows my first test; link and image below. Mixed Reality Test, Quest 3: Mixed Reality Test, Unity and Meta SDK by Tina Wheeler
Posted by tmwheeler · 16 days ago · MHCP Partner · 0 likes · 0 comments
BlockView - a spatial perception puzzle
Hi everyone, I wanted to share a small VR project I've been working on as a solo developer. BlockView is a spatial perception-focused puzzle game built for Meta Quest. Each puzzle asks the player to place 3D blocks so that the shape matches the top, front, back, and side silhouettes at the same time. My goal was to create a calm, focused puzzle that rewards careful thinking. From a development perspective, it was interesting to explore how players interpret multiple orthographic views in VR, and how small UI and interaction changes affected their spatial reasoning and comfort. Here are a few in-game screenshots. BlockView is now available on the Meta Quest Store: https://www.meta.com/en-gb/experiences/blockview/33062523903362946/ I'd love to hear any feedback.
Posted by lefthanddeveloper · 24 days ago · Explorer · 0 likes · 0 comments
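The multi-silhouette rule the post describes can be sketched in a few lines, assuming blocks occupy unit cells on a grid. This is purely illustrative, not BlockView's actual code; the back view is omitted here since it is just the mirrored front silhouette.

```python
# Illustrative sketch of BlockView's core rule: a block arrangement is a
# solution only if its orthographic projections match every target
# silhouette at once. Names and the set-based layout are assumptions.

def projections(blocks):
    """Project a set of occupied (x, y, z) cells onto three view planes."""
    top   = {(x, y) for x, y, z in blocks}   # looking down the z axis
    front = {(x, z) for x, y, z in blocks}   # looking along the y axis
    side  = {(y, z) for x, y, z in blocks}   # looking along the x axis
    return top, front, side

def is_solution(blocks, top_target, front_target, side_target):
    """A placement solves the puzzle only if all silhouettes match."""
    top, front, side = projections(blocks)
    return top == top_target and front == front_target and side == side_target

# A 2x1x1 bar covers two cells from the top and front, but only one
# from the side.
blocks = {(0, 0, 0), (1, 0, 0)}
print(is_solution(blocks, {(0, 0), (1, 0)}, {(0, 0), (1, 0)}, {(0, 0)}))  # True
```

The interesting design consequence is that a single silhouette underconstrains the shape: many block sets share one projection, and the puzzle emerges from satisfying several at once.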
Building NE9: A Runtime and App for AI-Driven Interactive 3D and XR Worlds
Hey everyone, I wanted to share what I'm currently building and open it up for discussion. I'm developing NE9, also known as NastyEngine 9. It's a modular, real-time runtime designed to integrate AI systems, 3D environments, and interactive applications into a single live pipeline. Alongside NE9, I'm building a companion app that interfaces directly with the runtime. The goal is to use it as a control and integration layer where scene logic, agents, and interaction can be composed and updated live instead of being locked into a traditional editor workflow. The core idea is to treat AI, rendering, networking, and interaction as runtime-orchestrated systems rather than isolated tools. This approach makes it easier to experiment, iterate, and eventually extend into XR and VR environments. This is an active build and the architecture is evolving quickly. I'll be sharing progress, experiments, and lessons learned as things continue to come together.
This screenshot shows where we are right now in the development process. This was our first full session using a Meta Quest 3 connected to our desktop via USB, running the Meta desktop app as our development workspace. We were viewing and working directly with our existing tools inside the headset to get a real sense of scale, comfort, and workflow. It was our first serious hands-on development session this way, and getting our feet wet was a lot of fun. Even just working from the desktop inside the headset made it clear that this is a platform we're excited to build for. We're looking forward to transitioning from desktop-based development into deeper Horizon and native XR workflows as NE9 continues to evolve. If you'd like to connect, feel free to check out my LinkedIn: https://www.linkedin.com/in/daniel-harris-0745b8374/ Thanks for stopping by, and I'm excited to see what we can build together.
Posted by TheOneAndOnly117 · 30 days ago · MHCP Member · 0 likes · 0 comments
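The "runtime-orchestrated systems" idea above can be sketched as a loop over hot-swappable subsystems. The class and method names below are illustrative assumptions, not NE9's actual API.

```python
# Minimal sketch: AI, rendering, and networking registered as swappable
# systems on one live loop, replaceable mid-session instead of being
# fixed editor-time tools. Illustrative only.

class Runtime:
    def __init__(self):
        self.systems = {}          # system name -> per-frame update callable

    def register(self, name, update_fn):
        """Add or hot-swap a system while the runtime keeps running."""
        self.systems[name] = update_fn

    def tick(self, dt):
        """Advance every registered system by one frame."""
        for update in self.systems.values():
            update(dt)

rt = Runtime()
log = []
rt.register("ai", lambda dt: log.append("ai"))
rt.register("render", lambda dt: log.append("render"))
rt.tick(1 / 60)                                     # one frame: both systems step
rt.register("ai", lambda dt: log.append("ai-v2"))   # live replacement, no restart
rt.tick(1 / 60)
print(log)  # ['ai', 'render', 'ai-v2', 'render']
```

The key property is that `register` works between ticks, so scene logic and agents can be recomposed while the world stays live.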
Immersive Exposure - a VR-native photography playground
Hello devs 👋 my name is Corey Reese. App Trailer. I'm building Immersive Exposure, a VR-native creative sandbox that turns photography into an interactive, game-like experience inside VR. Instead of just watching tutorials or shooting static scenes, users step inside environments, control virtual cameras, lighting, and lenses, and photograph dynamic subjects in real time. The core idea is simple: photography as play, exploration, and mastery, not passive learning.
Current focus:
- Refining spatial interaction and camera feel so it's intuitive for both creators and casual VR users.
- Adding mixed reality support, allowing users to practice with virtual models and lighting in their real spaces.
- Expanding NPC systems with AI so characters can respond, pose, and interact via voice commands.
- Designing repeatable engagement loops (photo challenges, exploration goals, unlockable environments).
Right now we are fine-tuning the camera UX so the controls feel good and respond as closely to a real camera as possible, with depth of field and so on. The project started as an immersive training tool featuring real photographers: 30+ shoots across fashion, boudoir, food, commercial, and more, filmed in 8K VR180 so users can go behind the scenes on real shoots. It has since evolved into a creative playground for photographers who enjoy exploration, expression, and practicing anytime. I feel we have the same opportunity as Golf+, which doubled down on real golf players, but with the photography community. I'll be posting development updates, experiments, and lessons learned here as things progress. Looking forward to learning from others building. If you're interested in testing the alpha build, I can send the link so I can get feedback. I've already identified a lot of things that need to be corrected, and it's good to get some new eyes as well.
Posted by ImmersiveExposure · 1 month ago · Start Partner · 0 likes · 0 comments
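For the depth-of-field behaviour mentioned above, the standard thin-lens approximation relates focal length, aperture, and focus distance to the in-focus range. This is a textbook sketch under that approximation, not Immersive Exposure's actual camera code.

```python
# Thin-lens depth-of-field sketch: compute the near and far limits of
# acceptable sharpness from focal length, f-number, and focus distance.

def depth_of_field(focal_mm, f_number, focus_mm, coc_mm=0.030):
    """Return (near, far) limits of acceptable focus, in mm.
    coc_mm is the circle of confusion (~0.030 mm for a full-frame sensor)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * focus_mm / (hyperfocal + (focus_mm - focal_mm))
    if focus_mm >= hyperfocal:
        far = float("inf")          # everything out to infinity stays sharp
    else:
        far = hyperfocal * focus_mm / (hyperfocal - (focus_mm - focal_mm))
    return near, far

# 50 mm lens at f/1.8 focused at 2 m: only a thin slice is in focus,
# roughly 1.92 m to 2.09 m.
near, far = depth_of_field(50, 1.8, 2000)
print(round(near), round(far))
```

Driving a virtual camera's blur from a formula like this is one way to make aperture and focus controls "respond like a real camera" rather than using an arbitrary blur slider.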
SDK Preview Request
Project: https://wearables.developer.meta.com/devcenter/projects/897444649397117
App: GlassesViewer (HVAC training camera viewer)
Error: 401 on maven.pkg.github.com/facebook/meta-wearables-dat-android
Like: https://github.com/wolverin0/remote-pair-eyes (same SDK)
Request: Maven token + preview access
Posted by trainingcgh · 1 month ago · Honored Guest · 0 likes · 1 comment
New patented solution for controlling full-body avatar movements with a Meta Quest 3
Our PAO-XR startup helps establish associations between an end user's body-gesture language and the switching of different types of full-body avatar movement animations. All associations are stored in the end user's personal DB on their mobile or desktop device, so the entire online interaction between the end user and their avatar requires only a MOCAP device such as a Meta Quest 3 (despite the absence of lower-body MOCAP) and a standard mobile device. To implement the pilot project, we are seeking partners among developers of their own virtual simulations, focusing on full-body avatars controlled by end users (players) using the Meta Quest 3 suite, including mobile apps. For interested developers, we are willing to invest in the implementation of our technology in their applications. For more details, please see the promo demo on our PAO-XR.COM website.
Posted by Daniil.Kofner · 1 month ago · Honored Guest · 1 like · 0 comments
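The gesture-to-animation association described above could be sketched as a per-user lookup table that resolves recognized gestures to full-body animation clips. All names here are illustrative assumptions, with an in-memory dict standing in for the personal DB; this is not PAO-XR's actual schema.

```python
# Sketch: a per-user table mapping recognized body-gesture IDs to the
# full-body avatar animation to switch to. Stored locally per user.

class GestureAnimationMap:
    def __init__(self):
        self._table = {}            # gesture id -> animation clip name

    def associate(self, gesture_id, animation):
        """Record (or overwrite) the user's chosen gesture-animation pairing."""
        self._table[gesture_id] = animation

    def animation_for(self, gesture_id, default="idle"):
        """Resolve a recognized gesture to the animation the avatar plays."""
        return self._table.get(gesture_id, default)

user_map = GestureAnimationMap()
user_map.associate("raise_both_hands", "jump")
user_map.associate("lean_left", "strafe_left")
print(user_map.animation_for("raise_both_hands"))  # jump
print(user_map.animation_for("unknown_gesture"))   # idle
```

Keeping the table on the user's own device is what lets the same headset gestures drive different animation sets per person without any server-side profile.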
Booleans
I've done a lot of rendering and model building with Blender, specifically homes and kitchens. I am trying to import an .fbx file. I started with a complete foundation that included booleans cut into the walls. While in Blender I applied the booleans and exported only the selected foundation, but it imported really funky. So I decided to simplify and import a single wall with an applied boolean for a window. It imported with planes on the front and back of the boolean location. The boolean cube extended past both sides and the top of the wall in Blender, and the normals were good. What am I missing?
Posted by rcrusselldesigns.2024 · 1 month ago · MHCP Member · 0 likes · 0 comments
Starting a New Series: Building a Social VR Game From Scratch (Baby VR)
I just launched Episode 1 of a new video series where I'm building a social VR game from scratch: live, in public, and together with the community. The project is called Baby VR: a chaotic, social VR playground where players embody babies with Gorilla-Tag-style locomotion, tiny legs, and big personalities. In this first episode, I break down the 4-step process I use to start any VR project:
1. Ideation: finding a spark worth building
2. Validation: checking the market before writing code
3. Forever-Updatable Test: designing for longevity
4. Realistic Scope: defining an MVP you can actually ship
This same process applies whether you're building your first VR prototype or scaling a social experience. What makes this series different: 👉 it's community-driven. Viewers help decide the game modes, mechanics, and even the first map, and their ideas may ship into the real game. If you're working in VR / XR / game development, or are curious how social VR titles are actually planned and scoped, I think you'll find this useful. 🎥 Episode 1 is live: https://www.youtube.com/watch?v=kIjpDFuGScE I'd also love to hear your thoughts: what's the most important thing you consider when starting a multiplayer or social product? #SocialVR #VRDevelopment #GameDevelopment #Unity #PhotonFusion #MetaQuest #IndieDev #XR #Startups
Posted by DevBrain · 1 month ago · Start Mentor · 3 likes · 2 comments
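As a side note on the Gorilla-Tag-style locomotion mentioned above: the basic idea is to push the player's body opposite to the motion of their hands while they push off a surface. A rough illustrative sketch, not the Baby VR implementation:

```python
# Arm-swing locomotion sketch: body velocity is the average hand
# displacement over the frame, negated, so sweeping the hands backwards
# propels the body forwards. Names and the gain parameter are assumptions.

def arm_swing_velocity(prev_hands, curr_hands, dt, gain=1.0):
    """prev_hands/curr_hands: lists of (x, y, z) hand positions.
    Returns the body velocity as an (x, y, z) tuple."""
    n = len(curr_hands)
    vx = vy = vz = 0.0
    for (px, py, pz), (cx, cy, cz) in zip(prev_hands, curr_hands):
        vx += cx - px
        vy += cy - py
        vz += cz - pz
    return (-gain * vx / (n * dt),
            -gain * vy / (n * dt),
            -gain * vz / (n * dt))

# Both hands sweep 10 cm backwards (toward -z) in one 90 Hz frame:
# the body gains forward (+z) velocity.
v = arm_swing_velocity([(0, 1, 0), (0, 1, 0)],
                       [(0, 1, -0.1), (0, 1, -0.1)],
                       1 / 90)
print(v)
```

In practice engines gate this on a grip or surface-contact check and clamp the result, but the core mapping from hand deltas to body velocity is this simple.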
XR.Movement SDK Errors (Unity)(Photon Fusion 2)
When I build a [Block] networked character retargeter, like below:
1. Select a model .fbx as a custom avatar, like below:
3. Save it as a prefab, like below:
4. And, like below:
5. OVR camera setup, like below:
6. Set up Photon Fusion 2, like below:
7. Run the app and encounter an error, like below:
Posted by zhangjin2025 · 1 month ago · Honored Guest · 0 likes · 1 comment
→ Find helpful resources to begin your development journey in Getting Started
→ Get the latest information about HorizonOS development in News & Announcements.
→ Access Start program mentor videos and share knowledge, tutorials, and videos in Community Resources.
→ Get support or provide help in Questions & Discussions.
→ Looking for documentation? Developer Docs
→ Looking for account support? Support Center
→ Looking for the previous forum? Forum Archive
→ Looking to join the Start program? Apply here.