Drone Core Command
Summary: Drone Core Command is a gesture-based pet combat game where players command their pet mech to take control of the battlefield. Players can customize their mech's loadout and abilities using 'drone cores', and can repair their mech after a battle back at their hub. This first week of development focused on establishing viability for the concept and the main features that will be used in combat. Players can use their left hand to spawn a series of drone cores to pick from, then install them in the gun mounted on their right arm for either direct damage or mech targeting.

Features:
- Inventory - players can gesture to open their inventory and select a drone core to place in their arm gun
- Arm gun - either a source of direct damage or targeting for the player's pet drone
- Mech stats - a display mounted on the player's inventory arm that shows the state of the mech and any selected drone cores in use
- Autonomous vs player-assisted targeting - if players equip a drone core built for mech control, they can point at desired targets to direct their mech (see the sketch after this list)
- Map overview - this will be how players move their mech around the field at greater distances (not sure this feature will survive to competition end, but it's still worth testing to explore additional game-board-like gesture control in a boardgame-like setting). At the very least, the map overview will provide battlefield stats.
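The game itself is scripted in UE5, but purely as a language-neutral illustration of the player-assisted targeting idea, here is a minimal TypeScript sketch: pick the enemy nearest the pointing ray, inside a forgiving cone. The types and inputs are hypothetical stand-ins for whatever the hand-tracking runtime provides.

```typescript
// Hedged sketch of pointing-based target selection, independent of any engine.
// Vec3, Enemy, and the hand pose inputs are invented for illustration.
type Vec3 = { x: number; y: number; z: number };
interface Enemy { id: string; position: Vec3; }

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
const len = (a: Vec3) => Math.sqrt(dot(a, a));
const norm = (a: Vec3): Vec3 => { const l = len(a); return { x: a.x / l, y: a.y / l, z: a.z / l }; };

// Pick the enemy whose direction from the hand is closest to the pointing ray,
// rejecting anything outside the cone (gesture pointing is imprecise, so the
// cone is deliberately wide).
function pickTarget(handPos: Vec3, pointDir: Vec3, enemies: Enemy[], maxConeDeg = 15): Enemy | null {
  const dir = norm(pointDir);
  const cosThreshold = Math.cos((maxConeDeg * Math.PI) / 180);
  let best: Enemy | null = null;
  let bestCos = cosThreshold;
  for (const e of enemies) {
    const toEnemy = norm(sub(e.position, handPos));
    const c = dot(dir, toEnemy); // cosine of angle between ray and enemy
    if (c > bestCos) { bestCos = c; best = e; }
  }
  return best;
}
```

A cone test like this tends to feel better in practice than a strict raycast, since a few degrees of hand jitter won't drop the lock.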
UX Challenges
- Context-sensitive, accurate gesture detection is pretty challenging. Hand poses that are detected at the right time, and in the right context, are an important part of the game feeling intuitive and easy to play. This includes designing the game so the player never needs to block the sensors' view of their hands, which can hitch hand detection.
- No player movement - this game is designed specifically to bring the world to the player rather than move the player around the world. Players won't need to move around, as the focus is on loadout interfaces and battlefield control from a distance.

Next Steps
- Player journey - building out the framework of the game so players can start at a main menu scene, load into their hub, deploy to the battlefield, and return to the hub.
- Basic hub functionality - this is where players can select drone cores from their library (or maybe even augment a drone core for additional player agency), select and review a mission map, and repair their mech from previous battle damage. A stretch goal here would be building out the gesture-based mech repair system.
- Map overview iteration - hoping to build a simple pick-up-and-place system on the battlefield map to move the mech across larger distances, while maintaining good player visibility for enemy targeting.

Inspiration
- IP - I have always enjoyed mech-type IP. My early days of gaming included running around in a Timberwolf.
- Input - I did a lot of research into motion-controller interactions when I built MageWorks (a VR game on Quest), and into targeting systems when I built BlastPoint (a mobile AR game). Taking those learnings to the next step with gestures is part of my motivation for this project.
- Gameplay - I was looking to combine pet-based combat and player agency (e.g. as exhibited in WoW's pet-based classes like the hunter and warlock) with a hybrid strategic/tactical style of play, much like old-school games such as Final Fantasy Tactics Advance. That's where I want to try moving the mech around to different quadrants/regions as a form of strategy, both to keep it alive and to give players a sense of target priority at a macro and micro scale.

Credits
While I'm working on this project as a solo dev, I'm using the marketplace for art, sound, and animation assets as placeholders while I focus on scripting the game in UE5. Many thanks to the marketplace community, forum contributors, and the overall XR community for the many resources they have made available over the years.

If you made it this far, thanks for reading, and feel free to reach out with any questions or feedback. Once I get a build up and running in a pre-alpha channel on the Quest app store, I'm happy to add folks to the build for testing. In the meantime, I will try to keep this thread updated as the game progresses.
Building a VR Burr Puzzler: Interlocked
Hi everyone! For a while now I've been working on a burr-puzzle mechanic in VR, specifically for Meta Quest. I finally have a trailer and a playable build, and I want to share some initial thoughts. The game is called Interlocked: Puzzle Islands, and it's based on a mechanic I built a while back for Flash(!) and mobile. The new game revisits the concept in VR with new visuals, a little story arc, and 30 licensed puzzles by burr puzzle designers from around the world. And I was lucky enough to ship the game to the Meta Quest Store alongside the VR Games Showcase! It was awesome seeing it during the live stream.

The main struggle at the moment is building a community around the game. I had to put future updates on hold (hand tracking, puzzle packs, etc.) until I figure this one out. I've been getting positive feedback overall, including a review from UploadVR, but it's hard to get traction in this niche for a single-player game. I'm well aware discoverability is always a challenge, but I'd be super interested to hear how other devs manage building a community around single-player Meta Quest games. What are some best practices for a Meta Quest Store game?

Interlocked: Puzzle Islands - https://www.meta.com/en-gb/experiences/interlocked-puzzle-islands/7115743118544777/

Thanks!
Looking for feedback on a small web tool I built
Hi everyone, I’ve been experimenting with building small web tools recently, just to practice and learn. One of the things I built is a simple browser-based counter that can be used to count clicks, track repetitions, or test clicking speed. You can check it here: https://thetallycounter.com/

The goal was to keep it lightweight and simple so it works quickly on both desktop and mobile without needing to install anything. I’d really appreciate any feedback from other developers. Does the interface feel easy to use? Are there any features you think would make it better? Thanks in advance for any suggestions.
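For anyone curious what "test clicking speed" might involve, here is a minimal TypeScript sketch of a counter with a clicks-per-second readout. This is not the site's actual code; the element ids ("count", "tally-btn", "cps") are made up for illustration.

```typescript
// Minimal sketch of a tally counter with a clicks-per-second measure.
// The element ids are hypothetical; the real site's markup may differ entirely.
const clicks: number[] = []; // timestamps (ms) of recent clicks
let count = 0;

function onTap(): void {
  count += 1;
  const now = performance.now();
  clicks.push(now);
  // Keep only the last second of clicks for the speed readout.
  while (clicks.length > 0 && now - clicks[0] > 1000) clicks.shift();
  document.getElementById("count")!.textContent = String(count);
  document.getElementById("cps")!.textContent = `${clicks.length} clicks/sec`;
}

document.getElementById("tally-btn")!.addEventListener("click", onTap);
// Keyboard taps keep it usable on desktop without a mouse; ignore key
// auto-repeat so held keys don't inflate the speed measure.
document.addEventListener("keydown", (e) => {
  if (e.code === "Space" && !e.repeat) onTap();
});
```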
Meta Horizons for Creators
I am a modern musician, producer, and project manager looking to build a stage and city for virtual characters. I think Meta Horizons has a lot of potential for creating new digital games, immersive entertainment where music and performances come to life in new ways, storytelling and filmmaking techniques, marketing value, and global communication. I am researching how practical applications of new technology affect the real world. I am looking for collaborators.

Stringscape: Turning Hand Distance into Pitch
I’m currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here. The concept is simple: you stretch a glowing “string” between your two hands, and the world-space distance between them controls the pitch.

- Closer hands → higher pitch
- Farther apart → lower pitch

The experience is designed to be more of a creative playground than a structured music tool (a minimal sketch of the mapping is below). I’d love to hear your thoughts. It’s currently in Early Access on Quest as well, if anyone is curious to try it. Thanks!
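The post doesn't say which distance→pitch curve Stringscape uses, but one natural choice is the vibrating-string relation, where frequency is inversely proportional to length, so halving the hand distance raises the pitch exactly one octave. A TypeScript sketch, with the reference span and pitch chosen arbitrarily:

```typescript
// Hedged sketch of one possible distance->pitch mapping (not necessarily
// Stringscape's): treat the span between the hands like a vibrating string.
const REF_DISTANCE_M = 0.6; // assumed reference hand span, in meters
const REF_FREQ_HZ = 220.0;  // assumed pitch at the reference span (A3)

function distanceToPitch(handDistanceM: number): number {
  // Clamp so a momentary tracking glitch can't produce a shriek or silence.
  const d = Math.min(1.5, Math.max(0.05, handDistanceM));
  return (REF_FREQ_HZ * REF_DISTANCE_M) / d; // closer hands -> higher pitch
}

// Example of driving a tone with it via the Web Audio API (browser-only).
const ctx = new AudioContext();
const osc = ctx.createOscillator();
osc.connect(ctx.destination);
osc.start();
// Call this every frame with the tracked world-space hand distance; the short
// time constant smooths out per-frame tracking jitter.
function onHandsUpdated(handDistanceM: number): void {
  osc.frequency.setTargetAtTime(distanceToPitch(handDistanceM), ctx.currentTime, 0.02);
}
```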
BlockView - a spatial perception puzzle
Hi everyone, I wanted to share a small VR project I’ve been working on as a solo developer. BlockView is a spatial perception–focused puzzle game built for Meta Quest. Each puzzle asks the player to place 3D blocks so that the shape matches top, front, back, and side silhouettes at the same time (a sketch of that check is below).

My goal was to create a calm, focused puzzle that rewards careful thinking. From a development perspective, it was interesting to explore how players interpret multiple orthographic views in VR, and how small UI and interaction changes affected their spatial reasoning and comfort. Here are a few in-game screenshots.

BlockView is now available on the Meta Quest Store: https://www.meta.com/en-gb/experiences/blockview/33062523903362946/

I’d love to hear any feedback.
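The post doesn't show BlockView's internals, but as a guess at how such a check could work, here is a TypeScript sketch that projects a voxel arrangement onto orthographic views and compares each projection to a target silhouette. The types and projection scheme are assumptions; a back view would simply be the mirrored front projection, so only three views are checked here.

```typescript
// Hedged sketch: verifying a block arrangement against orthographic
// silhouettes. Not BlockView's actual code.
type Voxel = { x: number; y: number; z: number }; // integer grid cells
type Silhouette = Set<string>;                    // "u,v" cells that must be filled

// Project the voxel set onto a view plane by dropping one axis:
// top looks down Y (keep x,z); front looks down Z (keep x,y); side looks down X (keep z,y).
function project(voxels: Voxel[], view: "top" | "front" | "side"): Silhouette {
  const s = new Set<string>();
  for (const v of voxels) {
    if (view === "top") s.add(`${v.x},${v.z}`);
    else if (view === "front") s.add(`${v.x},${v.y}`);
    else s.add(`${v.z},${v.y}`);
  }
  return s;
}

function matches(a: Silhouette, b: Silhouette): boolean {
  return a.size === b.size && [...a].every((cell) => b.has(cell));
}

// The puzzle is solved when every view's projection equals its target.
function isSolved(voxels: Voxel[], targets: Record<"top" | "front" | "side", Silhouette>): boolean {
  return (["top", "front", "side"] as const).every((view) =>
    matches(project(voxels, view), targets[view]),
  );
}
```

What makes the mechanic interesting is visible right in the projection step: many different block arrangements collapse to the same silhouettes, so the player has to reason about which cells each view actually constrains.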
Building NE9: A Runtime and App for AI-Driven Interactive 3D and XR Worlds
Hey everyone, I wanted to share what I’m currently building and open it up for discussion. I’m developing NE9, also known as NastyEngine 9. It’s a modular, real-time runtime designed to integrate AI systems, 3D environments, and interactive applications into a single live pipeline. Alongside NE9, I’m building a companion app that interfaces directly with the runtime. The goal is to use it as a control and integration layer where scene logic, agents, and interaction can be composed and updated live instead of being locked into a traditional editor workflow.

The core idea is to treat AI, rendering, networking, and interaction as runtime-orchestrated systems rather than isolated tools (a rough sketch of that shape is at the end of this post). This approach makes it easier to experiment, iterate, and eventually extend into XR and VR environments. This is an active build and the architecture is evolving quickly. I’ll be sharing progress, experiments, and lessons learned as things continue to come together.

This screenshot shows where we are right now in the development process. This was our first full session using Meta Quest 3 connected to our desktop via USB, running the Meta desktop app as our development workspace. We were viewing and working directly with our existing tools inside the headset to get a real sense of scale, comfort, and workflow. It was our first serious hands-on development session this way, and getting our feet wet was a lot of fun. Even just working from the desktop inside the headset made it clear that this is a platform we’re excited to build for. We’re looking forward to transitioning from desktop-based development into deeper Horizon and native XR workflows as NE9 continues to evolve.

If you’d like to connect, feel free to check out my LinkedIn. Thanks for stopping by, and I’m excited to see what we can build together. https://www.linkedin.com/in/daniel-harris-0745b8374/
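NE9's internals aren't shown in the post. Purely as an illustration of "systems orchestrated by a runtime rather than isolated tools", here is a tiny TypeScript sketch where subsystems register against a shared tick loop and can be swapped while running. Every name is invented; the real architecture may differ entirely.

```typescript
// Illustrative sketch only: subsystems (AI, rendering, networking, ...) share
// one orchestrated tick loop and can be added, removed, or replaced live.
interface RuntimeSystem {
  readonly name: string;
  tick(dtSeconds: number): void;
}

class Runtime {
  private systems = new Map<string, RuntimeSystem>();

  // Live composition: systems can be registered or swapped mid-session,
  // which is what avoids being locked into an editor workflow.
  register(sys: RuntimeSystem): void { this.systems.set(sys.name, sys); }
  remove(name: string): void { this.systems.delete(name); }

  tick(dtSeconds: number): void {
    for (const sys of this.systems.values()) sys.tick(dtSeconds);
  }
}

// Usage: compose a pipeline, run a frame, then hot-swap the AI system.
const rt = new Runtime();
rt.register({ name: "ai",     tick: () => { /* advance agents */ } });
rt.register({ name: "render", tick: () => { /* draw the scene */ } });
rt.register({ name: "net",    tick: () => { /* replicate state */ } });
rt.tick(1 / 72); // e.g. one Quest-rate frame
rt.register({ name: "ai", tick: () => { /* a new agent model, swapped live */ } });
```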
Immersive Exposure - a VR-native photography playground
Hello devs 👋 my name is Corey Reese. App Trailer

I’m building Immersive Exposure, a VR-native creative sandbox that turns photography into an interactive, game-like experience inside VR. Instead of just watching tutorials or shooting static scenes, users step inside environments, control virtual cameras, lighting, and lenses, and photograph dynamic subjects in real time. The core idea is simple: photography as play, exploration, and mastery, not passive learning.

Current focus
- Refining spatial interaction and camera feel so it’s intuitive for both creators and casual VR users.
- Adding mixed reality support, allowing users to practice with virtual models and lighting in their real spaces.
- Expanding NPC systems with AI so characters can respond, pose, and interact with voice commands.
- Designing repeatable engagement loops (photo challenges, exploration goals, unlockable environments).

Right now we are fine-tuning the camera UX so the controls feel good and respond as close to a real camera as possible, with depth of field etc. (a sketch of the underlying math is below). The project started as an immersive training tool featuring real photographers - 30+ shoots across fashion, boudoir, food, commercial, etc., filmed in 8K VR180 so users can go behind the scenes on real shoots - and has evolved into a creative playground for photographers who enjoy exploration, expression, and practicing anytime. I feel we have the same opportunity as Golf+, who doubled down on real golf users, but with the photography community.

I’ll be posting development updates, experiments, and lessons learned here as things progress. Looking forward to learning from others building. If you’re interested in testing the alpha build, I can send the link so I can get feedback. I’ve already identified a lot of things that need to be corrected, and it’s good to get some new eyes as well.
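For readers curious what "respond as close to a real camera as possible, with depth of field" involves, here is the standard thin-lens depth-of-field math a virtual camera could use to decide what renders sharp. This is a TypeScript sketch with example parameter values, not Immersive Exposure's actual code.

```typescript
// Hedged sketch: classic photographic depth-of-field formulas.
interface CameraSettings {
  focalLengthMM: number;   // e.g. a 50mm lens
  fNumber: number;         // aperture, e.g. f/1.8
  focusDistanceMM: number; // distance to the focused subject
  cocMM: number;           // circle of confusion, ~0.03mm for full frame
}

// Hyperfocal distance: focusing here makes everything from H/2 to infinity sharp.
function hyperfocalMM(s: CameraSettings): number {
  return (s.focalLengthMM ** 2) / (s.fNumber * s.cocMM) + s.focalLengthMM;
}

// Near/far limits of acceptable sharpness around the focus distance.
function dofLimitsMM(s: CameraSettings): { near: number; far: number } {
  const H = hyperfocalMM(s);
  const d = s.focusDistanceMM;
  const near = (H * d) / (H + (d - s.focalLengthMM));
  const far = d >= H ? Infinity : (H * d) / (H - (d - s.focalLengthMM));
  return { near, far };
}

// Example: 50mm at f/1.8 focused at 2m yields roughly 1.92m-2.09m of sharpness,
// the kind of shallow band that gives the "real camera feel" the post is after.
const limits = dofLimitsMM({ focalLengthMM: 50, fNumber: 1.8, focusDistanceMM: 2000, cocMM: 0.03 });
console.log(`sharp from ${(limits.near / 1000).toFixed(2)}m to ${(limits.far / 1000).toFixed(2)}m`);
```

Driving the renderer's blur from these limits (rather than an arbitrary blur slider) is one way to make aperture and focal length controls behave the way photographers expect.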