Drone Core Command
Summary
Drone Core Command is a gesture-based pet combat game where players command their pet mech to take control of the battlefield. Players can customize their loadout and mech abilities using "drone cores", and can repair their mech after a battle back at their hub. This first week of development focused on establishing the viability of the concept and the main features that will be used in combat. Players can use their left hand to spawn a series of drone cores to pick from, then install them in the gun mounted on their right arm for either direct damage or mech targeting.

Features
- Inventory - players can gesture to open their inventory and select a drone core to place in their arm gun
- Arm gun - either a source of direct damage or a targeting tool for the player's pet drone
- Mech stats - a display mounted on the player's inventory arm that shows the state of the mech and any selected drone cores in use
- Autonomous vs. player-assisted targeting - if players use a drone core meant for mech control, they can point at their desired targets to direct their mech
- Map overview - this will be the way players move their mech around the field at greater distances. I'm not sure this feature will make it all the way to competition end, but I think it's still worth testing to explore additional game-board-style gesture control in a boardgame-like setting. At the very least, the map overview will provide battlefield stats.

UX Challenges
- Context-sensitive and accurate gesture detection is pretty challenging. Hand poses detected at the right time, and in the right context, are an important part of the game feeling intuitive and easy to play. This includes designing the game so the player never needs to block the sensors' view of their hands, which can hitch hand detection.
- No player movement - this game is designed specifically to bring the world to the player rather than move the player around the world.
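As a side note on the gesture-detection challenge above: one common pattern is to gate raw pose detections by game context and a short hold time, so a pose only fires when it is both stable and valid for the current screen. A minimal, engine-agnostic sketch in Python (all names and thresholds here are illustrative, not from the actual project):

```python
from dataclasses import dataclass

# Hypothetical sketch: gate raw hand-pose detections by game context and a
# short hold time, so a pose only fires when stable and valid in context.

@dataclass
class GestureGate:
    valid_contexts: dict          # pose name -> set of contexts where it may fire
    hold_time: float = 0.25      # seconds a pose must persist before firing
    _candidate: str = None
    _since: float = 0.0

    def update(self, pose, context, now):
        """Feed the currently detected pose each frame; returns the pose name
        once it has been held long enough in a valid context, else None."""
        if pose is None or context not in self.valid_contexts.get(pose, set()):
            self._candidate = None           # invalid context: reset
            return None
        if pose != self._candidate:
            self._candidate = pose           # new candidate: start the timer
            self._since = now
            return None
        if now - self._since >= self.hold_time:
            self._candidate = None           # consume so it fires only once
            return pose
        return None
```

Usage would be something like `gate = GestureGate({"open_palm": {"hub", "battlefield"}})`, calling `gate.update(detected_pose, current_context, time_now)` once per frame and acting only on non-None results.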
Players won't need to move around, as the focus will be on loadout interfaces and battlefield control from a distance.

Next Steps
- Player journey - building out the framework of the game so players can start at a main menu scene, load into their hub, deploy to the battlefield, and return to the hub.
- Basic hub functionality - this is where players can select drone cores from their library (or maybe even augment a drone core for additional player agency), select and review a mission map, and repair their mech from previous battle damage. A stretch goal here would be to build out the gesture-based mech repair system.
- Map overview iteration - I'm hoping to build a simple pick-up-and-place system on the battlefield map to move the mech across larger distances, while maintaining good player visibility for enemy targeting.

Inspiration
- IP - I've always enjoyed mech-type IP; my early days of gaming included running around in a Timberwolf.
- Input - I did a lot of research into motion-controller interactions when I built MageWorks (a VR game on Quest), and into targeting systems when I built BlastPoint (a mobile AR game). Taking those learnings to the next step with gestures is part of my motivation for this project.
- Gameplay - I was looking to combine pet-based combat and player agency (as exhibited in WoW's pet-based player classes like the hunter and warlock) with a hybrid strategic/tactical style of play, much like old-school games such as Final Fantasy Tactics Advance. That's where I want to try moving the mech around to different quadrants/regions as a form of strategy, both to keep it alive and to give players a sense of target priority at both a macro and micro scale.

Credits
While I'm working on this project as a solo dev, I'm using the marketplace for art, sound, and animation assets as placeholders while I focus on scripting the game in UE5.
Many thanks to the marketplace community, forum contributors, and the overall XR community for the many resources they have made available over the years. If you made it this far, thanks for reading, and feel free to reach out with any questions or feedback. Once I get a build up and running in a pre-alpha channel on the Quest app store, I'm happy to add folks to the build for testing. In the meantime, I'll try to keep this thread updated as the game progresses.
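One footnote on the map-overview idea from the Next Steps above: moving a miniature on a tabletop map and translating that into battlefield movement is essentially a scale-and-offset transform. A tiny hypothetical sketch (names and the uniform-scale assumption are illustrative, not from the project):

```python
# Hypothetical sketch of the map-overview idea: a point grabbed on a
# miniature tabletop map is scaled into battlefield world coordinates.

def map_to_world(map_point, map_origin, world_origin, scale):
    """Convert a 2D point on the miniature map to world space.
    scale = world units per map unit (e.g. a 1 m table representing
    a 100 m battlefield -> scale of 100)."""
    dx = map_point[0] - map_origin[0]
    dy = map_point[1] - map_origin[1]
    return (world_origin[0] + dx * scale,
            world_origin[1] + dy * scale)
```

The mech's navigation system would then path toward the returned world position rather than teleporting, which keeps the move readable to the player.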
Building a VR Burr Puzzler: Interlocked
Hi everyone! For a while now I've been working on a burr-puzzle mechanic in VR, specifically for Meta Quest. I finally have a trailer and a playable build, and I want to share some initial thoughts.

The game is called Interlocked: Puzzle Islands, and it's based on a mechanic from a Flash(!) game and a mobile game I made a while back. The new game revisits this concept in VR with new visuals, a little story arc, and 30 licensed puzzles by burr puzzle designers from around the world. And I was lucky enough to ship the game to the Meta Quest Store along with the VR Games Showcase - it was awesome seeing it during the live stream.

The main struggle at the moment is building a community around the game. I had to put future updates (hand tracking, puzzle packs, etc.) on hold until I figure this one out. I've been getting positive feedback overall, including a review from UploadVR, but it's hard to get traction in this niche for a single-player game. I'm well aware discoverability is always a challenge, but I'd be super interested to hear how other devs manage building a community around single-player Meta Quest games. What are some best practices for a Meta Quest Store game?

Interlocked: Puzzle Islands - https://www.meta.com/en-gb/experiences/interlocked-puzzle-islands/7115743118544777/

Thanks!
Immersive Exposure - a VR-native photography playground
Hello devs 👋 my name is Corey Reese.

App Trailer

I'm building Immersive Exposure, a VR-native creative sandbox that turns photography into an interactive, game-like experience inside VR. Instead of just watching tutorials or shooting static scenes, users step inside environments, control virtual cameras, lighting, and lenses, and photograph dynamic subjects in real time. The core idea is simple: photography as play, exploration, and mastery, not passive learning.

Current focus:
- Refining spatial interaction and camera feel so it's intuitive for both creators and casual VR users.
- Adding mixed reality support, allowing users to practice with virtual models and lighting in their real spaces.
- Expanding NPC systems with AI so characters can respond, pose, and interact via voice commands.
- Designing repeatable engagement loops (photo challenges, exploration goals, unlockable environments).

Right now we are fine-tuning the camera UX so the controls feel good and respond as closely to a real camera as possible, with depth of field and similar effects. The project started as an immersive training tool featuring real photographers - 30+ shoots across fashion, boudoir, food, commercial, and more, filmed in 8K VR180 so users can go behind the scenes on real shoots - and it has evolved into a creative playground for photographers who enjoy exploration, expression, and practicing anytime. I feel we have the same opportunity as Golf+, who doubled down on real golf users, but with the photography community.

I'll be posting development updates, experiments, and lessons learned here as things progress. Looking forward to learning from others who are building. If you're interested in testing the alpha build, I can send the link so I can get feedback. I've already identified a lot of things that need to be corrected, but it's good to get some new eyes as well.

Starting a New Series: Building a Social VR Game From Scratch (Baby VR)
I just launched Episode 1 of a new video series where I'm building a social VR game from scratch - live, in public, and together with the community. The project is called Baby VR: a chaotic, social VR playground where players embody babies with Gorilla-Tag-style locomotion, tiny legs, and big personalities.

In this first episode, I break down the 4-step process I use to start any VR project:
1. Ideation - finding a spark worth building
2. Validation - checking the market before writing code
3. Forever-Updatable Test - designing for longevity
4. Realistic Scope - defining an MVP you can actually ship

This same process applies whether you're building your first VR prototype or scaling a social experience.

What makes this series different: 👉 it's community-driven. Viewers help decide the game modes, mechanics, and even the first map - and their ideas may ship into the real game.

If you're working in VR / XR / game development, or curious how social VR titles are actually planned and scoped, I think you'll find this useful.

🎥 Episode 1 is live: https://www.youtube.com/watch?v=kIjpDFuGScE

I'd also love to hear your thoughts: what's the most important thing you consider when starting a multiplayer or social product?

#SocialVR #VRDevelopment #GameDevelopment #Unity #PhotonFusion #MetaQuest #IndieDev #XR #Startups