Drone Core Command
Summary: Drone Core Command is a gesture-based pet combat game where players command their pet mech to take control of the battlefield. Players can customize their loadout and mech abilities using "drone cores", and will be able to repair their mech after a battle back at their hub. This first week of development focused on establishing viability for the concept and the main features that will be used in combat. Players can use their left hand to spawn a series of drone cores to pick from, then install one in the gun mounted on their right arm for either direct damage or mech targeting.

Features:
Inventory - players can gesture to open their inventory and select a drone core to place in their arm gun
Arm gun - either a source of direct damage or a targeting tool for the player's pet drone
Mech stats - a display mounted on the player's inventory arm that shows the state of the mech and any selected drone cores being used
Autonomous vs. player-assisted targeting - if players decide to use a drone core that is for mech control, they can point at their desired targets to direct their mech (a rough sketch of this pointing-based target selection is at the end of this post)
Map overview - this will be the way players can move their mech around the field at greater distances. I'm not sure this feature will make it all the way to the end of the competition, but I think it's still worth testing to explore additional game-board-like gesture control in a boardgame-style setting. At the very least, the map overview will provide battlefield stats.

UX Challenges
Context-sensitive and accurate gesture detection is pretty challenging. Hand poses that are detected at the right time, and in the right context, are an important part of the game feeling intuitive and easy to play. This includes designing the game so the player never needs to block the hand-tracking sensors' view of their hands, which can cause detection to hitch.
No player movement - this game is designed specifically to bring the world to the player rather than move the player around the world. Players won't need to move around, as the focus is on loadout interfaces and battlefield control from a distance.

Next Steps
Player journey - building out the framework of the game so players can start at a main menu scene, load into their hub, deploy to the battlefield, and return to the hub.
Basic hub functionality - this is where players can select drone cores from their library (or maybe even augment a drone core for additional player agency), select and review a mission map, and repair their mech from previous battle damage. A stretch goal here would be to build out the gesture-based mech repair system.
Map overview iteration - I'm hoping to build a simple pick-up-and-place-the-mech system on the battlefield map to move the mech across larger distances, while maintaining good player visibility for enemy targeting.

Inspiration
IP - I have always enjoyed mech-type IP. My early days of gaming included running around in a Timber Wolf.
Input - I did a lot of research into motion-controller interactions when I built MageWorks (VR game on Quest), and into targeting systems when I built BlastPoint (mobile AR game). Taking those learnings to the next step with gestures is part of my motivation for this project.
Gameplay - I'm looking to combine pet-based combat and player agency (e.g. as exhibited in WoW's pet-based player classes like the hunter and warlock) with a hybrid strategic/tactical style of play, much like old-school games such as Final Fantasy Tactics Advance. That's where I want to try moving the mech around different quadrants/regions as a form of strategy, both to keep it alive and to give players a sense of target priority at both a macro and micro scale.

Credits
While I'm working on this project as a solo dev, I'm using the marketplace for art, sound, and animation assets as placeholders while I focus on scripting the game in UE5. Many thanks to the marketplace community, forum contributors, and the overall XR community for the many resources they have made available over the years.

If you made it this far, thanks for reading, and feel free to reach out with any questions or feedback. Once I get a build up and running in a pre-alpha channel on the Quest app store, I'm happy to add folks to the build for testing. In the meantime, I'll try to keep this thread updated as the game progresses.
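For anyone curious about the player-assisted targeting mentioned above, here is a minimal, engine-agnostic C++ sketch of cone-based target selection from a pointing ray. This is not the actual UE5 implementation; all types, names, and thresholds are illustrative placeholders.

```cpp
// Minimal sketch of pointing-based target selection (not the actual game code).
// Assumes the hand-tracking layer already provides a pointing origin/direction;
// all names and thresholds here are illustrative placeholders.
#include <cmath>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(const Vec3& a, const Vec3& b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(const Vec3& v)                   { return std::sqrt(Dot(v, v)); }
static Vec3  Normalized(const Vec3& v)               { float len = Length(v); return {v.x / len, v.y / len, v.z / len}; }

struct Enemy {
    int  id;
    Vec3 position;
};

// Pick the enemy closest to the pointing ray, inside a selection cone.
// rayOrigin/rayDir would come from the tracked index finger or palm (assumed non-zero).
std::optional<int> PickTarget(const Vec3& rayOrigin,
                              const Vec3& rayDir,
                              const std::vector<Enemy>& enemies,
                              float maxRangeMeters  = 30.0f,   // assumption: battlefield scale
                              float maxConeAngleDeg = 10.0f)   // assumption: forgiving aim cone
{
    const Vec3  dir      = Normalized(rayDir);
    const float cosLimit = std::cos(maxConeAngleDeg * 3.14159265f / 180.0f);

    std::optional<int> best;
    float bestCos = cosLimit;  // require at least the cone threshold

    for (const Enemy& e : enemies) {
        const Vec3  toEnemy = e.position - rayOrigin;
        const float dist    = Length(toEnemy);
        if (dist > maxRangeMeters || dist <= 0.0f) continue;

        // Angular closeness to where the player is pointing.
        const float cosAngle = Dot(dir, Normalized(toEnemy));
        if (cosAngle >= bestCos) {
            bestCos = cosAngle;
            best    = e.id;  // this enemy becomes the mech's priority target
        }
    }
    return best;
}
```

In the actual game this would run against UE5 actors and the hand-tracking pose data rather than plain structs, but the cone test is the core of the idea.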
Stringscape: Turning Hand Distance into Pitch
I’m currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here. The concept is simple: you stretch a glowing “string” between your two hands, and the world-space distance between them controls the pitch.
Closer hands → higher pitch
Farther apart → lower pitch
The experience is designed to be more of a creative playground than a structured music tool. I’d love to hear your thoughts. It’s currently in Early Access on Quest as well, if anyone is curious to try it. Thanks!
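For the curious, here is a minimal sketch of how the distance-to-pitch mapping could work, written as plain C++ rather than the actual project code; the distance range and frequency bounds below are assumptions, not the shipped values.

```cpp
// Minimal sketch of mapping hand distance to pitch (not the shipped Stringscape code).
// Distance range and frequency bounds below are illustrative assumptions.
#include <algorithm>
#include <cmath>

// distanceMeters: world-space distance between the two tracked hands.
// Returns a frequency in Hz: closer hands -> higher pitch, farther apart -> lower pitch.
float DistanceToPitchHz(float distanceMeters,
                        float minDist = 0.05f,   // hands nearly touching
                        float maxDist = 1.2f,    // roughly full arm span
                        float lowHz   = 110.0f,  // A2 at maximum stretch
                        float highHz  = 880.0f)  // A5 when hands are close
{
    // Normalize distance to [0, 1], where 0 = closest and 1 = farthest.
    float t = std::clamp((distanceMeters - minDist) / (maxDist - minDist), 0.0f, 1.0f);

    // Interpolate exponentially: t = 0 -> highHz (close hands), t = 1 -> lowHz (stretched string).
    return highHz * std::pow(lowHz / highHz, t);
}
```

The exponential (rather than linear) interpolation keeps the musical interval per centimeter roughly constant across the string, which tends to feel more natural than a linear frequency ramp.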
Mixed Reality with Unity and Meta SDK Test
Hi, I have been developing in Meta Horizon since 2020 and have learned UnityXR/MR. I will graduate with a master's degree in Art and Technology in May 2026. For my final project I will be working on a Mixed Reality interaction for dyslexic learners with hand tracking. I will be applying for the smart glasses grant for accessibility. I've been in education for the past 19 years, teaching students with dyslexia for the past ten years. This video shows my first test. Link and image below. Mixed Reality Test, Quest 3: Mixed Reality Test, Unity and Meta SDK by Tina Wheeler
Recreating Meta’s new AR glasses on my Quest
Hey guys, here’s a cool project I did last week that I wanted to share: Recreating Meta’s new AR glasses on my Quest. 😎 This project reproduces the new wristband, using microgestures to navigate through the UI. I also built my own hand tracking implementation for the pinch-and-twist mechanism, which controls the volume of the audio player and the zoom of the camera just like in the keynote (a rough sketch of the twist-to-volume idea is below). But my favourite addition is definitely the contextual AI that lets me send whatever I’m looking at to an AI and instantly get more information. This was pretty fun to do, but it also helped me think about how future experiences could be designed for this new device!
https://www.linkedin.com/feed/update/urn:li:activity:7377329573287391232/
https://www.linkedin.com/feed/update/urn:li:activity:7376702804146429953/
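For reference, here is a minimal, engine-agnostic C++ sketch of one way a pinch-and-twist volume control like this could be wired up; it is not the exact code from the project, and the pinch threshold and twist sensitivity are placeholder assumptions.

```cpp
// Minimal sketch of a pinch-and-twist volume control (not the exact project code).
// Thresholds and sensitivity below are placeholder assumptions.
#include <algorithm>

struct PinchTwistVolume {
    bool  pinching     = false;  // are we currently in a pinch?
    float lastTwistRad = 0.0f;   // wrist roll angle captured last frame
    float volume       = 0.5f;   // current volume, 0..1

    // thumbIndexDistMeters: distance between thumb tip and index tip.
    // wristTwistRad: roll of the hand about the forearm axis, from hand tracking.
    void Update(float thumbIndexDistMeters, float wristTwistRad) {
        const float kPinchOn     = 0.02f;  // assumption: 2 cm counts as a pinch
        const float kPinchOff    = 0.04f;  // hysteresis so the pinch doesn't flicker
        const float kRadToVolume = 0.25f;  // assumption: ~4 rad of twist spans full volume

        if (!pinching && thumbIndexDistMeters < kPinchOn) {
            pinching = true;
            lastTwistRad = wristTwistRad;   // start measuring twist from here
        } else if (pinching && thumbIndexDistMeters > kPinchOff) {
            pinching = false;               // pinch released, stop adjusting
        }

        if (pinching) {
            // Only the *change* in twist since last frame moves the volume,
            // so re-pinching anywhere doesn't cause a jump.
            float delta = wristTwistRad - lastTwistRad;
            lastTwistRad = wristTwistRad;
            volume = std::clamp(volume + delta * kRadToVolume, 0.0f, 1.0f);
        }
    }
};
```

The same relative-delta approach works for the camera zoom; only the scale factor and the value being adjusted change.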