Snowball Fight is the second major VR experiment coming from Verge of Brilliance.
Backstory

I started a snowball fight game last year at a Facebook Hackathon. At the time I wanted to make an asymmetric multiplayer experience where controller players could play with VR players. I met some awesome game devs at the hackathon, some of whom I still consult with from time to time, so I'd say the hackathon was a success. But wow, there were some basic questions I needed to get out of the way before I could even touch asymmetry in the gameplay design. These are questions that are still, for the most part, unanswered in VR. Questions like "How does one move past the boundaries of the play space?" or "How does the player make snowballs?" or "How does the VR player receive communication from the game that they've been hit?"
These are just the beginning of the questions I needed to answer for my design. So the hackathon design goal is on the back burner for now. First I just need to answer the question of how to make this experience authentic and engaging as a single-player experience.
UI / UX Design Discussion

The physics and controls are one part that I feel is very promising. I started out with a nice physics interaction system made by Tomorrow Today Labs: Newton VR. It's available for Unity developers and you can use it under a Creative Commons license. I truly do love this system and pretty much start here whenever I want to quickly prototype some physics-based interactions in Unity. I suggest reading Nick Abel's blog about physics in VR if you are curious why I'm not just slapping Unity's built-in physics solutions on my controllers and calling it a day. Pick your battles, people.
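To give a sense of what a system like Newton VR saves me from hand-rolling, here's a minimal sketch of the core idea behind velocity-based throwing: the held object follows the hand, and on release it inherits the hand's velocity. This is not Newton VR's actual code, just plain Unity components illustrating the approach; the class and method names are my own.

```csharp
using UnityEngine;

// Minimal sketch of velocity-based grab/throw (the kind of interaction
// Newton VR gives you out of the box). Not Newton VR's API; names are mine.
public class SimpleThrowable : MonoBehaviour
{
    private Rigidbody body;
    private Transform holdingHand;     // set while a hand is gripping this object
    private Vector3 lastHandPosition;
    private Vector3 handVelocity;

    private void Awake()
    {
        body = GetComponent<Rigidbody>();
    }

    public void BeginHold(Transform hand)
    {
        holdingHand = hand;
        lastHandPosition = hand.position;
        body.isKinematic = true;       // let the hand drive the object directly
    }

    public void EndHold()
    {
        body.isKinematic = false;
        body.velocity = handVelocity;  // the throw: inherit the hand's velocity
        holdingHand = null;
    }

    private void FixedUpdate()
    {
        if (holdingHand == null) return;

        // Track the hand's velocity so the release feels like a real throw.
        handVelocity = (holdingHand.position - lastHandPosition) / Time.fixedDeltaTime;
        lastHandPosition = holdingHand.position;

        body.MovePosition(holdingHand.position);
        body.MoveRotation(holdingHand.rotation);
    }
}
```

In practice you'd also smooth that velocity over a few frames so a jittery last sample doesn't ruin the throw; that kind of tuning is exactly why I start from Newton VR instead of rolling my own.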
Anyway, from here there's still a lot of tuning to do to get the interaction design right. I tend to try to communicate as much as I can about the controls without any text. I've still got a ways to go, but I'm getting there. I want it to feel like you're really playing in snow. In the final version I want people building with, tossing, and experimenting with the snow much like one would during the first major snowfall of the year. Ambitious and compelling, yes; easy to communicate, no.
Consideration - What button do people usually press to do stuff? The answer is almost always the trigger (on the HTC Vive). People want to use it for nearly everything. There are grip controls on the Vive controllers as well, but I can't figure out a good time to use them. People barely notice they're there, and players tend to fatigue themselves by squeezing the grip controls too hard. I recommend against using those controls for any common interactions in your game.
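In code terms, "everything on the trigger" can be as simple as the sketch below. It assumes the legacy SteamVR 1.x Unity plugin that shipped alongside the Vive at the time (SteamVR_TrackedObject / SteamVR_Controller); newer SteamVR versions use an action-based input system instead, and the game-side methods referenced in the comments are hypothetical.

```csharp
using UnityEngine;

// Sketch: hang every common action off the trigger and leave the grips alone.
// Assumes the legacy SteamVR 1.x plugin (SteamVR_TrackedObject / SteamVR_Controller).
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class TriggerOnlyInput : MonoBehaviour
{
    private SteamVR_TrackedObject trackedObject;

    private SteamVR_Controller.Device Controller
    {
        get { return SteamVR_Controller.Input((int)trackedObject.index); }
    }

    private void Awake()
    {
        trackedObject = GetComponent<SteamVR_TrackedObject>();
    }

    private void Update()
    {
        // One button for everything: press to scoop/grab, release to throw.
        if (Controller.GetPressDown(SteamVR_Controller.ButtonMask.Trigger))
        {
            // BeginScoopOrGrab();  // hypothetical game-side method
        }

        if (Controller.GetPressUp(SteamVR_Controller.ButtonMask.Trigger))
        {
            // ReleaseOrThrow();    // hypothetical game-side method
        }
    }
}
```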
Sticking point 1 - When is the player touching the ground? How does the player know? So far I have little hand images that appear on the ground when a player is touching it. I used to make players actually reach down and touch the ground with their controllers, but the Lighthouse positional tracking isn't so great all the way down there. Plus it causes more fatigue for a minimal increase in presence. So now people only have to bend down a bit to start scooping.
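The "bend down a bit" rule amounts to a height check: if the controller is within some forgiving distance of the ground, show the hand decal and allow scooping. A minimal sketch of that check follows; the threshold value, the "Ground" layer name, and the component name are assumptions for illustration, not the project's actual code.

```csharp
using UnityEngine;

// Sketch: treat the hand as "touching the ground" once the controller is within
// a forgiving height threshold, and show a hand decal at that spot.
public class ScoopZone : MonoBehaviour
{
    public Transform controller;       // the tracked hand/controller transform
    public GameObject handDecal;       // the little hand image shown on the snow
    public float scoopHeight = 0.6f;   // bend down a bit, not all the way to the floor

    private void Update()
    {
        RaycastHit hit;
        bool nearGround = Physics.Raycast(
            controller.position, Vector3.down, out hit,
            scoopHeight, LayerMask.GetMask("Ground"));

        handDecal.SetActive(nearGround);

        if (nearGround)
        {
            // Project the decal onto the snow just under the hand.
            handDecal.transform.position = hit.point + Vector3.up * 0.01f;
            handDecal.transform.rotation = Quaternion.FromToRotation(Vector3.up, hit.normal);
            // Scooping logic would key off this same state.
        }
    }
}
```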
Sticking point 2 - Different size snowballs mean more variation and fewer players knowing what they are doing when they throw the ball. This is probably my biggest problem at this point. Right now bigger snowballs have more mass and therefore require a faster swing to throw at a decent speed. This is hard for players to grasp because to them the weight of the controller stays exactly the same. Consequently, aiming is nearly impossible for players without help. I'm currently helping players aim with a crosshair in the center of the view. When players line up the crosshair with an adversary, the target turns blue and the snowball goes into homing mode.
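Here's a rough sketch of how that kind of aim assist can be wired up: a ray from the head through the crosshair picks the locked target, and a thrown snowball gently bends its velocity toward it each physics step. The layer name, turn rate, and class name are assumptions for illustration, not the project's actual implementation.

```csharp
using UnityEngine;

// Sketch of the crosshair aim assist: a ray from the head picks a target,
// and a thrown snowball steers toward it without changing speed.
public class AimAssist : MonoBehaviour
{
    public Transform head;                   // the HMD / camera transform
    public float turnDegreesPerSecond = 90f;

    private Transform lockedTarget;

    private void Update()
    {
        // Crosshair check: whatever is dead-center in the player's view.
        RaycastHit hit;
        if (Physics.Raycast(head.position, head.forward, out hit, 50f,
                            LayerMask.GetMask("Targets")))
        {
            lockedTarget = hit.transform;    // here the target would turn blue
        }
        else
        {
            lockedTarget = null;
        }
    }

    // Called each physics step for a snowball thrown while a target was locked.
    public void Steer(Rigidbody snowball)
    {
        if (lockedTarget == null) return;

        Vector3 toTarget = (lockedTarget.position - snowball.position).normalized;
        float speed = snowball.velocity.magnitude;
        float maxRadians = turnDegreesPerSecond * Mathf.Deg2Rad * Time.fixedDeltaTime;

        // Bend the velocity toward the target without changing its magnitude,
        // so a weak throw still arrives as a weak throw.
        Vector3 newDirection = Vector3.RotateTowards(
            snowball.velocity.normalized, toTarget, maxRadians, 0f);
        snowball.velocity = newDirection * speed;
    }
}
```

Note that bending only the direction fixes aim but not the mass problem: a heavy ball thrown with a lazy swing still arrives slowly, which keeps the "bigger means heavier" rule intact.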
Welp. That's enough for now. More on this project later! Thanks for viewing!