07-23-2016 02:13 PM
Alright, this has been a week of technical difficulties, which is fine, that's how you know you're making stuff!
Basically I've been trying to mock up a room prototype that is *exceedingly* simple by design, and I'm still hitting issues. Right this moment Unity is, well, nope, looks like Unity is actually frozen...sigh. Anyway, lemme just tell you where I'm at since I can't actually show it.
After a lot of fussing over what I was going to make, and a lot of anxiety about feeling like every project idea was sort of shoehorning traditional game mechanics into VR space, I started to really think about what kinds of experiences make sense in an immersive environment where you're fixed in place and rotation is your only movement.
I kept thinking that game makers have an advantage in VR because we're used to designing for a 3D environment where the user has agency over the viewport. However, we're also used to the player actually being able to move around, rather than having their feet glued to one spot. It's a hard limitation to overcome mentally, but trying to force that kind of mobility into the Gear just feels clumsy and inelegant to me.
Hold that thought, because the EasyMovieTexture developer just emailed me back with a bunch of new files so video textures will work in the Gear, and the build just finished and IT WORKS SWEET BABY JESUS THANK YOU THANK YOU THANK YOU.
Ahem, as I was saying. No really, I spent 3 days trying to get that to work, and I did not expect a message sent to an email address I found on an untranslated Korean website to actually pay off, but ASK AND YOU SHALL RECEIVE. Receive sweet sweet code that makes your dreams a reality. Still gotta test it in my project, but this looks promising for Monday's prototype!
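For my own notes, here's roughly how I'm expecting the updated plugin to get wired up once I drop the new files in. I haven't tested any of this yet, so the component and field names below (MediaPlayerCtrl, m_strFileName, and friends) are just my reading of the plugin's demo scene, not gospel:

```csharp
using UnityEngine;

// Rough sketch only: MediaPlayerCtrl and its public fields are my guesses
// from the EasyMovieTexture demo scene, so treat the exact names as
// assumptions until Monday's build confirms them.
public class MonitorFeed : MonoBehaviour
{
    // The plugin's controller, attached to the monitor quad in the scene
    // (the quad it draws onto is assigned in the Inspector in the demo).
    public MediaPlayerCtrl player;

    // Clip sitting in StreamingAssets; the name is just a placeholder.
    public string clipName = "cam_hallway.mp4";

    void Start()
    {
        player.m_strFileName = clipName;  // assumed field name
        player.m_bLoop = true;            // assumed field name
        // The demo scene seems to load and play automatically once the
        // file name is set; if not, the controller exposes Load()/Play().
    }
}
```

If that holds up, each monitor in the room just gets one of these and a clip name.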
So anyway, designing experiences around a fixed-axis rotation. This was my inspiration:
Those garden sprinklers we'd run through in the summer, security cameras, swivel chairs, that one scene from The Martian where he sets up the hexadecimal system to communicate...I'm thinking surveillance, communication, ways to extend your senses and collect information beyond your physical position, monitors as extra eyes. Boom, I know a game like this, it's Five Nights at Freddy's.
I'm not the first person to notice this game would be amazing in VR, and I wonder why there aren't more games at least using it as a model. The interface, the atmosphere and the simple gameplay loop are perfect for the Gear. I can't say I know much about CPU and GPU processing at the moment, but it does seem like these pre-rendered animations and static images would also be pretty cheap resource-wise.
Another thing is that even though it's played with a controller/keyboard, that additional peripheral can be totally eliminated. Did I mention how much I dislike the idea of requiring a controller for a Gear experience? It makes sense for the HTC Vive, where you can effectively see your hands and other things in the environment. It's even, er, *somewhat* passable with the wired versions of the Oculus, where you can never wander too far from a desk. But for the Gear? No. Just imagine the flow: a user puts on a headset where they can't see anything, and then you have to hand them a controller, or worse, they have to feel around for it. Honestly, just thinking about it irritates me. Controllers only if absolutely necessary, and best avoided in my mind.
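To make that concrete, here's the kind of controller-free input I'm picturing: gaze plus a tap on the Gear's touchpad, which (as far as I can tell) reaches Unity as a regular mouse click. Everything named here is a placeholder, not actual project code:

```csharp
using UnityEngine;

// Sketch of gaze + touchpad-tap input, no controller required.
// On the Gear, a tap on the headset touchpad comes through as mouse button 0,
// so a raycast straight out of the camera is all the "pointer" I need.
public class GazeTap : MonoBehaviour
{
    public Camera vrCamera;             // the head-tracked camera
    public float maxGazeDistance = 20f;

    void Update()
    {
        // Touchpad tap registers as a left-click.
        if (!Input.GetMouseButtonDown(0))
            return;

        // Cast from the center of the view, wherever the user is looking.
        Ray gaze = new Ray(vrCamera.transform.position, vrCamera.transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(gaze, out hit, maxGazeDistance))
        {
            // Hypothetical marker component on anything gaze-interactable.
            GazeTarget target = hit.collider.GetComponent<GazeTarget>();
            if (target != null)
                target.Activate();
        }
    }
}

// Placeholder for whatever an interactable ends up being (a monitor, a door button...).
public class GazeTarget : MonoBehaviour
{
    public virtual void Activate()
    {
        Debug.Log(name + " activated by gaze");
    }
}
```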
So what does this all mean for me? Well, personally I think this is the way to go in creating any kind of Gear experience at the moment. I'm mocking up a simple room using the elements of FNaF as a model and seeing where I get from there. Skinning and narrative design, even art direction, are my strengths. I can craft a compelling theme around a set of parts, so I'm gonna build out this prototype, see if it feels right, then see what kinds of experiences arise. Starting with updating these scripts so my movieTextures will build to the Gear! 🙂
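As a starting point, the monitor in the room mockup can be dead simple: one quad acting as the security screen, cycling through pre-rendered feeds whenever it gets activated (by the gaze-tap sketch above, say). Again, names are placeholders rather than finished code:

```csharp
using UnityEngine;

// First stab at the FNaF-style monitor: a quad whose texture cycles through
// pre-rendered "camera" feeds. The feeds could be static renders for now and
// movieTextures later, once they build to the Gear.
public class SecurityMonitor : MonoBehaviour
{
    public Renderer screen;          // the monitor quad's renderer
    public Texture[] cameraFeeds;    // one pre-rendered feed per camera position

    private int currentFeed;

    // Called by the gaze/tap input (or anything else) to flip to the next camera.
    public void NextFeed()
    {
        if (cameraFeeds.Length == 0)
            return;

        currentFeed = (currentFeed + 1) % cameraFeeds.Length;
        screen.material.mainTexture = cameraFeeds[currentFeed];
    }
}
```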
Here's a link to the post on my blog where the images aren't as weird.