Got any VR Dev Tips?

ezone
Protege
Now that the jam submissions are complete I thought it would be interesting to see what tips and tricks everyone has for VR game design.

For me the most important thing was finding ways to reduce motion sickness. A low framerate was the #1 cause of sickness for me. Some of our early builds had a bunch of expensive camera effects like bloom, which dropped the framerate below 50 fps, and the motion sickness was noticeable. As soon as they were removed the framerate got back above 60 fps and all was good.

The other main contributing factor for me is the disconnect between the motion you are seeing and what you are (or aren't) feeling. I still have a problem with games that use key or gamepad controls for movement (rather than head tracking). I guess that's why my favourite games in the jam so far are EpicDragon, Don't Let Go, and Dumpy: Going Elephants.

Anyway, that's my two cents worth - I'm really interested to hear what tips the rest of you have.
Simon Edis - Ezone.com

Freyelise
Honored Guest
I agree with the framerate observation. I don't have a Rift myself yet, but the friend we made the game with was very fussy about framerate when he tested it with the Rift. This would suggest that going for smoothness over fidelity is the winning strategy.

Another thing that I, as a sound designer, thought about was the use of sound as cues for the player. If you want a minimal HUD, or none at all, make sure you have telltale sounds associated with key mechanics (preferably sounds that aren't tiring to the ears even when repeated over and over). For example, have a slightly different sound for when you fire the last bullet in your weapon. I remember playing Day of Defeat way back when, and the "ping" of the last shot in a Garand rifle was such a tell on the battlefield (for better or worse). In Windlands we have two sounds for the wind in your ears as you move faster - one whoosh for safe speed and a more resonant howl for when you reach a fall speed that would kill you.
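
To make that concrete, here's a rough Unity-style C# sketch of the Windlands wind idea - the field names, the speed threshold and the fade time are all just placeholders, not our actual implementation:

```csharp
using UnityEngine;

// Rough sketch: crossfade between a "safe" whoosh and a "danger" howl based on
// how fast the player is moving/falling. All field names and values are placeholders.
public class WindAudioCue : MonoBehaviour
{
    public Rigidbody player;          // whatever provides the player's velocity
    public AudioSource safeWhoosh;    // looping wind sound for safe speeds
    public AudioSource dangerHowl;    // looping howl for lethal fall speeds
    public float dangerSpeed = 20f;   // assumed speed at which a fall would kill you
    public float fadeTime = 0.5f;     // how quickly the two sounds crossfade

    void Start()
    {
        safeWhoosh.loop = true;
        dangerHowl.loop = true;
        dangerHowl.volume = 0f;
        safeWhoosh.Play();
        dangerHowl.Play();
    }

    void Update()
    {
        float speed = player.velocity.magnitude;
        float danger = Mathf.Clamp01(speed / dangerSpeed);   // 0 = safe, 1 = lethal
        float step = Time.deltaTime / fadeTime;

        // ease each source's volume toward its target so the transition isn't abrupt
        safeWhoosh.volume = Mathf.MoveTowards(safeWhoosh.volume, 1f - danger, step);
        dangerHowl.volume = Mathf.MoveTowards(dangerHowl.volume, danger, step);
    }
}
```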

That's probably obvious for any game, not just VR, but I think the added immersion makes all of this that much more important. (And maybe I'm a sound designer who likes to toot his own horn once in a while :twisted: )
Sound guy for Windlands. Listen to and download my (CC-BY) music at http://bureauofbrokensounds.com/

psykomike
Honored Guest

  • Unless you have a treadmill and low-latency 6DoF head tracking, don't let your character run around. It just feels wrong when you are sitting in a chair pushing an analog stick while your character speeds around on rails with running noises.

  • Modern shooters fit perfectly on a standard monitor. You are focusing on crosshairs in the middle of the screen, so you don't need a 110-degree viewing angle - you could do with 2 degrees when you are sniping. With the Development Rift you have a resolution of approx. 400x300 in the typical 1st-person-shooter viewing angle (measured in Team Fortress). We need a Quad HD Rift to play the games made for Xbox 360/PS3 nicely. Adjust your project accordingly.

  • 3rd person games should generally translate very well to the Rift. I had my first good Rift shooting experience with the AngryBots project. It's a 3rd person twin stick shooter going 1st person for the Rift.

  • A curved menu screen is really nice with the head tracking (see the arc-layout sketch after this list). You can't see the corners of a flat menu when it's too big, and it has to be big because the resolution is lacking. Some projects tried multiple screens, which is equally fine - a matter of taste and programming/modelling time. But don't make me turn my head 90 degrees to look at the other menu plane, that's too far!

  • Show your own character in 1st person; it's nice for the illusion. Best is showing the character using the player's actual input method (joypad, keyboard). And don't forget the neck when you're rotating the Rift's cameras.

  • If you use big, fully saturated red/green/blue color areas in your game, you can clearly see the subpixels. A photorealistic setting with the right amount of detail can already mask a lot of the Rift's shortcomings.

  • GPU scaling is the best AA for the Rift. Give it 1920x1200 and 8x MSAA and the picture gets really smooth - you can't see the individual pixels anymore. I think it also reduces the headache I'm getting from the Rift more than a >60 fps framerate does, but scientific evidence is scarce. Blurry is good enough for me; I'm short-sighted and trained with blurriness. I also tried SSAA, but it produced more artifacts than GPU scaling on my Radeon.

  • Implement Magnetometer Correction. It's very important when you play longer than a few minutes. We're also resetting the Rift's orientation when switching between 1st and 3rd person, which helps a little.

  • Geometric detail is really nice in 3D. My favorite 3D experience is still the Panasonic 3D demo disc - it has dirt bikes in the mud and dew on grass. Mhmmmmm
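
To illustrate the curved-menu point above, here's a rough Unity-style C# sketch that just lays flat panels out on an arc around the head so each one faces the player. The panel list, radius and angular spacing are made-up values, not anyone's actual menu code:

```csharp
using UnityEngine;

// Rough sketch: arrange flat menu panels on an arc around the head so each one
// faces back toward the player. All fields and numbers are illustrative.
public class CurvedMenuLayout : MonoBehaviour
{
    public Transform head;          // the tracked camera
    public Transform[] panels;      // flat quads carrying the menu content
    public float radius = 2f;       // metres from the head
    public float angleStep = 30f;   // degrees between neighbouring panels

    void Start()
    {
        for (int i = 0; i < panels.Length; i++)
        {
            // centre the arc in front of the head, spreading panels left/right
            float angle = (i - (panels.Length - 1) * 0.5f) * angleStep * Mathf.Deg2Rad;
            Vector3 offset = new Vector3(Mathf.Sin(angle), 0f, Mathf.Cos(angle)) * radius;
            panels[i].position = head.position + head.rotation * offset;

            // turn each panel so it faces the head rather than sitting on a flat plane
            panels[i].rotation = Quaternion.LookRotation(panels[i].position - head.position);
        }
    }
}
```

Keeping the arc inside roughly 60-90 degrees total avoids the "turn your head 90 degrees" problem mentioned above.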

Serk
Explorer
This list is not specific to VR; it's more of a post-mortem with the things that I learnt during the VRJam. I'll try to split it up so you can read only the bits you care about (and of course, this is all my personal opinion :D)


On immersion

- Immersion and the feeling of presence are KING with the Rift. If you go for immersion, it feels like you are standing on a giant's shoulders - you can easily reach heights you wouldn't have dreamt of before.

- However, you still have to do your part. A badly placed UI element, text that is too uncomfortable to read, game objects at the wrong scale, etc... can all shatter the illusion and remind the player that she is wearing the Rift and playing a game, rather than being in a (virtual) world.

- Do not underestimate sound. This is true of all immersive games, but it's especially important in VR. Turning your head with headphones on means the sound you hear in each ear changes according to your orientation, which makes it easier to perceive sounds in 3D and locate their source. That's something you can't experience on a monitor, and if you use it right it will increase the feeling of presence.
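
As a concrete (if simplified) Unity example of what that means: keep the AudioListener on the tracked camera and make key cues fully 3D. The sketch below uses the newer spatialBlend API and made-up distance settings; in older Unity versions the 3D flag lived on the AudioClip import settings instead.

```csharp
using UnityEngine;

// Minimal sketch: play a cue as a fully 3D sound at a world position so the
// player can locate it by turning their head (the AudioListener should sit on
// the tracked camera). Clip handling and distance values are assumptions.
public static class SpatialCue
{
    public static void PlayAt(AudioClip clip, Vector3 worldPosition)
    {
        var go = new GameObject("SpatialCue");
        go.transform.position = worldPosition;

        var source = go.AddComponent<AudioSource>();
        source.clip = clip;
        source.spatialBlend = 1f;                         // fully 3D, not flat stereo
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.maxDistance = 30f;
        source.Play();

        Object.Destroy(go, clip.length);                  // clean up after playback
    }
}
```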


On controls

- Try to have as few controls as possible, and make them as intuitive as possible. When inside the Rift, the player can't see the gamepad buttons or keys on the keyboard, so you should only use buttons or keys that are within easy reach, and stick to them. "Press M to open the map" does not really work.

- For the same reasons, learning controls mid-game is problematic. The player should ideally know the controls before they enter the actual experience.

- Looking is a control. Head tilting is a control. And they are both analog!
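
For instance, head roll can be read as an analog lean value with something like the sketch below - the axis convention and the maximum angle are assumptions, not taken from any particular game:

```csharp
using UnityEngine;

// Tiny sketch: treat head tilt (roll) as an analog input, e.g. for leaning or
// steering. The roll axis convention and the max angle are assumptions.
public class HeadTiltInput : MonoBehaviour
{
    public Transform head;             // the tracked camera
    public float maxRollDegrees = 30f; // tilt that maps to full input

    // Returns a lean value in [-1, 1] derived from how far the head is tilted.
    public float Lean()
    {
        float roll = head.localEulerAngles.z;
        if (roll > 180f) roll -= 360f;                    // map 0..360 to -180..180
        return Mathf.Clamp(roll / maxRollDegrees, -1f, 1f);
    }
}
```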


On UI

- 3D elements fully integrated into the scene seem to be the most natural and least jarring option. That is especially important if you go for immersion. Lighting the 3D elements in the same way the rest of the scene is lit also seems to help make them feel more cohesive.

- If you go for a HUD, try not to glue it to the player's face. If the HUD moves with your view, it will always be on the periphery of your field of view, making it hard to focus on. Consider having the HUD "lag" a bit, or be otherwise loosely coupled to the player's head, so that she can actually focus on it and read its text (see the sketch after this list).

- If you are going to show a flat UI element that eats up most of the player's vision (loading screen, tutorial text, etc...), DO NOT glue it to the player's head. Allow the player to look away. Consider that element a virtual monitor or screen which is actually inside the virtual world.
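
A minimal Unity-style C# sketch of the "lagging" HUD idea mentioned above - the distance and follow speed are placeholder values:

```csharp
using UnityEngine;

// Sketch of a loosely coupled HUD: instead of parenting the HUD to the camera,
// ease it toward a point in front of the head every frame so it lags a little
// and the player can turn to look straight at it. Values are placeholders.
public class LaggingHud : MonoBehaviour
{
    public Transform head;          // the tracked camera
    public float distance = 1.5f;   // metres in front of the head
    public float followSpeed = 3f;  // lower = more lag

    void LateUpdate()
    {
        Vector3 target = head.position + head.forward * distance;
        transform.position = Vector3.Lerp(transform.position, target,
                                          followSpeed * Time.deltaTime);

        // keep the HUD plane facing back toward the player
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```

The same trick works for the big "virtual monitor" case: put the loading screen or tutorial text on a quad in the world and let the player look away from it.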


On jamming

- Testing is a pain. A small installed base means it's hard to find people to test your game before releasing it (especially when posting a link in a forum like this is the same as releasing it). Plan for this.

- Follow what the game is trying to tell you. Dreadhalls was originally a stealth game, but it mutated into a horror game on its own after I started playing it inside the Rift. Had I tried to stick to the original stealth mechanics, it would probably be a poorer game by now.

- Focus on making as much progress as early as possible. Write down a list, sort it by priority and start from the top. Make it work first, polish it later. Do not spend days on small details while your main mechanics are still undone. In Dreadhalls, I wanted to make a more advanced level generator; it turned out a rough one was quite enough for a 3-week prototype.


On procedural horror

- Relying on procedural content is a double-edged sword. It improves replayability (you might know all the monsters and scares in the game, but you don't know when they are going to show up), but it has its drawbacks. Procedural content tends to be blander than authored content, and it tends to generate edge cases (levels that are too hard, or too easy).

- Using a hybrid approach (procedurally combining authored content, or using authored content as a guideline for the procedural generation) seems to work best. Dreadhalls' rooms are chosen from a template list, and the objects' placements are predetermined. I figured this out too late to make good use of it throughout the full game, though.

- Horror is more about pacing and subverting expectations than about "scares". It's the fear of having something jump at you, rather than the actual something jumping at you, that makes you really afraid. You don't actually need to show something horrible to scare the player, just imply that she is about to see something horrible. The player's mind will probably imagine something far more horrible than anything you can show.

- In Dreadhalls, pacing is done procedurally. However, it is done on a per-element basis rather than in a holistic way. That is a mistake, since it means several different elements can peak in intensity at the same time, stepping on each other's toes. A central "director" that controls all the game elements' intensities is probably a good idea.
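
Purely as an illustration of what such a director might look like (all the names and numbers below are made up, this isn't Dreadhalls code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of a central "director": scare elements ask permission
// before spiking their intensity, and the director only grants it while the
// combined intensity stays under a budget, so scares don't pile up at once.
public class Director : MonoBehaviour
{
    public float intensityBudget = 1f;     // how much can be happening at once
    public float decayPerSecond = 0.1f;    // how fast tension fades back down

    readonly Dictionary<string, float> intensities = new Dictionary<string, float>();

    public bool RequestSpike(string element, float amount)
    {
        float total = 0f;
        foreach (var value in intensities.Values) total += value;
        if (total + amount > intensityBudget) return false;   // too much going on

        float current;
        intensities.TryGetValue(element, out current);
        intensities[element] = current + amount;
        return true;
    }

    void Update()
    {
        // let each element's intensity fade over time so new scares become possible
        var keys = new List<string>(intensities.Keys);
        foreach (var key in keys)
            intensities[key] = Mathf.Max(0f, intensities[key] - decayPerSecond * Time.deltaTime);
    }
}
```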


Thanks for reading this far. Hope it was interesting!

ChrisJD
Honored Guest
"Serk" wrote:
- If you are going to show a flat UI element that eats up most of the player's vision (loading screen, tutorial text, etc...), DO NOT glue it to the player's head. Allow the player to look away. Consider that element a virtual monitor or screen which is actually inside the virtual world.


I agree with that but with one exception.

When a game is first started, the "forward" direction in a lot of the entries and demos I have tried (I haven't tried Dreadhalls yet) has been whatever direction the Rift happened to be facing when the game was launched. This bugs me no end because I tend to have my Rift sitting sideways on my desk and I frequently forget to hold it in the direction I'm looking while starting games. Sometimes the way to re-center is obvious, sometimes it isn't, and sometimes there is no way to do it. Even if there is a clear instruction right at the start, sometimes you have to look around to find where the menu is just to see what the re-center button is.

My solution for this in the jam was to have the very first thing that appears be a solid black screen with the centering instructions (I also show a warning here if no magnetic calibration data is found) attached in front of the camera. To continue past it, the user follows the simple instruction "Look forward and press Start to begin". Doing it this way avoids the user having to look around to find a menu that may not be in front of them.

After that, no UI-style elements are attached to the camera.
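
For what it's worth, a stripped-down Unity-style sketch of that startup flow - Recenter() is just a stand-in for whatever orientation-reset call your Rift integration provides, and the "Start" button mapping is an assumption:

```csharp
using UnityEngine;

// Sketch of the startup flow described above: a black quad with the centering
// instructions parented to the camera, dismissed by "Look forward and press
// Start". Recenter() is a placeholder; the button name is an assumption.
public class StartupCenteringScreen : MonoBehaviour
{
    public GameObject instructionScreen;   // black quad + text, parented to the camera

    void Update()
    {
        if (instructionScreen.activeSelf && Input.GetButtonDown("Start"))
        {
            Recenter();                            // treat the current head direction as forward
            instructionScreen.SetActive(false);    // hand over to the game proper
        }
    }

    void Recenter()
    {
        // e.g. the orientation-reset call in the old Oculus Unity integration,
        // or UnityEngine.XR.InputTracking.Recenter() in later Unity versions.
    }
}
```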

ezone
Protege
"ChrisJD" wrote:
My solution for this in the jam was to have the very first thing that appears be a solid black screen with the centering instructions (I also show a warning here if no magnetic calibration data is found) attached in front of the camera.


Great idea - I'm going to add this to future updates.
Simon Edis - Ezone.com

klyemar
Honored Guest
I found that implementing UI elements in world space wasn't enough for me; they had to be able to react to the world as well. A three-dimensional user interface is great for the player, but the effect is completely ruined when the crosshair that appears to be three feet away from you is clipping through the wall that is one foot away from you. The same goes for indicators and text - your eyes are not physically built to focus on a far object inside of a near object.

To correct for this, I've found that fading the GUI works well in some cases (this is the case for the default Unity SDK crosshair) and allowing the GUI to stick to physical objects works well in others. The crosshair in Gargantua detects collisions by raycasting from the camera to a designated point in world space, and moves the crosshair along with the object it is colliding with. I hadn't intended this, but that behaviour also allowed for more accurate interactions with the world, since I was aiming with an object that existed in, and was affected by, the world around it.
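
A stripped-down sketch of that kind of crosshair behaviour in Unity-style C# (my field names and values, not the actual Gargantua code): raycast from the camera each frame and park the crosshair on whatever it hits, so it can't clip through nearer geometry:

```csharp
using UnityEngine;

// Sketch: raycast from the camera every frame and place the world-space
// crosshair on whatever the ray hits, so it never clips through geometry that
// is closer than its default distance. Field names and values are placeholders.
public class WorldCrosshair : MonoBehaviour
{
    public Transform head;               // the Rift camera
    public float defaultDistance = 3f;   // where the crosshair sits in open space
    public float surfaceOffset = 0.02f;  // keep it just in front of the hit surface

    void LateUpdate()
    {
        RaycastHit hit;
        if (Physics.Raycast(head.position, head.forward, out hit, defaultDistance))
            transform.position = hit.point - head.forward * surfaceOffset;
        else
            transform.position = head.position + head.forward * defaultDistance;

        transform.rotation = Quaternion.LookRotation(head.forward);
    }
}
```

Because the ray is recomputed each frame, the crosshair naturally follows a moving object it is resting on, which is the behaviour described above.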

deice
Honored Guest
"psykomike" wrote:

  • GPU scaling is the best AA for the Rift. Give it 1920x1200 and 8x MSAA and the picture gets really smooth - you can't see the individual pixels anymore. I think it also reduces the headache I'm getting from the Rift more than a >60 fps framerate does, but scientific evidence is scarce. Blurry is good enough for me; I'm short-sighted and trained with blurriness. I also tried SSAA, but it produced more artifacts than GPU scaling on my Radeon.

I'll disagree with that; I feel native resolution is always best. Rescaling just loses detail and wastes processing power compared to rendering and antialiasing at native resolution. But some people seem to prefer the loss of detail, so maybe some kind of blur filter would be beneficial. I think this will solve itself with the higher-resolution Rift, however.

Some technical things I've noticed when playing demos (some were said above, but they're important enough to repeat):

  • FPS is king. Framerate drops that induce latency in head tracking are the only thing that can nauseate me in the Rift. Pay extra-special attention to this.

  • Don't glue the UI to the user's face, since that makes only part of the UI accurately visible. In-world 3D UIs seem to work best.

  • In 3D, turn the walking direction toward where the user is looking (see the sketch after this list). If the walking-forward direction is controlled separately, it feels very strange, like your body is a mech that does not walk where the head is facing (and you have no way to know which way the body is facing).

  • Automatically launch the game on the Rift DK monitor. Don't make me figure out obscure parameters or drag the window and Alt-Enter it to get it to start on the Rift. (I don't clone displays because that's a stopgap measure.)

  • Lock the mouse cursor and controls to the application while in the Rift. Many applications allow you to click outside the game, and it's very jarring to suddenly click outside the window and be tossed back to the desktop.
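
A small Unity-style sketch of the "walk where you look" point above, assuming a CharacterController and the standard Horizontal/Vertical input axes; the head's forward vector is flattened onto the ground so looking up or down doesn't tilt the movement:

```csharp
using UnityEngine;

// Sketch: derive the walking direction from where the head is looking, with the
// look vector flattened onto the ground plane. Axis names and speed are assumptions.
[RequireComponent(typeof(CharacterController))]
public class GazeDirectedMovement : MonoBehaviour
{
    public Transform head;      // the tracked camera
    public float speed = 3f;    // metres per second

    CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // project the look direction onto the ground so pitch doesn't tilt movement
        Vector3 forward = head.forward;
        forward.y = 0f;
        forward.Normalize();
        Vector3 right = Vector3.Cross(Vector3.up, forward);

        Vector3 move = forward * Input.GetAxis("Vertical") + right * Input.GetAxis("Horizontal");
        controller.SimpleMove(move * speed);   // SimpleMove applies gravity for us
    }
}
```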

attila
Honored Guest
I have some ideas here 🙂

How about playing The Matrix, but for real?
Or how about playing a movie instead of just watching it - being part of the story, changing the story on the fly...

A few weeks ago I had a very interesting and challenging job: I combined our tool, the MVN full-body motion-tracking system, with the Oculus Rift in Unity3D.

The result is amazing! We actually brought The Matrix to reality... (OK, it was always possible, but now anyone can try it).
We streamed 2 people into 1 VR game and let them play... no motion sickness or any sign of disorientation after hours of playing. The actors forget where they are, which you can notice when they remove the Oculus. 🙂
Our solution allows streaming 4 people at the same time into the same game. It is also possible to have a real operator who can change the game in real time - load a scene, change the weather or add weapons. 🙂

I'm sure any game developer watching this video will get hyped with new ideas for future games!

http://www.youtube.com/watch?v=LtMfrkRqlRs


PS: Sorry for posting this link again, but it was hard to find the right place to post after not being able to create a new topic.