Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
experience
Honored Guest
11 years ago

Locomotion and VR - let's talk about control schemes

There are some promising technologies on the horizon which aim at bringing physical motion into the VR world; however, unless Oculus announces their own product, we can reasonably expect those to be something of a niche market. That leaves us with the question of how best to control movement and interaction in a VR environment. This subject has been brought up numerous times, but I haven't seen it develop into final conclusions and standard recommendations. If I've missed that somewhere, please stop reading and send me in the right direction :)

I'd like to discuss control schemes for what IMHO will be the primary (based on quantity) input devices: Handheld Controller (Xbox etc.), and Mouse/Keyboard.
I'd like to discuss what works, what doesn't, what the factors are, and if the best options can be incorporated into an API and updated in the Best Practices Guide. The Best Practices Guide seems to currently cover the theory very well, but doesn't go as far as direct implementation. I think we could benefit from more standardization, not necessarily enforced, but in the least as a reference. I think collectively getting this right will be an important step in promoting VR among consumers, which is why I'm writing this as my first forum post.

A few points:
1. Oculus is targeting a seated experience
2. Oculus has stated that mouse/keyboard is not ideal. It is, however, the primary input method for the majority of PC games.
3. Oculus uses a handheld controller for its demos. This is IMHO the best easy-to-adopt control method (cheap and simple).
4. Games should attempt to optimize both experiences.
5. Getting control wrong has physical side effects due to mismatches with the vestibular system. This subject goes far beyond input controls, but for the sake of this topic I'm focusing purely on input. Bad controls literally make me sick ;-)

I'll start by contributing what I think would theoretically be a great basis for a control scheme on mouse+keyboard in a first person experience (this can easily be adapted to controllers, so I'll just give the single example rather than being redundant). I have not yet used this; I'm still at the discuss/share/theorize stage. One key thing about this is that it mimics many of the ways we interact in real life, translated into discrete categories. I think there are benefits to discretely linking each of these input devices to specific areas of control:
Keyboard = Movement
Mouse = Hands and Interaction
Rift=Head/Perspective


Keyboard - Movement
Standard WASD/Arrow controls, but replace strafe with turn. Perhaps with a configurable "turn speed". Strafing makes me uncomfortable in VR. Doing things in VR that are unnatural IRL (in real life) often accentuates vestibular mismatches (e.g, makes me nauseous). I don't know if turning at a pre-programmed rate will reduce sickness, but I theorize that the predictability of it will help, along with the instant acceleration/deceleration and constant speed of movement. More importantly, I don't like using the mouse to control the camera in any way, and using the arrows to steer allows the mouse to function purely for interaction within the view.
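The scheme above can be sketched as a minimal per-frame update, assuming a top-down 2D position, a configurable turn rate, and constant speed with instant acceleration/deceleration. All names and default values here are hypothetical illustrations, not anyone's actual implementation:

```python
import math

def update_movement(x, y, heading_deg, keys, dt,
                    move_speed=3.0, turn_speed_deg=90.0):
    """Advance a first-person avatar by one frame.

    keys: set of currently pressed keys among {'W', 'S', 'A', 'D'}.
    'A'/'D' turn at a fixed, configurable rate instead of strafing;
    'W'/'S' move at constant speed with no acceleration ramp.
    Heading 0 means facing +y; angles are in degrees.
    """
    if 'A' in keys:
        heading_deg -= turn_speed_deg * dt
    if 'D' in keys:
        heading_deg += turn_speed_deg * dt
    forward = ('W' in keys) - ('S' in keys)  # +1, 0, or -1
    rad = math.radians(heading_deg)
    x += math.sin(rad) * forward * move_speed * dt
    y += math.cos(rad) * forward * move_speed * dt
    return x, y, heading_deg % 360.0
```

The point of the fixed `turn_speed_deg` is the predictability argued for above: the view never moves faster or slower than the player has learned to expect.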

Mouse - Interaction
Controls the cursor/crosshair only; no effect on the camera/perspective.
First off, every control scheme that lets me control the camera with the mouse has made me sick, even when it is limited to a certain axis or threshold. I'm sure this will improve with updated hardware, but I still think vestibular "confusion" will always be there to some degree if you can jerk the camera view.

I like the idea of having the cursor limited to the bounds of the view, but see a few ways that it could be affected by the tracking:
1. Cursor maintains screen position, ignoring tracking. This may be odd when the cursor appears to fly as you look around.
OR
2. Cursor is relative to the in game avatar, until you hit the boundary of the screen/perspective. For example, if you leave the cursor pointed at an object, then look left away from the object, the cursor will remain pointed at the object until it hits the boundary of your screen. I like this approach, as I feel the cursor reflects our real life hands, and this comes closer to mimicking how we look around and interact in real life.
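Option 2 amounts to clamping the cursor's world-space direction against the current view bounds each frame. A minimal sketch, assuming everything is reduced to yaw angles in degrees (the function name and the 45° half-FOV are my own placeholder assumptions):

```python
def clamp_cursor(cursor_yaw, view_yaw, half_fov=45.0):
    """Option 2: the cursor keeps its world-space direction as the head
    turns, but gets dragged along once it would leave the visible field.

    cursor_yaw: world-space yaw the cursor is pointing at.
    view_yaw:   current head/view yaw from tracking.
    half_fov:   half the horizontal field of view, in degrees.
    """
    lo, hi = view_yaw - half_fov, view_yaw + half_fov
    return max(lo, min(hi, cursor_yaw))
```

While the cursor is inside the view, this returns it unchanged (it stays "pointed at the object"); once the boundary passes it, the clamp pulls it along with the view edge.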

To deal with depth mismatches (for example, when the cursor's apparent depth is greater than that of an object in the foreground), perhaps the "cursor" needs to be adapted to function more like a laser pointer, where depth/collision are factored into the cursor design. The design/shape/transparency/etc., and whether you even need a cursor/crosshair at all, will depend on the game. The keyboard would still control interactions that aren't cursor based, such as "Press E to enter".
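The laser-pointer idea boils down to placing the cursor at the first surface the view ray hits, so it can never float behind a foreground object. A sketch, approximating scene geometry with spheres purely for illustration (a real engine would use its own raycast; all names here are hypothetical):

```python
import math

def nearest_hit_distance(origin, direction, spheres, default=10.0):
    """Return the distance along the view ray to the nearest surface,
    falling back to a default 'rest depth' when nothing is hit.

    origin, direction: 3-tuples; direction is assumed normalized.
    spheres: list of ((cx, cy, cz), radius) stand-ins for scene geometry.
    """
    best = default
    ox, oy, oz = origin
    dx, dy, dz = direction
    for (cx, cy, cz), r in spheres:
        lx, ly, lz = ox - cx, oy - cy, oz - cz
        b = dx * lx + dy * ly + dz * lz
        c = lx * lx + ly * ly + lz * lz - r * r
        disc = b * b - c  # standard ray-sphere discriminant
        if disc >= 0.0:
            t = -b - math.sqrt(disc)  # nearer of the two intersections
            if 0.0 < t < best:
                best = t
    return best
```

The cursor/crosshair would then be drawn at `origin + direction * distance`, which keeps its depth consistent with whatever it is pointing at.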

Positional Tracking
The way I would like to have positional tracking affect locomotion is a hybrid approach. Looking to the sides would not affect the direction of movement until you hit a certain threshold. Once you look far enough to hit this threshold, your in game movement (and avatar) will re-center to your perspective. There should be some point of reference to your avatar so that you always know what direction you will move in, and so that you know where the threshold is.
Or, no threshold; tracking affects perspective and does not affect locomotion. I think this is very feasible as long as you know where your in game avatar is pointed. The limitation here is that if you turn all the way around and your avatar is facing the other way, at a certain point I think movement and interaction will become strange. Again, these choices depend on the game.
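The hybrid/threshold variant can be sketched as a tiny per-frame check: head tracking moves the view freely, and only once the head strays past a threshold does the avatar's movement direction re-center to it. The 60° threshold and all names are placeholders, not a recommendation:

```python
def recenter_body(body_yaw, head_yaw, threshold=60.0):
    """Hybrid scheme: looking around within +/-threshold of the avatar's
    facing leaves locomotion untouched; looking past the threshold snaps
    the avatar (and therefore the movement direction) to the head.

    Yaws are in degrees; the wrap-around math keeps the difference
    in [-180, 180) so crossing 0/360 behaves correctly.
    """
    diff = (head_yaw - body_yaw + 180.0) % 360.0 - 180.0
    if abs(diff) > threshold:
        body_yaw = head_yaw
    return body_yaw % 360.0
```

Drawing some reference marker at `body_yaw` (as suggested above) would let the player see both the movement direction and how close they are to triggering the re-center.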

I tried to keep that concise, but there are many points to make. I would happily test iterations of control schemes, but have limited experience with modifying this sort of programming. I'll see what I can throw together if nobody else takes or has taken a crack at this.
Please share your own thoughts and input on input :)

6 Replies

  • drash
    Heroic Explorer
    I hadn't really thought about how the upcoming positional tracking could be used to influence locomotion! I wonder whether it will be comfortable to use positional tracking to strafe. I'm looking forward to some interesting upper-body workouts in my future.

    For existing examples of options and control schemes as they relate to managing your view and locomotion, I would first refer you to these three:


    • Time Rifters demo (v0.5) (hybrid turning and forward direction, visible forward reference)

    • Minecrift (keyhole, control options galore)

    • The Gallery: Six Elements (although I'm not sure that the latest advancements in comfortable locomotion are available for anyone to play)


    And I would also like to mention that despite the recommendation in the Best Practices Guide, I've met people who prefer smoothly accelerated motion/turning and others who prefer instantly accelerated motion/turning, so I think that should also be a player preference.
  • "drash" wrote:
    I hadn't really thought about how the upcoming positional tracking could be used to influence locomotion! I wonder whether it will be comfortable to use positional tracking to strafe. I'm looking forward to some interesting upper-body workouts in my future.

    For existing examples of options and control schemes as they relate to managing your view and locomotion, I would first refer you to these three:


    • Time Rifters demo (v0.5) (hybrid turning and forward direction, visible forward reference)

    • Minecrift (keyhole, control options galore)

    • The Gallery: Six Elements (although I'm not sure that the latest advancements in comfortable locomotion are available for anyone to play)


    One interesting thing I noticed about the DK1 is that as you get used to the various things that disconnect you from presence, you can somewhat adapt, to the point that they no longer break presence as much. In that way, I've come to think of presence as something that can be quantified, with beneficial and detrimental influences. Things that massively break presence tend to also cause sickness. Improvements to the hardware will mean you can overcome the negative influences by force, but that may also hide what could be best practices.

    I'll have to give the second two a spin. I like Time Rifters a lot, but it gave me the strongest simulator sickness I've experienced. In fact, it made me so sick that it helped motivate me to start looking into this with greater focus.

    By the way, Titans of Space is probably my favorite and the most comfortable experience I've had in the Rift to date. I'll be following your work closely!
  • "drash" wrote:
    I'm looking forward to some interesting upper-body workouts in my future.


    I'm expecting significant strength increases in my neck :)
    But seriously, if someone makes content with entertaining exercise, I'd happily spend more time in the Rift and less at the gym.
  • My theory is that a standard 360 pad or a stick motion controller will be fine. In my opinion, walking can be simulated using an inverted pendulum technique and haptic feedback to both feet. This can be achieved by shifting your centre of balance in relation to the direction you're travelling within the VR space. I've been delving into the possibility of creating a device which works on the principles of a balance ball bot to create a feeling of movement under your feet, with a tactile feedback system for each step taken. The tactile feedback can be created by two tactile transducers, with the sound of the footsteps split out in software. Check out SimVibe - http://simxperience.com/Products/SimVibe/SimVibeSoftware.aspx and also check out one of my videos, in which I'm using an early prototype that started to lead me to this conclusion: http://www.youtube.com/watch?v=mJKqRDRO4KI&list=UU0ocR3OgDUZwIDspfh62YeA
  • "PlasmaQuark" wrote:
    My theory is that a standard 360 pad or a stick motion controller will be fine. In my opinion, walking can be simulated using an inverted pendulum technique and haptic feedback to both feet. This can be achieved by shifting your centre of balance in relation to the direction you're travelling within the VR space. I've been delving into the possibility of creating a device which works on the principles of a balance ball bot to create a feeling of movement under your feet, with a tactile feedback system for each step taken. The tactile feedback can be created by two tactile transducers, with the sound of the footsteps split out in software. Check out SimVibe - http://simxperience.com/Products/SimVibe/SimVibeSoftware.aspx and also check out one of my videos, in which I'm using an early prototype that started to lead me to this conclusion: http://www.youtube.com/watch?v=mJKqRDRO4KI&list=UU0ocR3OgDUZwIDspfh62YeA


    Very cool, looks to me like this could have a very natural feel to it. I had considered using positional tracking on leaning as a directional control method, but this takes it a step farther.
  • Update - my Rift has suffered death by coffee, so I can't test anything at the moment. I'm contemplating buying another while I wait for my DK2; I'm hesitant to develop for months without proper testing.
    In the meantime, I'm interested in how controls progress, but can no longer provide my own hands-on feedback.