Forum Discussion
JimT
11 years ago · Honored Guest
The Pointman Avatar Control
Hi,
I’d like to make this community aware of a very capable, yet low cost user interface for increasing the user’s control over their avatar in ‘first-person’ & ‘tactical’ shooter games. It’s called “Pointman”.
The Immersive Simulation Lab (ISL), at the U.S. Naval Research Laboratory, developed Pointman over the last decade as a user interface for dismounted infantry simulation, at the squad level.
Pointman is a seated interface designed to provide a high level of control over the user’s avatar by engaging the user’s head, hands, and feet.
Pointman was originally intended for use with an HMD; however, we have been using head-coupled control over the view & aim with a fixed-screen desktop display until a viable high-fidelity, low-cost HMD became available. I believe the Oculus Rift DK2 may be the HMD we have been waiting for. I received a demo last week and was impressed by the significant progress made in its display & tracking capabilities.
Please take a look at the Pointman Wikipedia page for further explanation of how it works:
http://en.wikipedia.org/wiki/Pointman_(user_interface)
There is also a YouTube video showing how it works:
https://www.youtube.com/watch?v=rsEGcJummEw
Unlike the majority of simulation-based avatar controllers that use canned animations to control the avatar, Pointman provides continuous positional control over twelve degrees-of-freedom of the avatar’s posture. It employs a 6-dof positional head tracker, a dual-stick gamepad and sliding foot pedals to provide a balanced and natural assignment of control. The user moves his head and upper body to control looking and aiming, and leaning to duck and peek around cover; his hands to operate virtual weapons and direct movement; and his feet for stepping and controlling his avatar’s postural height.
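To make the division of control concrete, here is a minimal sketch of how the three devices might combine into one posture sample. This is my own illustration, not NRL's actual code; all type and field names are hypothetical, and the lean term in particular is a guessed mapping.

```python
# Hypothetical sketch of a Pointman-style control mapping (names are
# illustrative, not NRL's API). Each device contributes positional --
# not rate-based -- control over the avatar's posture.
from dataclasses import dataclass

@dataclass
class AvatarPosture:
    # 6 DoF from the positional head tracker: look/aim pose
    head_x: float; head_y: float; head_z: float
    head_yaw: float; head_pitch: float; head_roll: float
    # thumbsticks: course (direction of travel) and heading (torso facing)
    course: float
    heading: float
    # sliding pedals: step phase of each foot plus overall postural height
    left_step: float
    right_step: float
    stance_height: float
    lean: float  # duck/peek, derived here from lateral head offset

def update_posture(head_pose, sticks, pedals) -> AvatarPosture:
    """Combine the three devices into one 12-DoF posture sample."""
    return AvatarPosture(
        *head_pose,                      # 6-DoF tracked head pose
        course=sticks["left"],           # left stick steers course
        heading=sticks["right"],         # right stick turns the torso
        left_step=pedals["left"],        # sliding pedal positions map
        right_step=pedals["right"],      #   directly to stepping
        stance_height=pedals["height"],  # pedal height -> crouch depth
        lean=head_pose[0] * 0.5,         # lateral head offset -> lean
    )
```

The point of the sketch is that every field is set from a measured position each frame, rather than from a triggered animation.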
To date, Pointman has only been fully integrated into the Virtual Battlespace-3 (VBS3) combined arms simulator from Bohemia Interactive Simulations, the system currently used by the Marine Corps’ Deployable Virtual Training Environment. We are currently working in-house to integrate it with the Unity game engine to support our ongoing research in expressive interaction.
------
Before developing Pointman, ISL developed a full-body tracked user interface called “Gaiter”, in which the user wields a rifle prop and walks in place to walk through the virtual world. Although walking in place works fine for leisurely touring a building, it does not support the wide range of tactical movements required for combat simulation very well. Keeping the virtual rifle (seen in the HMD) aligned precisely enough with the physical rifle to avoid negative training also remains an unsolved problem.
After this learning experience we shifted our goal from trying to achieve physical realism to seeking behavioral realism. The goal became to allow the user to make his avatar react in the virtual world as he would have physically moved his body in the corresponding real-world situation.
-JimT
8 Replies
- Fredz (Explorer): I didn't know about this one, thanks for posting. I've been maintaining a list of locomotion devices for virtual reality here; I added it in the stepping section.
- Anonymous
"Fredz" wrote:
I didn't know about this one, thanks for posting. I've been maintaining a list of locomotion devices for virtual reality here, I added it in the stepping section.
Cool wiki! Just a heads-up: the first YouTube link on the Viiwok page actually leads to a Cyberith Virtualizer video.
- Fredz (Explorer): Corrected, thanks. I've also put it in the rolling section since it's been known to use ball bearings.
- mptp (Explorer): This is pretty cool!
I feel like with some adaptations it would be great for VR - notably the gait would need to be smoothed out, since I imagine the stop-start nature of the movement as seen in the video would lead to sim sickness for many users. Also, somehow it needs to be made possible to turn using the feet only, since the whole point of locomotion controls (in my opinion) is that they free the hands for natural input.
I'll add this to my own (far less extensive than Fredz') list of locomotion devices on this forum. :)
- JimT (Honored Guest):
"mptp" wrote:
I feel like with some adaptations it would be great for VR - notably the gait would need to be smoothed out, since I imagine the stop-start nature of the movement as seen in the video would lead to sim sickness for many users.
We are currently filtering the motion at two levels; granted there is (always) room for improvement. You should have seen how bad the optic flow was when the pelvic motion was tied directly to the pedals’ motion (scissor walking)! It’s desirable to have a certain amount of head sway to make it feel natural.
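As a toy illustration of what "filtering the motion" can mean here, the sketch below applies a simple exponential low-pass filter to the raw pedal-driven pelvis height. This is my own assumption about one plausible smoothing stage, not NRL's actual filter; it deliberately leaves some residual sway rather than flattening the motion entirely.

```python
# Minimal sketch (an assumption, not NRL's implementation) of smoothing
# raw pedal-driven pelvis motion with an exponential low-pass filter.
class PelvisSmoother:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # 0 < alpha <= 1; lower alpha = smoother output
        self.value = None    # filter state, seeded on first sample

    def update(self, raw_pelvis_height: float) -> float:
        """Blend each new raw sample into the running estimate."""
        if self.value is None:
            self.value = raw_pelvis_height
        else:
            self.value += self.alpha * (raw_pelvis_height - self.value)
        return self.value
```

With a moderate alpha the sharp scissor-walking oscillation is attenuated while slower, deliberate height changes (crouching, standing) still pass through.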
A strength of the current design is that the user can stop fairly quickly, with his feet apart by any degree. It’s a model-based motion system, driven moment-by-moment by the user’s actions, rather than a predefined animation. We even up the motion of the pelvis by flexing the legs. Supporting continuous variations in course & postural height adds complexity. I’d like to make the leg motion more graceful, but there are other things to work on.
"mptp" wrote:
Also, somehow it needs to be made possible to turn using the feet only, since the whole point of locomotion controls (in my opinion) is that they free the hands for natural input.
Don’t underestimate the value of positional control over locomotion. Using a rate-based control is like moving through the world on a Segway. Unfortunately, conventional game control APIs only support rate controlled motion (speed versus displacement).
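The distinction can be shown in a few lines. In this illustrative sketch (my own, not any real game API), rate control integrates a commanded speed over time, while positional control binds the avatar's displacement directly to the device's displacement:

```python
# Illustrative contrast between rate control (what conventional game
# control APIs expose) and positional control (the Pointman approach).
DT = 1 / 60.0  # 60 Hz simulation tick

def rate_control_step(position: float, stick_deflection: float,
                      max_speed: float = 2.0) -> float:
    """Segway-style: stick deflection commands a *speed*, which is
    integrated each tick; the avatar coasts as long as the stick is held."""
    return position + stick_deflection * max_speed * DT

def positional_control_step(pedal_displacement: float,
                            stride_scale: float = 1.0) -> float:
    """Pointman-style: the pedal's *displacement* maps directly to the
    avatar's displacement, so stopping the feet stops the avatar."""
    return pedal_displacement * stride_scale
```

With rate control, where the avatar ends up depends on how long the input was held; with positional control, the avatar's pose is a direct function of where the user's body is right now.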
Before we developed Pointman, we developed a very effective seated locomotion control that is entirely foot based. We call it ‘Strider’: http://www.google.com/patents/US7542040
The gestural motions of stroking and swinging the feet include translational and rotating movements chosen to correspond to the actions of the feet and legs during natural locomotion. If you imagine a large sheet of paper under your feet that you can slide or turn about by moving it with your feet, and with your avatar moving at twice the displacement of your physical body relative to the sheet - you’ll get the idea. Unfortunately, a number of people found it uncomfortable to use because their ankles were not flexible enough.
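The sheet-of-paper analogy can be sketched as a frame-by-frame pose update. The 2x gain comes from the description above; the function and variable names are my own invention, and applying the same gain to rotation is an assumption:

```python
# Sketch of the "sheet of paper" mapping behind Strider. Sliding or
# turning the imaginary sheet under your feet moves/turns the avatar
# by twice the relative displacement (gain of 2, per the post above;
# applying the gain to rotation as well is my assumption).
import math

def strider_step(avatar_x, avatar_y, avatar_yaw,
                 foot_dx, foot_dy, foot_dyaw, gain=2.0):
    """Apply one frame of foot motion (given in body-relative
    coordinates) to the avatar pose, scaled by `gain`."""
    # Rotate the body-relative foot stroke into world coordinates
    c, s = math.cos(avatar_yaw), math.sin(avatar_yaw)
    world_dx = c * foot_dx - s * foot_dy
    world_dy = s * foot_dx + c * foot_dy
    return (avatar_x + gain * world_dx,
            avatar_y + gain * world_dy,
            avatar_yaw + gain * foot_dyaw)
```

Because the update is purely positional, freezing the feet freezes the avatar, in contrast to a rate-based scheme.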
Pointman was designed to be easy and effective to use from the start, because Marines do not have a lot of time to train to use a new simulation technology. It serves as a transitional user interface. Most Marines do not like the foot pedals until they start to realize the additional control it gives them (in 15 minutes to an hour). Adding too many foot-based control actions at once would not have been accepted.
If you’re willing to sacrifice control over postural height (crouching & high prone) then you could apply the up down motion of the pedals for turning. If the pedals slid independently, could twist, move laterally, or sense additional directional forces, you could apply these extra DoFs to turning. The trick is to keep it simple & ‘natural’.
Marines need to move ‘tactically’. We concluded that this means they need to independently control their heading and course. Marines usually hold a weapon in their hands while moving tactically, so the requirement for hands-free operation was less of an issue than ease of use. Thus we opted to use a familiar set of gamepad thumbsticks rather than increasing the complexity of the foot controls.
-JimT
- mptp (Explorer): Hey, Strider is pretty much exactly the idea I've been thinking of for the past little while for locomotion control in VR! I'm most distressed that people found it hard to use - but wouldn't you just map the ankle rotations performed as gestural input to degrees of body rotation over the duration of that gait? I can't imagine it would require any more flexibility than ordinary walking. :S
And yeah, I don't doubt that Pointman was/is ideal for its current application; those were just some musings on how it could be improved to better suit consumer VR. :)
- JimT (Honored Guest):
"mptp" wrote:
Hey Strider is pretty much exactly the idea I've been thinking of for the past little while for locomotion control in VR!
It’s a really elegant approach. It remains my personal favorite, but I have flexible ankles.
"mptp" wrote:
I'm most distressed that people found it hard to use - but wouldn't you just map the ankle rotations performed as gestural input to degrees of body rotation over the duration of that gait? I can't imagine it would require any more flexibility than ordinary walking.
Not “over the duration of that gait” but moment-by-moment.
– That’s the beauty of positional controls.
The problem stems from having your legs bent when seated in a chair. When you stand and turn your pelvis w.r.t. the support foot, the twist is distributed all through your leg. When your knee is flexed at around 90 degrees, it constrains how far your foot can turn about the axis of the lower leg. (It reminds me of how you set up joint locks in some Aikido techniques.)
Now you could scale the user’s ankle rotations, but the people with stiff ankles don’t seem to have great fine motor control over turning their feet. The tradeoffs associated with adjusting the ‘Control-to-Display Ratio’ have to be reckoned with.
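A toy version of that scaling tradeoff, in my own illustrative terms (the gain and limit values are made up): a higher control-to-display gain demands less ankle range but amplifies every tremor in the user's fine motor control.

```python
# Toy sketch of adjusting the control-to-display ratio: limited
# physical ankle rotation is clamped, then scaled up to a larger
# avatar turn. Gain and limit values here are arbitrary examples.
def avatar_turn(ankle_deg: float, cd_gain: float = 3.0,
                ankle_limit_deg: float = 30.0) -> float:
    """Clamp the physical ankle rotation to the user's comfortable
    range, then scale it by the control-to-display gain."""
    clamped = max(-ankle_limit_deg, min(ankle_limit_deg, ankle_deg))
    return cd_gain * clamped
```

Any jitter in `ankle_deg` is multiplied by the same gain, which is exactly the fine-motor-control problem described above.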
As it turns out, some people have difficulty merely flexing their foot more (and less) than 90 degrees away from the lower leg, and they have trouble simply stroking the floor. In modern society, people who aren't engaged in specialized sports don't need to move their ankles very much. If only we could get them using an interface that encouraged more foot agility... but that's a chicken-and-egg proposition.
-JimT
- Shannonb1 (Explorer): Perhaps a platter with a pot or Hall-effect sensor under the base would allow twisting lightly to adjust your facing direction. This would be somewhat immersive, as your feet would control the body's facing direction.