Forum Discussion
StuPer
12 years ago · Honored Guest
The movement conundrum
So I've been watching the recent Oculus videos from GDC etc. and would like to post some thoughts/observations here on a couple of the issues highlighted.
Creating a realistic sense of movement when moving around in VR.
I hope I'm not teaching anyone to suck eggs when I say this, but my thought would be to look at the biomechanics of the human body and then apply them to in-game locomotion with some tweaking.
What do I mean? Well, when the human body is in motion the head isn't static, as we all know: it rolls slightly, it moves from side to side, and it goes up and down. How much it does so depends on the individual's speed, height, posture and, to a lesser extent, weight. Would it not be possible, using say a treadmill, to track these movements at various speeds and in transition from one speed to another (walking to running etc.), then map them into a movement mechanic? (Just to be clear, I'm not suggesting each user has to capture this data individually; rather, a generic data set could be created from an 'average' human and then adjusted to suit an individual user's measurements.)
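To make that a little more concrete, here is a minimal sketch (in Python, just to show the idea) of how such a captured data set might drive an in-game head offset. Everything in it is an assumption on my part: the REFERENCE_GAITS numbers are placeholders standing in for real treadmill data, the blending is simple linear interpolation, and scaling by height alone is obviously crude.

```python
import math

# Hypothetical 'average human' bob amplitudes captured on a treadmill, keyed by
# speed in m/s: (vertical metres, lateral metres, roll radians). Placeholder values.
REFERENCE_GAITS = {
    0.0: (0.000, 0.000, 0.000),   # standing still
    1.4: (0.020, 0.012, 0.015),   # walking
    3.5: (0.045, 0.020, 0.030),   # running
}
REFERENCE_HEIGHT_M = 1.75  # height of the 'average' capture subject

def head_offset(speed, phase, user_height_m):
    """Blend between captured gait amplitudes by speed, scale to the user's
    height, and return a (vertical, lateral, roll) head offset for this frame."""
    speed = max(0.0, speed)
    keys = sorted(REFERENCE_GAITS)
    lo = max(k for k in keys if k <= speed)
    hi = min((k for k in keys if k >= speed), default=keys[-1])
    t = 0.0 if hi == lo else (speed - lo) / (hi - lo)
    blended = [a + t * (b - a)
               for a, b in zip(REFERENCE_GAITS[lo], REFERENCE_GAITS[hi])]
    scale = user_height_m / REFERENCE_HEIGHT_M   # crude per-user adjustment
    vert, lat, roll = (v * scale for v in blended)
    # The vertical bounce happens once per step, the sway/roll once per stride,
    # so the vertical term runs at twice the phase frequency.
    return (vert * math.sin(2.0 * phase),
            lat * math.sin(phase),
            roll * math.sin(phase))
```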
Now, that in itself I don't think would be enough. As was pointed out in the talks, the human body has tricks that allow it to maintain steady vision even when moving along many vectors at once (though I believe it is much better at horizontal and vertical correction than lateral). I think the key here is that even with these tricks the body does not maintain a totally rock-solid visual field; it applies the dampeners in all states, but with varying levels of success. So my proposition would be to create the mechanical model of movement described above and then apply an artificial dampener that simulates our human reactions.
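Here is a matching sketch of the kind of artificial dampener I mean, and again every constant is a guess to be tuned rather than a measured value: it cancels most, but deliberately not all, of the offset produced above, and smooths what is left.

```python
def dampened_offset(raw_offset, previous_shown, stabilisation=0.85, dt=1.0 / 90.0):
    """Artificial 'dampener' applied on top of the raw head-bob offset.

    stabilisation is the fraction of the raw motion the simulated reflex
    cancels (0 = show the full bob, 1 = perfectly stable view); the leftover
    residual is then low-pass filtered so that what reaches the eyes is small
    and smooth rather than rock solid.  Both numbers are guesses, not data.
    """
    residual = [(1.0 - stabilisation) * v for v in raw_offset]
    alpha = min(1.0, dt / 0.05)   # ~50 ms smoothing time constant (another guess)
    return [p + alpha * (r - p) for p, r in zip(previous_shown, residual)]
```

Each frame you would feed the output of head_offset() through this before applying it to the camera, carrying the returned value over as previous_shown for the next frame.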
Done incorrectly, I think this would break immersion, not to say presence, but if the balance were right and the sweet spot found, I think it would 'lock us in' to the experience far better than current attempts. To implement this across multiple experiences, a person would probably need to enter the relevant data at the same time as they set their IPD, so developers could call on that data once instead of each experience requiring its own user configuration.
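Something like a single shared profile, written once by the config utility and merely read by each experience, is all I'm imagining; the fields and names below are purely illustrative, not any real API.

```python
from dataclasses import dataclass

@dataclass
class LocomotionProfile:
    """Hypothetical per-user data entered once, alongside the IPD measurement,
    and exposed to every experience so no app needs its own calibration step."""
    ipd_mm: float          # already collected by the existing config step
    height_m: float
    weight_kg: float
    build: str             # e.g. 'skinny' / 'medium' / 'large'
    stabilisation: float   # preferred dampener strength, 0..1
```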
After all, the reason current seated experiences work so well is that the brain is not expecting any movement other than head/torso, allowing us to more easily suspend our disbelief.
Thoughts anyone?
The other issue I had thoughts about is the idea that your body will not accept movement cues in VR if it can sense it is sitting down: bum in chair, feet on the ground, etc.
I have an injured spine; thanks to training and physiotherapy it is largely under control, but when it wasn't I used a recliner in front of my computer: I tilted the monitor down a little and used the keyboard on my lap, with the mouse on a small table at the side of the chair. In this position, which was almost supine, I had little pain, and my weight was distributed along the length of my body rather than vertically down my spine. The point being: would it be possible to engage in VR in a supine position, using the movement mechanic I mentioned at the start, and disengage the lizard brain's monitoring enough to again suspend disbelief?
Thoughts?
I'm just putting this out there and chose the General Development forum as you guys are the ones who likely have answers or could figure out a way to achieve it.
I realise that on one hand I'm suggesting people need recliners to enjoy good VR immersion, but we are at a stage where we haven't cracked this movement conundrum, so anything is worth mentioning, I figure.
Lastly, a question re: Dev Kit 2: does anyone know whether placing the camera slightly above your head (a foot or more) would help get near-360-degree positional tracking? Or would it throw the positional tracking off if the camera is not at eye level?
Regards
StuPer
10 Replies
- StuPer (Honored Guest): I'm a little disappointed this topic has not had a reply; did I miss something? I'd have thought this was something people would be keen to address.
I obviously cannot speak for anyone else, but I find exploring VR worlds 'on foot' as though I'm on roller skates or in a Segway really immersion-breaking, and had thought simulated walking/running would help engagement hugely.
- Boff (Explorer): Well, I didn't reply because I didn't really have anything to add to the conversation.
In the simplest terms, are you talking about simulating the effects of moving (e.g. head bob)?
As to current implementations, unless you have a treadmill to use there's always going to be a disconnect when a character moves on foot and you don't (physically), no matter how well it is implemented. Even if you were in a flotation tank I think you'd still feel that disconnect.
Unless developers make everything a seated experience (cockpit, car, wheelchair, etc.), as you have mentioned, I can't see how this can be resolved. Once the treadmills and body trackers are in our hands, maybe we'll have a better solution.
- cybereality (Grand Champion): I think people get scared when they see huge walls of text and keep moving. If you can express your idea more concisely, and with better paragraph spacing, more people would reply.
- StuPer (Honored Guest): Hey Boff, thanks for the reply. I understand that there is a sense of dislocation when you're sitting down but the on-screen character is running around. My argument is that the current 'floating' experience is even worse than the 'head bob'.
I'm not asking people to simply put the 'head bob' back in, but to look at the actual physical mechanics of what happens to the head when we transition from standing still to walking to running and back again, and then to simulate that in-game. As I said in the original post, though, I don't think that alone would be enough: as was pointed out in the recent talks, we apply internal dampers to maintain a relatively steady image (but not a rock-solid one), and this is what I think we need to emulate in-game. If done correctly, I think this would allow a suspension of disbelief so that, even though we are sat down, we still experience the sensation of movement.
I do understand that haptic devices like treadmills and body trackers will be far, far better at this, but even so, not everyone wants to run around in these devices, or is capable of doing so. In fact, I would go so far as to say that most people will not fork out for those devices but will want to continue to experience their VR sitting down.
...and fair comment, Cyber.
- Sqorck (Honored Guest): I would say there is really no way to know how well this would work unless you try it. That being said, I think Boff is right about there always being a noticeable disconnect when your visual cues are simulating motion that your body is not experiencing.
You could of course make it very subtle and highly polished as far as the accuracy of the simulated motion goes, but in the end it's going to be the same odd feeling you get when your in-game avatar moves their body and you didn't.
- StuPer (Honored Guest): Hi Sqorck,
I completely agree that there will always be some disconnect, but as I said, it has to be better than the current floating-ghost scenario.
As for how this could be achieved, well, I've had some thoughts:
1. Take 2-3 volunteers of different body builds (skinny, medium, and large) and fit each with a head strap carrying 5 motion-capture markers: front, back, both sides, and top.
2. Get some good eye-tracking cameras and, obviously, a motion-capture camera.
3. Get each volunteer to transition from standing still to walking to running and back. Also get them to go straight from standing still to running, and back.
4. Measure the head movement for each subject and how the eyes responded to that movement.
Then comes the tricky part: subtracting the percentage of the head movement that the eyes eliminate.
I don't think the initial figure would need to be exact, because a little experimentation with the same volunteers would probably get the correct figure.
Then I guess you could simply offer 3 settings for users to pick from, or create a scale based on weight that adjusts to the user's input.
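For what it's worth, here's a rough sketch of how that 'eliminated' percentage might be pulled out of the capture data and turned into per-build presets. The inputs are hypothetical series from the sessions above, and the result could plug straight into a dampener setting like the one I described in the first post.

```python
import statistics

def stabilisation_fraction(head_motion, retinal_motion):
    """Estimate the fraction of head movement the eyes cancel out.

    head_motion:    per-frame head displacement from the mocap markers
    retinal_motion: per-frame residual image motion from the eye tracker
    (both hypothetical series from the capture sessions described above)
    """
    head_amp = statistics.pstdev(head_motion)
    if head_amp == 0.0:
        return 0.0
    residual_amp = statistics.pstdev(retinal_motion)
    return max(0.0, min(1.0, 1.0 - residual_amp / head_amp))

def build_presets(captures):
    """captures maps a build name ('skinny', 'medium', 'large') to its
    (head_motion, retinal_motion) series; returns one dampener value each."""
    return {build: stabilisation_fraction(head, eye)
            for build, (head, eye) in captures.items()}
```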
Possible?
Regards
StuPer
- Sqorck (Honored Guest): I would say it would be possible, but I am not sure the effort would be worth the result. You always have to keep in mind what is going to be a good use of the precious development time you have, and in my opinion there are areas that could have a greater impact with less work. That said, I do like bringing ideas like these to the table, discussing them, and at least talking about the concept.
- Tgaud (Honored Guest): As you can't stimulate the inner ear, if you want to avoid motion sickness you have to avoid movements that usually make the inner ear react.
That's why sliding is well tolerated.
Head bob is not, and creates sickness.
- StuPer (Honored Guest): Ahh, ok. I guess I may have my 'rift legs' then, as I can play some of the older FPS games with head bob and not get any motion sickness, though the head bob itself is very unrealistic and of course the player moves far too fast. I have also heard people complain of nausea with strafing, but again I'm fine.
Maybe I'm just asking for a solution that feels right and is aesthetically pleasing to me but would harm the majority.
Thanks for the replies, I'll just put this on the back burner then, or at least until most people have learned to tolerate that dislocation between perceived motion and reality.
Regards
StuPer
- StuPer (Honored Guest): I had another 'moment' in the shower this morning, considering what may be the reason why some people are susceptible to motion sickness, to varying degrees, and others are not.
Now, it may or may not be relevant, but I wondered whether having the nose in your peripheral vision contributes to this in any way: it's always there, offering a stable point of reference in your field of view regardless of what you are doing.
By extension, would offering a stable 'nose' in-game reduce motion sickness in those who are susceptible?
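If anyone wanted to try it, the mechanics seem simple enough: the 'nose' is just drawn at a fixed offset in head-local space, so it stays put on screen no matter how the in-game body moves. A minimal, engine-agnostic sketch, where the quaternion helper and the offset numbers are my own assumptions rather than any real API:

```python
NOSE_LOCAL_OFFSET = (0.0, -0.03, -0.06)  # a few cm below and in front of the eyes

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (q_vec x v), then v' = v + w*t + q_vec x t
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def nose_world_position(head_position, head_orientation):
    """Where to render the nose this frame: the head pose plus the fixed
    head-local offset, so it never moves relative to the player's view."""
    ox, oy, oz = rotate(head_orientation, NOSE_LOCAL_OFFSET)
    hx, hy, hz = head_position
    return (hx + ox, hy + oy, hz + oz)
```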
I'm just throwing this out there; as M. Abrash said, there is so much we don't know, so maybe one of these left-field things will be useful.
Regards
StuPer