Forum Discussion
ziphnor (Expert Protege)
13 years ago
Unified framework for handling motion input?
Hi,
While I am very excited about the Rift, I am also a bit worried about how new first-person, avatar-based games will be able to incorporate various kinds of motion tracking in a consistent manner. From trying the Rift, my impression is that the sense of your body being there adds tremendously to the immersion, but only in a limited way if it doesn't match up with your own movement.
For example, right now we have the Rift with its rotational head tracking, as well as the Hydra (usually used for tracking hand movements). If I created a new game right now (I won't, because while I am a developer, I am not a game developer, though the Oculus makes me want to be ;)), I would probably add explicit support for the head tracker and the Hydra. But what if someone adds a Kinect to the mix to track walking, rotational tracking of the body (as in walking direction != view direction), or some sort of ODT (the Omni etc.) that might provide detailed tracking of foot movement? Each game will have to support this sort of thing individually. That includes merging the tracking information, supporting each individual device, and coming up with a control scheme for interpreting the tracked data (i.e. are the Hydras being used for both hands, or one for the body and one for an arm, etc.).
In general, I am worried that we will get a lot of games supporting, for example, the Hydra, and then someone comes up with a cool new way of doing motion tracking and none of the existing games support it. It would be nice to route this through a framework, so that new tracking information can be added and consumed easily.
This is already an issue now: games might support aim and look independently, but since there isn't really a good way of tracking walking direction, they might not support an independent movement direction as well, even though it's not that tricky from an implementation point of view.
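To illustrate how small that implementation step is, here is a minimal sketch (purely hypothetical names; no real engine API is assumed) where view, aim, and walk directions each come from a separate yaw angle, e.g. head tracker, hand tracker, and torso tracker:

```python
import math

# Hypothetical sketch: decoupled view / walk directions.
# Each direction is derived from a yaw angle reported by a
# different (imagined) tracker: head for view, torso for walking.

def ground_direction(yaw_radians):
    """Unit vector on the ground plane for a given yaw angle."""
    return (math.sin(yaw_radians), math.cos(yaw_radians))

head_yaw = math.radians(90)   # player looks to the right
torso_yaw = math.radians(0)   # body still faces straight ahead

view_dir = ground_direction(head_yaw)
walk_dir = ground_direction(torso_yaw)

# Pressing "forward" moves along walk_dir (the torso), so the
# player keeps walking straight ahead even while looking sideways.
```

If no torso tracker is present, `torso_yaw` can simply fall back to `head_yaw` and the game behaves exactly like a conventional look-equals-walk FPS.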
From the game's point of view, I would argue that it really just wants someone to give it a representation of the position of the player's body (a skeleton, or possibly more than that), without caring how much of it is really being motion tracked, or how. I.e. if it so happens that the player has a means to track his feet/hands, then fine; if not, that is also fine, with the movement just being emulated from keyboard input by the framework.
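That idea can be sketched as a tiny device-agnostic layer: tracking sources register themselves, a merge step fuses whatever they report, and the game only ever sees the resulting skeleton. All names here (`PoseSource`, `MotionFramework`, etc.) are invented for illustration, not an existing API:

```python
# Hypothetical sketch of a device-agnostic motion input layer.
# The game consumes only BodyPose; it never knows (or cares)
# which devices produced which joints.

class PoseSource:
    """One tracking device (Hydra, Kinect, keyboard fallback, ...)."""
    def read(self):
        """Return {joint_name: (x, y, z)} for the joints this source tracks."""
        raise NotImplementedError

class KeyboardFallback(PoseSource):
    """Emulates body position from keyboard input when no tracker exists."""
    def read(self):
        # A real version would integrate WASD input; here we stand still.
        return {"pelvis": (0.0, 1.0, 0.0)}

class BodyPose:
    """The skeleton handed to the game: joint name -> position."""
    def __init__(self, joints):
        self.joints = joints

class MotionFramework:
    """Merges all registered sources into one BodyPose.
    Sources registered later (more specific devices) override the fallback."""
    def __init__(self):
        self.sources = []
    def register(self, source):
        self.sources.append(source)
    def poll(self):
        joints = {}
        for source in self.sources:  # later sources win on conflicts
            joints.update(source.read())
        return BodyPose(joints)

fw = MotionFramework()
fw.register(KeyboardFallback())  # a Kinect/Hydra source would go here too
pose = fw.poll()
```

A new tracking device then only has to implement `PoseSource`; every game built against the framework picks it up for free, which is exactly the decoupling argued for above.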
Of course, additional tracked items like weapons could also be supported, but those will potentially be more specific to the type of game, and one might consider certain body-position shortcuts, like crouching, that don't necessarily require full motion tracking.
So, is there any such framework (I am sure I am not the first one to consider this)? I noticed OpenNI/NiTE, but I don't know if it is general enough, or if it is too tied to the concept of Kinect-like cameras. Or perhaps it is a stupid idea altogether because it provides too much of an abstraction? Or perhaps Oculus will just come up with its own full motion tracking and integrate support for it in their SDK?
5 Replies
- jherico (Adventurer): Khronos (the organization behind OpenGL and other open APIs) is working on something called StreamInput, which appears to be an open API for what you describe, or at least a stepping stone on the way there.
  OpenNI is actually a PrimeSense-driven thing, hence its focus on the Kinect and PrimeSense cameras. SoftKinetic has iisu, a competing platform, but is also a member of the Khronos group working on StreamInput.
- ziphnor (Expert Protege):
"jherico" wrote:
Khronos (the organization behind OpenGL and other open APIs) is working on something called StreamInput, which appears to be an open API for what you describe, or at least a stepping stone on the way there.
Excellent, that looks to be almost exactly what I am talking about (they specifically mention positional sensor fusion and gesture and motion detection), and I see a lot of prominent logos there (assuming a logo means they are part of the group), like Razer, PrimeSense and NVIDIA. It sounds like it's still very early days for it, but that kind of makes sense, I guess; it's only recently that it has really become relevant. No Oculus logo though, but hopefully they will support this kind of initiative as well.
Thank you for pointing it out; I am a little bit more optimistic about the future of motion-tracked gaming now :)
- ziphnor (Expert Protege):
"Tbone" wrote:
There is an ongoing discussion here about the best unified positional tracking scheme for Oculus to support. I agree that there should be a unified standard or, at the very least, a unified framework.
Thanks, I hadn't noticed that the thread was actually about a general framework; I thought it was just about hand-tracking hardware/software from Oculus :oops: Sorry, I didn't mean to hijack the topic from that thread.
However, I am very much interested in how it applies beyond just tracking hands :) (I am really looking forward to running around in a virtual world).
- Tbone (Protege): No problem. We're kind of both asking for the same thing in different ways. My argument was that positional tracking of the head and hands should be tied together for both calibration and development purposes. I believe your argument is that there should be a framework so that any motion input device will be compatible with Rift games, which I also agree with.
Ideally your head, hands, feet, or whatever will all be tracked using the same method, and that needs to be standardized by Oculus.