Unified Head and Hand Oculus Positional Tracking

Tbone
Protege
First off, let me say I have no idea HOW this could work. I just have an idea that someone should figure out!

Oculus is working hard on positional tracking for the head, but should they be focusing on hands at the same time? Right now there are all of these different ideas for controllers/gloves/motion tracking. What if you could use any controller or device you want and STILL have positional tracking for your hands that is ALSO relative to your head?

I got to try the Rift with the Hydra, and one frustrating part of that experience was that my virtual hands in the Tuscany demo were not exactly where my real-life hands were. I'm sure there are several reasons why this is the case, but the one I can think of is that the Rift wasn't being positionally tracked, so where my head is located relative to my hands didn't match up. If the same positional tracking were used for the Rift and my hands, however, I imagine it would be very easy to match the virtual hands up exactly where they are supposed to be in relation to my head. But how do you do this and still allow the use of any controller? My idea is positional wristbands.

You could put a wristband on each wrist, making sure that the logo lines up with the center of your wrist. These could be wireless, or there could be long wires that plug directly into the Rift. Assuming that whatever positional tracking solution is used for the Rift could also work on these bands, you could track the head and both hands in virtual space and in relation to each other. The bands could even track rotation and other movements of the hand. The best part is that this still leaves the hands free to use the input device of your/the game's choice.
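
To make the "in relation to each other" part concrete, here's a rough sketch (plain Python with made-up tracker readings, not any real Oculus or wristband API) of what a game could do once the head and both wrists come from the same tracker: re-express each wrist position in the head's reference frame, which is what keeps the virtual hands lined up with the real ones.

```
# Sketch only: hypothetical readings from a single shared positional tracker.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def wrist_in_head_frame(head_pos, head_quat, wrist_pos):
    """Transform a tracker-space wrist position into head-relative coordinates."""
    r_head = quat_to_matrix(head_quat)
    return r_head.T @ (np.asarray(wrist_pos) - np.asarray(head_pos))

# Hypothetical poses, all reported in the same tracker coordinate system (metres).
head_pos  = (0.0, 1.6, 0.0)
head_quat = (1.0, 0.0, 0.0, 0.0)      # identity orientation
left_wrist = (-0.3, 1.2, -0.4)
print(wrist_in_head_frame(head_pos, head_quat, left_wrist))
```

Because both measurements share one coordinate system, any common error in the tracker's own frame drops out of the head-to-hand relationship, which is exactly the mismatch I felt in the Tuscany demo.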

Using this method, you could aim with an Xbox controller. You could have a game that combines the Leap Motion and Oculus' positional tracking. You could still use the Hydra. You could use a glove. Or you could just use the bands (if it were, say, a boxing game). It's about creating an Oculus standard - all Rift-compatible games can take advantage of positional tracking for head and hands, BUT it still allows the flexibility for a game to choose different input devices (you could even play with mouse and keyboard and just fling your hand around for specific commands or to point at menu options).

Perhaps this idea has already been floating around. Or perhaps the next post will tell me why it's not possible. It's just an idea that, for all the daydreaming about the Rift I do, I hadn't thought of before, so I thought I would share. Thoughts?
59 REPLIES

drash
Heroic Explorer
No idea as to feasibility, but I do like the idea of wristbands. Simple, and it stays out of your way and lets the user decide what to actually hold in their hands, if anything.

jwilkins
Explorer
Would be interesting to use the LEAP as a means of occasionally, almost accidentally, calibrating the position of your hands when using the Hydra.
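
Roughly like this, maybe (a sketch with made-up numbers, not the real Leap or Hydra APIs): keep a running offset on the Hydra reading, and whenever the Leap happens to see the hand, nudge that offset toward the optically observed position.

```
# Sketch only: "hydra_pos" and "leap_pos" stand in for readings from real drivers.
import numpy as np

class HandCalibrator:
    def __init__(self, blend=0.1):
        self.offset = np.zeros(3)   # correction added to raw Hydra positions
        self.blend = blend          # how strongly a Leap sighting pulls the offset

    def correct(self, hydra_pos, leap_pos=None):
        hydra_pos = np.asarray(hydra_pos, dtype=float)
        if leap_pos is not None:
            # The Leap saw the hand: move the offset toward the observed error.
            error = np.asarray(leap_pos, dtype=float) - (hydra_pos + self.offset)
            self.offset += self.blend * error
        return hydra_pos + self.offset

cal = HandCalibrator()
print(cal.correct([0.10, 0.00, -0.40], leap_pos=[0.12, 0.01, -0.41]))  # Leap sees the hand
print(cal.correct([0.11, 0.00, -0.39]))  # Leap lost sight; keep using the stored offset
```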

eshan
Honored Guest
I do think your head and hands need to be tracked against each other, or things go out of whack. In my recent testing, I attached one Hydra wand to my head and held one in my hand, and the hand did seem to be mapped 1:1 to where I expected it to be. I have a video here if you're curious: http://www.youtube.com/watch?v=ko5OdLE2Msw

jwilkins
Explorer
I tried the Hydra/Tuscany demo after reading this thread and I had a lot of problems. Maybe I just didn't notice it in non-VR games, but maybe there is too much metal around my Hydra for it to be precise enough for VR. My impression was that the demo, to put it mildly, "sucked."

Combining some kind of visual/IR tracking with the Hydra would definitely improve this, because the fusion would also compensate for any distortions of the magnetic field.
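
Something as simple as a complementary filter might be enough for that fusion. This is just a sketch with assumed position readings (not real driver code): the Hydra supplies smooth frame-to-frame motion, while the optical/IR fix slowly pulls the estimate back toward an undistorted absolute position.

```
# Sketch only: fuse a fast-but-warpable magnetic reading with a slower absolute one.
import numpy as np

def fuse(prev_fused, prev_hydra, hydra, optical, alpha=0.98):
    """alpha near 1.0 trusts the Hydra's motion; (1 - alpha) corrects toward optical."""
    prev_fused, prev_hydra = np.asarray(prev_fused), np.asarray(prev_hydra)
    hydra, optical = np.asarray(hydra), np.asarray(optical)
    dead_reckoned = prev_fused + (hydra - prev_hydra)   # propagate by the Hydra delta
    return alpha * dead_reckoned + (1 - alpha) * optical

# One filter step with made-up positions (metres).
fused = fuse([0.0, 1.2, -0.3], [0.0, 1.2, -0.3], [0.01, 1.2, -0.3], [0.02, 1.21, -0.31])
print(fused)
```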

barath
Honored Guest
Considering the next Kinect is going to be so accurate that it will be able to read lips, track all ten fingers, and even tell your heart rate, I think they should focus on that. Then they can get your entire body into the game and perfectly track your every movement. Imagine reaching out with your hand and just grabbing something instead of holding the Hydra, or just jumping when you want to jump in the game. I think the Kinect 2 should be the main focus for body interactions within a game.

jwilkins
Explorer
"barath" wrote:
Considering the next Kinect is going to be so accurate that it will be able to read lips, track all ten fingers, and even tell your heart rate, I think they should focus on that. Then they can get your entire body into the game and perfectly track your every movement. Imagine reaching out with your hand and just grabbing something instead of holding the Hydra, or just jumping when you want to jump in the game. I think the Kinect 2 should be the main focus for body interactions within a game.


It is pretty advanced, but I think you are overselling it. From what I've seen it can track wrist motion but not your individual fingers. I'm not ruling out that they could patch it to do even that, but I don't think the hardware is capable of reading your fingers if you are sitting back on the couch.

barath
Honored Guest
"jwilkins" wrote:
"barath" wrote:
Considering the next Kinect is going to be so accurate that it will be able to read lips, track all ten fingers, and even tell your heart rate, I think they should focus on that. Then they can get your entire body into the game and perfectly track your every movement. Imagine reaching out with your hand and just grabbing something instead of holding the Hydra, or just jumping when you want to jump in the game. I think the Kinect 2 should be the main focus for body interactions within a game.


It is pretty advanced, but I think you are overselling it. From what I've seen it can track wrist motion but not your individual fingers. I'm not ruling out that they could patch it to do even that, but I don't think the hardware is capable of reading your fingers if you are sitting back on the couch.



Even the first Kinect can track individual fingers with the right programming: https://www.youtube.com/watch?v=tlLschoMhuE&feature=player_embedded

Tbone
Protege
"eshan" wrote:
I do think your head and hands need to be tracked against each other, or things go out of whack. In my recent testing, I attached one Hydra wand to my head and held one in my hand, and the hand did seem to be mapped 1:1 to where I expected it to be. I have a video here if you're curious: http://www.youtube.com/watch?v=ko5OdLE2Msw

That's good to know. So the question is, what's the best way to combine tracking for head and hands? I like the wristbands because they free you up to use multiple input devices, and they don't force you to wear a glove. But I also like the idea of a unified Oculus controller as well - it just limits what the devs can do.

I guess it depends on what positional tracking solution Oculus is working on. For instance, if they are using a camera on the Rift for tracking, then I don't know that that system will work for wristbands as well. Whatever the system is, it has to have low enough latency for your head. What system can do that?

And as for the Kinect, it isn't 360 degrees, so it's already out by default. The Kinect would have to have X-ray vision or something to track you when you're facing AWAY from the camera. Or you'd have to mount it directly above you on the ceiling, which is unrealistic. *waves hand* The Kinect is not the solution you're looking for.

jwilkins
Explorer
"barath" wrote:
Even the first Kinect can track individual fingers with the right programming: https://www.youtube.com/watch?v=tlLschoMhuE&feature=player_embedded


I'm assuming it can track fingers if the hand is close. My point is that if you want full-body tracking, you have to trade off tracking individual fingers unless you up the resolution, and the Kinect 2 is still not good enough to do that.

Also, occlusion.