Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
ElazarGer
Honored Guest
12 years ago

VR Interface with handsfree input

Hey,

Is anybody trying to implement a virtual interface or HUD with the Rift in combination with a Leap controller? For example, something similar to Mass Effect: you look at your arm in-game and get an interface projected onto it. With the Leap you track your other hand and show it in-game, too. Or something similar with a floating interface, like a keyboard in the air. That would be awesome for games.

When I receive my dev kit I will give it a try. But I am sure others are more capable of doing that.

5 Replies

  • I'm planning to implement that in the first game I'll be working on, but it will be a combination of a game controller with the Leap Motion (you'll use the Leap to manipulate elements around you and select targets on screen, and use the gamepad's sticks and buttons for other actions).
  • Anonymous
    Not exactly sure why you combine the Leap Motion with "hands-free". It is for use with your hands.

    Anyway, I played with the Leap, and one thing you notice immediately is that without something like the Oculus, you can't tell where your hands are in the virtual space unless you are close to something projecting their shadows as a point of reference.

    Another thing to point out is that the skeletal tracking is not good enough to maintain a consistent understanding of your hand structure (which finger is which), and you can't turn your hand to the side or upside down, etc.
    Originally I was going to map the in-game character's fingers to those seen by the Leap, but I only got as far as faking it up to the wrists.

    As for the menu systems, I think HUDs and interfaces should be in 3D now. I prototyped all these basic concepts within a few days of getting my Leap Motion but stopped until I get my Oculus Rift. You can see it at http://www.youtube.com/watch?v=3d3nX9Jf ... B3f8CJAYeA

    There aren't many demos for the Oculus yet, so perhaps I'll put one together where you can move and see the character's hands (with or without the Leap Motion) and, as in my other video, also drive and fly around to see what that may be like. The only problem is that my framerate is pretty low due to unoptimized car details, physics, etc., so we'll see.
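The wrist-only mapping described in the reply above can be sketched roughly like this. Everything here is a hypothetical illustration, not the real Leap SDK: the function name, the offsets, and the mock palm coordinates are all assumptions. The real SDK delivers palm positions through its own Frame/Hand objects, in millimetres with the origin at the device and +z pointing toward the user.

```python
# Hedged sketch: mapping a Leap-space palm position (millimetres, origin at
# the device, y-up, +z toward the user) into game-world metres relative to
# the player camera. Offsets and axis conventions are assumptions.

MM_TO_M = 0.001

def leap_to_world(palm_mm, camera_pos, forward_offset=0.3):
    """Map a Leap-space palm position to a world-space wrist target.

    palm_mm        -- (x, y, z) palm position in mm from the tracker
    camera_pos     -- (x, y, z) of the player camera in world metres
    forward_offset -- assumed distance of the device in front of the camera
    """
    x, y, z = (c * MM_TO_M for c in palm_mm)
    return (camera_pos[0] + x,
            camera_pos[1] + y - 0.2,             # device assumed ~20 cm below eye level
            camera_pos[2] + forward_offset - z)  # flip z: Leap +z points at the user

# Example: palm 100 mm right, 150 mm above the device, 50 mm toward the user
wrist = leap_to_world((100.0, 150.0, 50.0), camera_pos=(0.0, 1.7, 0.0))
```

From the wrist target you could then drive the character's arm with inverse kinematics, which is roughly what "faking it up to the wrists" amounts to.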
  • Anonymous
    I have been thinking of trying something like that too, ever since I found out about the Leap Motion, though I'll have to wait until I can actually get my hands on one.
    I'd like to do some experimentation with it first, to see what I can do with it, like finding the boundaries of what it can pick up.
    Once I figure that out, I'll see if any of the ideas I've got can actually be done.
    Looking at that video harleycw linked, I've got a pretty good feeling about that though.
  • For typing on a keyboard in the air, in my experience the Leap Motion isn't there yet. Losing fingers or having phantom fingers is an issue, although much improved with the latest SDK. Hopefully they will get there. Having to constantly hover my hands over a stationary device without being able to rest them doesn't appeal to me either, but that may just be me.

    As for a device on your arm that you can look at and interact with, I don't think you'll be able to get away with anything other than having your arms face straight out from your body. And the Leap Motion wouldn't be able to track your other hand touching buttons on it. If you did want a virtual device on your arm that you could hold up and look at, you may be better off with a Razer Hydra. You still couldn't type with your fingers on the virtual device, but you would be able to look at it in various ways and use the buttons on the Hydra to interact with it.

    But don't get me wrong. I'm looking forward to seeing how people combine all of these devices and discover what works and what doesn't. That's why we're all here. :D

    - Dave
  • Sometimes I think I must be the only one who wants to stick a Leap, or a Leap-like device, to the face of the Rift in order to read your hands whilst interacting with what you're looking at. I'm not sure if the Leap can get a clear reading of your fingers from that angle, but if you could extrapolate the positions of the hands and arms, you could figure out the general position of the rest of the appendage off screen. Being able to look at a keypad on a wall in-game, reach out, see your hand, then type on it would be a huge step. The same goes for opening doors and drawers or anything else you would operate manually whilst looking at it. It would take the hidden object genre into whole new territories of experience. :twisted:

    :EDIT:

    Looking at the concepts for the consumer Rift (with 2 cameras + IR LEDs), this may very well be the route they wanna take! :mrgreen:
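The extrapolation idea in the last reply (guessing where the off-screen arm sits from what the tracker can see) could be sketched like so. This is a hypothetical illustration only: the segment lengths and the assumption that the forearm lies along the palm's pointing direction are crude guesses, not anything from the Leap SDK.

```python
# Hedged sketch of extrapolating unseen arm joints from a tracked palm:
# step backwards along the palm's pointing direction by assumed segment
# lengths to place the wrist, then the elbow.

def extrapolate_arm(palm_pos, palm_dir, hand_len=0.08, forearm_len=0.27):
    """Estimate wrist and elbow positions from a tracked palm.

    palm_pos -- (x, y, z) palm centre in metres
    palm_dir -- unit vector pointing from wrist toward the fingertips
    hand_len, forearm_len -- assumed segment lengths in metres
    """
    wrist = tuple(p - hand_len * d for p, d in zip(palm_pos, palm_dir))
    elbow = tuple(w - forearm_len * d for w, d in zip(wrist, palm_dir))
    return wrist, elbow

# Hand pointing straight forward (+z) with the palm at (0, 1.2, 0.4):
wrist, elbow = extrapolate_arm((0.0, 1.2, 0.4), (0.0, 0.0, 1.0))
```

Even a rough guess like this would be enough to render a plausible forearm behind a hand that the face-mounted sensor can actually see.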