Forum Discussion

Butters
Honored Guest
12 years ago

VR/physical hybrid interfaces + Leap Motion hand tracking

Hello all,

For my CAD data visualization VR workspace project, I need to use complex and precise interfaces like a 3D mouse (sometimes also called a "space mouse") inside a VR space. It gives much finer control than any motion controller available right now - unless maybe you go 2-3 orders of magnitude higher in cost (I still plan to use motion controls, but in conjunction with this more precise control device).
The discussion below actually could apply just as well to the use of a regular keyboard or any other comparable physical interface device while in a VR environment.

The problem arises when you try to access all these buttons while inside VR. It's two-fold:
1. You need to know where the meatspace interface is located while seeing only VR space.
2. You need to know where your own hand is located relative to that interface.

1. is relatively easy to solve: you put a VR replica of the physical interface at the same location relative to the camera (HMD). In the application I'm looking at, the user is seated and does not use head positional tracking, so it's easy to achieve - just enter the right coordinates for the VR model and you get a precise recreation of your input device.
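A minimal sketch of that setup, with hypothetical coordinates: since there is no positional head tracking, the HMD origin is fixed, so the replica's pose is just a constant offset measured once from the real desk layout.

```python
# Sketch: place a static VR replica of the desk device in a seated setup.
# All values here are hypothetical and would be measured for a real desk.

import numpy as np

# Measured offset of the physical 3D mouse from the seated head position,
# in metres (x = right, y = up, z = forward).
DEVICE_OFFSET = np.array([0.15, -0.45, 0.40])

def replica_world_position(hmd_origin):
    """World-space position for the VR replica of the physical device."""
    return np.asarray(hmd_origin, dtype=float) + DEVICE_OFFSET

# The replica never moves: head rotation only changes the view direction,
# not the origin, so the model stays aligned with the real device.
```

With the head origin fixed, this is a one-time calibration rather than per-frame tracking.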

2. is trickier. You do get direct haptic feedback from the physical device you are touching, but you will still fumble around for the right buttons because you can't really visualize where your fingers are. You can have the buttons on the VR device light up when they are pressed on the physical device and get a sense of where your hand is, but that's a posteriori feedback - what I want is to know it instinctively before pressing anything.
You don't want to wear sensors on your hand that would interfere with your use of the device. You also don't want a 3D model of your hand covering your view of the device itself (that's a problem in meatspace too: you can't see the buttons that are under your hand).
My tentative solution is the following: set up a Leap Motion device (or similar) directly above the input device, pointed down at your hand, and use the finger tracking data to display semi-transparent representations of your fingertips on top of the VR device.
If that works, you can get some interesting effects. For example, if I hover a finger over a specific button, a contextual menu could pop up showing me what that particular button does before I even touch it.
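A rough sketch of that hover idea, assuming a fixed, pre-measured transform from the Leap's frame into VR space and a hypothetical button layout (all coordinates and labels here are made up):

```python
# Sketch: map Leap fingertip positions into the VR scene and detect when a
# tip hovers over a button on the replica, to trigger a tooltip.

import numpy as np

# Fixed transform from the Leap's frame (mounted above the device) to VR
# world space: a rotation matrix plus a translation, measured once.
LEAP_TO_VR_ROT = np.eye(3)                   # placeholder: frames aligned
LEAP_TO_VR_POS = np.array([0.15, 0.30, 0.40])

# Button centres on the VR replica, keyed by the label to show as a tooltip.
BUTTONS = {
    "FIT":  np.array([0.12, 0.76, 0.38]),
    "MENU": np.array([0.18, 0.76, 0.38]),
}
HOVER_RADIUS = 0.015  # metres

def to_vr(leap_point):
    """Transform a point from the Leap's frame into VR world space."""
    return LEAP_TO_VR_ROT @ np.asarray(leap_point, dtype=float) + LEAP_TO_VR_POS

def hovered_button(fingertips_leap):
    """Return the label of the first button a fingertip hovers over, if any."""
    for tip in fingertips_leap:
        tip_vr = to_vr(tip)
        for label, centre in BUTTONS.items():
            if np.linalg.norm(tip_vr - centre) < HOVER_RADIUS:
                return label
    return None
```

Per frame, the tracked fingertips go through `hovered_button`, and a non-`None` result drives the contextual menu.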

Extending to non-physical interfaces:
Assuming I can get the above to work, I'd like to take this further and add the ability to call up virtual buttons in 3D space around the VR/physical hybrid interface. These would have no physical equivalent and would appear as "holograms" to the VR user. They could be pressed with the motion-tracked fingertips and located anywhere in the Leap's detection field, which should cover a reasonable amount of space around the default hand placement. Since these menus can be completely contextual, you'd probably only need a few buttons at any given time, allowing them to be pretty big and not too hard to hit consistently. They would have no haptic feedback... unless you can implement some crude form of it through a "rumble pack" (think vibrating game controller) around the user's wrist, or even through their elbow if it's resting on the chair arm.

As of today this is nothing but ideas, especially since I do not own a Leap Motion yet. That device and its capabilities are the biggest unknown to me at the moment. I'd love to hear from anyone with a bit of experience with the Leap whether what I am describing here seems feasible with that piece of technology, or whether I'm daydreaming. I'd also of course love to know what you think about the idea in general, independently of hardware. Are there such implementations in place already somewhere? Would you find them useful? What would you want from such a UI?

2 Replies

  • I like your ideas, but as a Leap developer let me clarify something: the Leap does not perform well upside down. What I mean is that if the background is close behind the hand, the current Leap software cannot determine your fingers' locations, because it can't separate your hand from the background very well. Future SDK versions plan to improve on this.

    That's not to say we can't use many of your ideas; they are very good! I suspect you will be able to do the in-air "tool tip" by placing the Leap at the crest of the wrist rest, right before the joystick. If you hold your hand above that, the Leap's wide field of view should be able to see all of your hand and each of your fingers up to a couple of inches above each button. Experimentation is key here, so you'll find out what works best when you get hold of your device in two weeks.

    The second idea is also very good, but it is severely hampered by the fact that there is no positional tracking on the Rift yet. Your Rift does not know where it is relative to the Leap, so it is difficult to get all of that lined up and feeling natural. I've been working on a positional tracking system, but I'm tempted to wait until Sixense gets their solution on the market.

    I've also experimented with other forms of Leap/Rift integration that sidestep the positional tracking problem.
  • Butters
    Honored Guest
    Thank you for the very informative answer, and the kind words :)
    I didn't think of the fact that the Leap needs significant z-depth behind the fingers to track them correctly, but it makes sense that it wouldn't work well with your fingers directly above a surface like that of the controller. I hope further versions of the Leap SDK can mitigate the problem, but it sounds like there's some level of technological mismatch for this application... Oh well, I've ordered one anyway, so I'll play with it regardless. Positioning it on the controller itself, looking up or horizontally, sounds difficult without destroying the ergonomics of the 3D mouse (you do need the palm rest to, well, rest your palm).
    Maybe an optical tracking solution would be more appropriate? A ghetto solution would be to paint each nail of the left hand a different bright color, or to use a colored glove. That's certainly not the most user-friendly approach, but it could be a start. I assume visual color-based finger tracking will be mostly 2D (little to no depth perception), but it could still give access to "holographic" simulated interfaces around the physical device with the right camera angle (maybe looking over your shoulder?). There was an interesting prototype of a "sixth sense device" (no relation to Sixense) a while back that did something similar with projected interfaces and colored finger markers. The problem is that if you can't reflect depth properly in the virtual fingers' positions, it's going to look very weird to the user (who can see depth with the Rift). Other finger-tracking solutions, like datagloves and multi-camera optical systems, seem out of the question here: you don't want to wear a glove while interacting with a physical device, it would be cumbersome; and the whole point of the system I'm planning is to keep costs down, so fancy expensive trackers would kind of ruin it.
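The painted-nail idea boils down to finding the 2D centroid of marker-coloured pixels in each camera frame. A toy sketch (numpy only; a real system would threshold in HSV and filter noise, and the colour values here are made up):

```python
# Sketch: locate a coloured fingernail marker in an RGB frame by finding
# the centroid of pixels near a known marker colour.

import numpy as np

def marker_centroid(frame_rgb, target_rgb, tol=30):
    """Mean (row, col) of pixels within `tol` of the target colour, or None."""
    frame = np.asarray(frame_rgb, dtype=int)
    # Per-pixel distance to the target colour (max over the three channels).
    dist = np.abs(frame - np.asarray(target_rgb, dtype=int)).max(axis=-1)
    ys, xs = np.nonzero(dist <= tol)
    if len(ys) == 0:
        return None  # marker not visible this frame
    return float(ys.mean()), float(xs.mean())
```

Each nail colour gets its own call per frame, which yields the mostly-2D tracking described above.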

    On the other hand (no pun intended), it's great to see in your Rift+Leap integration how quickly finger tracking picks up when fingers enter and leave the effective tracking field. It looks very responsive.
    Great work on your visual positional tracking too. It also looks much more responsive than I would have expected. I'd still wait for the Oculus and/or Sixense solution, as I suspect it will be good enough and become a de facto standard.

    EDIT: Thinking a bit more about your Leap-in-the-palm-rest idea, it makes much more sense than I initially gave it credit for. I seem to have underestimated the field of view of the Leap (advertised at 150 degrees: is that a realistic figure? In both directions?). Maybe I can slot the Leap into the forward-facing slope of the palm rest, raising it a bit if necessary (or ditching the Leap Motion's plastic case to save space). That should get all but a couple of buttons into the field of view, and leave it oriented just right for floating interfaces.
    Did you, or could you, test the Leap Motion's performance when it is located directly under your palm, angled about 30 degrees up from the direction of your forearm? Maybe placed at the lower edge of a keyboard or something?
    That's something I need to add to my list of tests for when my unit arrives.
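For an angled mount like the one described above, a fixed tilt can be compensated in software with one constant rotation of the Leap's samples about the pitch axis. A sketch, with the ~30 degree tilt assumed rather than measured:

```python
# Sketch: undo a fixed mounting tilt by rotating Leap samples about the
# x (pitch) axis back into a desk-aligned frame. The angle is an assumption.

import numpy as np

TILT_DEG = 30.0  # assumed mounting tilt of the Leap, degrees

def untilt(leap_point, tilt_deg=TILT_DEG):
    """Rotate a Leap-frame point back into a desk-aligned frame."""
    t = np.radians(tilt_deg)
    rot_x = np.array([
        [1.0, 0.0,        0.0       ],
        [0.0, np.cos(t), -np.sin(t)],
        [0.0, np.sin(t),  np.cos(t)],
    ])
    return rot_x @ np.asarray(leap_point, dtype=float)
```

Since the mount is rigid, the matrix is computed once and applied to every tracked point, so the correction costs essentially nothing per frame.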