Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
zalo
Explorer
12 years ago

Tutorial: Rift Mounted Leap Skeletal Tracking in Unity

This is meant as a quick start guide for getting the new Skeletal SDK up and running for Rift Mounted Leap Controllers. Hopefully this will enable developers to experiment with radical new modes of gestural input; I'm excited to see what everyone does with it!

After you (use tape or superglue to) affix your Leap sensor to the front and center of your headset, watch this video to see how to get the integration working:



1. Get the Skeletal SDK: https://developer.leapmotion.com/downloads/skeletal-beta

2. Download the Quickstart File: https://app.box.com/s/n6i7uhpn1uy83q0hg8u5

3. Import all of the files into your Oculus project

4. Attach the HandController prefab in the Assets directory to the Right Camera in your Oculus Rift Camera.

5. You're done!

NOTE: Make sure "Optimize for Top-Down Tracking" and "Auto Orient Tracking" are ticked in your Leap tracking settings. Tracking is also spotty if you're looking at your desk, so try looking (sideways or backwards) into open space and you will get much better tracking results.

Check out the other Leap examples included in the Quickstart file for ideas and code.

7 Replies

Replies have been turned off for this discussion
  • Awesome, I was just about to start working on something like this in a couple of days. Thanks!
  • Nice. Looks pretty simple.

    How do you feel about the Leap? I just tried the v2 SDK (with the device on a desk) and the tracking still left much to be desired. Especially when my hand was thumb up or palm up, the tracking seemed to fail.
  • I also integrated the v2 SDK with my Rift project last week and also found the tracking to be a little erratic. I posted in the v2 thread on the Unity forums asking whether the SDK announced today was different from the v2 SDK I downloaded last week, because the videos on all the major news sites show better tracking than I could achieve by just running the demos that came with the SDK.
  • Like I mentioned in the post, you have to look away from your desk to get really good tracking. Even with the "Optimize for Top-Down Tracking" checkbox enabled, it was nearly useless while facing my desk and monitor. Facing away into open space, the reliability of gestures went from 50% to 90%. So it will be really useful for things like navigating menus or fine spatial interactions.

    When it's on your table, you just have to make sure there's nothing in your environment above your hands, so it can get a really good silhouette (the light it emits is really bright). My desk is under my bunk bed, so sometimes the light from the Leap reflects off my bedsprings and causes a little trouble; putting a towel up there fixed the issue. (Interestingly, even black towels appear bright white under infrared.)


    Another trick I recently learned is that you can detect when the thumb is touching the side of your hand, and have that activate an event on your index finger. So it lets you move around with precision and click items on the screen. This gesture is extremely reliable.
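    The thumb-click idea above can be sketched in a few lines. This is a hypothetical, language-agnostic mock-up of the logic, not Leap API code: the hand data is plain coordinate tuples, and names like `thumb_tip` and `index_mcp` (the index finger's base joint, used to approximate the side of the hand) are illustrative assumptions.

    ```python
    import math

    def distance(a, b):
        """Euclidean distance between two 3D points (in millimeters)."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def is_click(thumb_tip, index_mcp, threshold_mm=25.0):
        # Treat the gesture as a "click" when the thumb tip comes within
        # a small threshold of the side of the hand, approximated here by
        # the index metacarpal joint. The 25 mm threshold is a guess to tune.
        return distance(thumb_tip, index_mcp) < threshold_mm

    # Open hand: thumb far from the index base -> no click.
    print(is_click((-60.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # False
    # Thumb tucked against the side of the hand -> click.
    print(is_click((-10.0, 5.0, 0.0), (0.0, 0.0, 0.0)))  # True
    ```

    In a real Unity script you'd feed in the corresponding joint positions from the Leap frame each update and fire the event on the rising edge (False -> True) so one touch produces one click.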


    The current version ends in 15831, so check whether your version is below that.
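    If you want to check the build number programmatically, something like this works; the version-string format shown (a trailing build number like "…+15831") is an assumption for illustration.

    ```python
    import re

    def build_number(version: str) -> int:
        """Extract the trailing build number from a version string."""
        match = re.search(r"(\d+)\s*$", version)
        if not match:
            raise ValueError(f"no trailing build number in {version!r}")
        return int(match.group(1))

    def is_outdated(version: str, current_build: int = 15831) -> bool:
        # Compare against the build the poster mentions (15831).
        return build_number(version) < current_build

    print(is_outdated("2.0.1+15831"))  # False
    print(is_outdated("2.0.0+15012"))  # True
    ```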
  • I've been able to give this a shot today, and thanks again for the quick setup guide!
    I'm just having one issue where my hands always show up upside down (palms facing up when they're actually facing down). Is there something I should adjust to invert the direction? (Note: I've already checked the "Optimize for Top-Down Tracking" checkbox in the control panel.)

    Thanks for any help!
  • Try the newest SDK version; their latest update focused on making hands initialize in the right pose and it works pretty well now.