Forum Discussion
dudedudeizkool
11 years ago · Explorer
Why LEAP did not succeed & what Oculus should learn from it
This is something I realised from the very first Leap Motion demo I saw, and it is the reason I feel it didn't succeed. In a nutshell, I'll loosely quote Steve Jobs:
It's not good enough to be great; it has to be ridiculously awesome.
The main thing is expectations. I know there are a lot of people, like a friend of mine, who instantly dismiss a technology if it does not meet their expectations. In short: reality. It has to be ridiculously awesome. Latency and the sense of presence are important, but equally important is the input, specifically hand input, because in reality it is, apart from seeing, the way we interact with the world.
The Leap failed because it was a glorified "air mouse". The motion was relative to the controller, not the person's eye. You moved your hand to the right until you reached the icon and stopped, instead of moving your hand relative to your eye until your fingertip, the icon and your eye line up. That is the natural way of interacting: hold your hand out and move your head, and where you are pointing changes. So this "natural" hand input was hardly natural at all.
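To make the distinction concrete, here is a minimal sketch of the two pointing models. All coordinates, scales and function names are made up for illustration; they are not from any real Leap or Oculus API.

```python
# Hypothetical sketch of the two pointing models, in illustrative
# eye-space coordinates (metres); not based on any real API.

def controller_relative(hand, sensor_origin, scale=4.0):
    """'Air mouse': the cursor is just the scaled hand offset from
    the sensor -- moving your head changes nothing."""
    return (scale * (hand[0] - sensor_origin[0]),
            scale * (hand[1] - sensor_origin[1]))

def eye_relative(eye, fingertip, screen_z):
    """Cast a ray from the eye through the fingertip and intersect
    the UI plane z = screen_z; the hit point is where you point."""
    dx = fingertip[0] - eye[0]
    dy = fingertip[1] - eye[1]
    dz = fingertip[2] - eye[2]
    t = (screen_z - eye[2]) / dz   # ray parameter at the UI plane
    return (eye[0] + t * dx, eye[1] + t * dy)

# Same fingertip, but the head shifts left: the pointed-at spot moves.
p_still = eye_relative(eye=(0.0, 0.0, 0.0), fingertip=(0.1, 0.0, 0.5), screen_z=2.0)
p_moved = eye_relative(eye=(-0.1, 0.0, 0.0), fingertip=(0.1, 0.0, 0.5), screen_z=2.0)
```

With `controller_relative`, only hand motion moves the cursor; with `eye_relative`, `p_still` and `p_moved` differ even though the hand never moved, which is exactly the head-coupled behaviour described above.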
This is where Oculus should really be attentive. If I cannot reach out and grab something, it's not natural, and as disheartened at mankind as I am about it, many, many people will fault Oculus on this alone. It's not good enough to be great; it has to be ridiculously awesome! What will Oculus have that Gear VR or Vive won't?
Get a deal with Leap, get it integrated, get those hands on screen. You have head tracking now, so you know where the user's hands are in relation to their head (eyes); let us reach out into this reality. In any movie with VR, the first time anyone puts the thing on, where do they look? They look around, and/or they hold up their hands and look at them in VR.
I wish Oculus all the best and a great start as a pioneer of the likes the tech industry hasn't seen since the birth of Windows. I truly look forward to buying a Rift and... :-P holding up my hands and looking at them.
many kind regards
Peter
12 Replies
- BlackFang (Honored Guest): It sounds like your problems with the Leap all come from bad programmers integrating it into games. It shouldn't be too hard to mount the Leap onto the front of the Rift and do some math to ensure things line up correctly. I also don't see how any of it applies to the Oculus Touch.
- nosys70 (Expert Protege): The problem with the Leap is that it is made to lay flat next to your keyboard.
Neither the position nor the distance is optimal if you mount it on the Rift, hence the average results.
It is a great thing that Leap was open enough to accept a different use of the device, but it is still not designed for this.
I think Leap now has a bright future, since Oculus obviously dropped the ball on that side.
They just need to launch a device that is made for the Rift.
- dudedudeizkool (Explorer): At the time of writing I wasn't aware of the Oculus Touch, nor aware this forum was exclusively for the Oculus Touch. BUT it was good to see an input device in the works that is more natural than an Xbox controller :-)
And yes, getting things to line up is not hard. Any head tracking, even a webcam, would do the trick after calibration. There are prebuilt libraries for Compiz Fusion and the like, but it's all a little hacked in there. For the commercial Leap, I understand why a single sleek, small device had its appeal, but the lack of head tracking is the make-or-break feature. All the reviews of the Leap say "kudos for an awesome try, but just not there yet", and it would be a shame to have the Rift reviewed that way.
So in general, let's say: hand tracking and interaction made possible by devices like the Leap, for example. A "VR glove" would obviously also do just fine. I know Leap's product not only lacks hand-eye coordination, but also that the camera can't see a vertical hand, since it captures only one plane. The Leap Motion's volumetric field of view should be fine in first person, as it doesn't really matter where your hands are when they are off screen. But if you have a VR environment counterpart, that would kind of require a sort of STEM system or another way to project your full body. Awesome, but for a start, good, seamless, natural interaction in an FPS situation is great. Things like the Virtualizer are another topic, and probably beyond the scope of most, living-space and cost wise.
I would say, in terms of launch controllers: "reliable full-digit tracking by glove or otherwise" > Oculus Touch > current buggy Leap > Xbox controller.
- alexcolgan (Protege): Lead writer from Leap Motion here; just thought I'd clarify a few points :D
"BlackFang" wrote:
It sounds like your problems with the leap all come from bad programmers integrating it into the games. Shouldn't be too hard to mount the leap onto the front of the Rift and do some math to ensure things line up correctly.
You're right, the math is pretty straightforward -- the real challenge is on our end, because the tracking software (a) makes assumptions based on how it thinks your hands and arms will be oriented, and (b) becomes less reliable when it has to work through lots of background interference (which is way more likely with a head-mounted device than if it's pointed up at the ceiling). Fortunately, we've gotten a lot better at it even in the last few months, and there's a lot of room to grow.
"nosys70" wrote:
the problem with leap is it is made to lay flat next your keyboard.
neither position or distance are optimal if you mount it over the rift, so the average result.
We actually designed it to potentially work along a couple of different orientations, but it's true that the device is general-purpose. We have specially designed prototype modules for VR that have some significant advantages over the peripheral -- including massively increased FOV, higher-res cameras, and wider inter-camera distance -- but ultimately the key to a really compelling and natural user experience lies in our efforts on the software side.
Our goal is to get it to the point where it can equal or outperform a human in interpreting any hand movement or pose. That's on the horizon, but we're getting a little closer with each build we release.
- Madaras (Expert Protege): Keep up the good work, Leap. We can't do VR without hard-working peeps like yourselves.
- dudedudeizkool (Explorer):
"Madaras" wrote:
Keep up the good work LEAP. We can't do VR without hard working peeps like yourselves.
I totally agree; innovating a new product is not without challenges, and I very much respect the work put in by anyone in a cutting-edge field. Even more so that people stick with it and don't simply write it off if it doesn't work out perfectly the first time.
I followed Leap before its release for AGES; I was pretty excited and had saved up the money to buy one. But alas :( I didn't. I still would, though. I am a fan of Leap. In fact, I even emailed Leap to ask about the hand-eye coordination "problem". The support person didn't understand at all and referred me to Oculus. I'm guessing head tracking means an instant referral here :-P
Leap for VR and Leap for desktop are indeed pretty different. A static white ceiling definitely produces less noise; I didn't really consider that. Though that doesn't change the shortcoming of seeing fingers only on a horizontal plane, or that when you use it in a keyboard setup you are not actually POINTING at the mouse pointer on screen, which in fact I find more important than reliable ten-finger tracking. As it stands, you can't use a Leap without a mouse pointer. You can't point at an icon.
Leap for VR doesn't actually need the finger to line up with anything; there is no fixed monitor. But it does need to know where your hand and digits are relative to your head, which (if one stood in front of a white wall) should not be too difficult.
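A back-of-the-envelope sketch of that hand-to-head bookkeeping, assuming the head pose comes from the headset's own tracking. This is a flat 2D case with made-up frame names, purely for illustration.

```python
import math

def sensor_to_world(hand_in_sensor, head_pos, head_yaw):
    """Rotate the head-mounted sensor's (x, z) hand reading by the
    head's yaw, then add the head's world position (2D for clarity)."""
    x, z = hand_in_sensor
    c, s = math.cos(head_yaw), math.sin(head_yaw)
    return (head_pos[0] + c * x + s * z,
            head_pos[1] - s * x + c * z)

# Hand 0.4 m straight ahead of the sensor, head turned 90 degrees:
hand_world = sensor_to_world((0.0, 0.4), head_pos=(0.0, 0.0),
                             head_yaw=math.pi / 2)
```

The point is simply that once head position and orientation are known, placing the sensor's hand reading in world (and therefore eye-relative) space is one rigid-body transform; without head tracking, that transform is unknown and drifts every time the head moves.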
You probably could calibrate it for one-time use, but without head tracking you would need to recalibrate every time you moved your eyes in volumetric space (not just your viewing direction).
- mptp (Explorer): If we (the tech community at large) can manage to get IR depth sensors small, cheap, light and power-efficient enough, we'll have the hand tracking of our dreams, no problem:
Mount two to the top and bottom of a controller like the Oculus Touch / STEM / Vive controllers (with industrial design modifications, obviously), such that they can see the fingers in all orientations, whether the hand is angled up or down. You already know the wrist position from however the main controller is being tracked, and the IR depth cameras give you the finger skeletal positions with respect to the wrist.
Tada -- perfect optical finger tracking in world space with no occlusion issues. :)
- dudedudeizkool (Explorer):
"mptp" wrote:
If we (the tech community at large) can manage to get IR depth sensors small, cheap, light and power efficient enough, we'll have the hand tracking of our dreams no problem:
Mount two to the top and bottom or a controller like Oculus Touch / STEM / ViveControllers (with industrial design modifications obviously) such that they can see the fingers in all orientations, if the hand is angled up or down. You already know the wrist position from however the main controller is being tracked, and the IR depth cameras give you the finger skeletal positions in respect to the wrist.
Tada - perfect optical finger tracking in world space with no occlusion issues. :)
;) Tada. It would be awesome if it were that simple.
I've speculated with the idea of a sort of multi-camera system. The Vive uses two cameras in the corners of the room, I'm guessing a lot like the Kinect (but that won't do for hands), and you could augment the whole skeleton with something like a Leap device. But that still has the disadvantage that the simulation would not know what your fingers are doing outside of the Leap's field of view; hence dropping an object while not looking at it would be impossible. In this respect the Oculus Touch actually does a good job with the pinky trigger.
The Oculus Touch, with its IR emitter array, in a glove form could be great combined with an array of cameras (I'm thinking two would do for the x and y planes, z being calculated). But I have friends who absolutely hate the idea of having to wear any peripherals (especially the kind that prevents them eating chocolate :-D). I work with gloves a lot, so I don't have a problem with them.
- MrKaktus (Explorer):
"dudedudeizkool" wrote:
the vive uses 2 cameras in the corners of the room, im guessing alot like the kinect.
That's not correct, in fact. Those are base stations that emit IR. Each controller and HMD tracks its own position in space using sensors located on it. The time difference between the IR hitting each sensor, plus the sensors' fixed locations relative to each other, allows each device to position itself in 3D.
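A toy 2D sketch of that timing-to-position idea. The sweep rate and geometry here are illustrative only; real Lighthouse base stations sweep on two axes and the devices solve a full 6-DoF pose from many photodiodes, not a single 2D intersection.

```python
import math

SWEEP_HZ = 60.0  # illustrative rotor speed: one full sweep per 1/60 s

def sweep_time_to_angle(dt):
    """Time between the sync flash and the laser hitting a photodiode,
    converted to the sweep angle in radians."""
    return 2.0 * math.pi * SWEEP_HZ * dt

def triangulate(base_a, angle_a, base_b, angle_b):
    """Intersect the two rays from known base-station positions (2D)."""
    ta, tb = math.tan(angle_a), math.tan(angle_b)
    x = (base_b[1] - base_a[1] + ta * base_a[0] - tb * base_b[0]) / (ta - tb)
    y = base_a[1] + ta * (x - base_a[0])
    return (x, y)

# Stations at (0, 0) and (2, 0); a sensor at (1, 1) is swept at
# 45 and 135 degrees respectively:
pos = triangulate((0.0, 0.0), math.atan2(1, 1), (2.0, 0.0), math.atan2(1, -1))
```

The key property, as the reply notes, is that the computation runs on the tracked device itself: the base stations are "dumb" emitters, and each headset or controller turns hit timings into angles and angles into its own pose.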
"mrkaktus" wrote:
"dudedudeizkool" wrote:
the vive uses 2 cameras in the corners of the room, im guessing alot like the kinect.
That's not correct in fact. These are base stations that emit IR. Each controller and HMD is tracking it's own position in space using it's sensors located on it. The time difference between IR hitting the sensor and their fixed location against each other allows each device to position itself in 3D.
my bad.