Forum Discussion
xhonzi
11 years ago · Honored Guest
Leap VR Passthrough with Hand Isolation?
I couldn't find an appropriate Leap VR forum, so I'll try this in the Oculus forum.
Has anyone gotten the Leap Passthrough/Isolated Hands stuff to work?
I have the demo and examples, and they work fine. I see that the hand controller is set up to use no graphics model and the RigidHand physics model. I see the quads on each camera, and they are set to layer out the other eye's quad.
I see the quads are using a DistortMat material with the LeapMotion/LeapUndistorted shader. They are running the LeapImageRetriever script, and apparently have a dead link to a missing script (which might be important, might not; the demo runs as I expect it to).
At runtime, something is causing the quads to change depths as the tracked hands move about in Z space. I'm not sure which script is doing that, though it seems to be line 157 of the ImageRetriever script.
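For illustration, the depth behaviour described above could be something like the following minimal Unity sketch (the class, field names, and the distance heuristic are my assumptions, not the actual LeapImageRetriever code):

```csharp
using UnityEngine;

// Hypothetical sketch: keep a passthrough quad at the depth of the nearest
// tracked hand so the camera image lines up with the physical hands.
// The field names and the scaling heuristic are assumptions for illustration.
public class PassthroughQuadDepth : MonoBehaviour
{
    public Transform trackedHand;       // e.g. the palm of a RigidHand
    public float defaultDepth = 1.0f;   // metres in front of the eye camera

    void LateUpdate()
    {
        // Use the hand's Z distance from the camera while a hand is tracked,
        // otherwise fall back to a fixed depth.
        float depth = trackedHand != null
            ? Mathf.Max(0.1f, trackedHand.localPosition.z)
            : defaultDepth;

        // Move the quad and scale it with depth so it keeps filling the
        // eye camera's field of view.
        transform.localPosition = new Vector3(0f, 0f, depth);
        transform.localScale = Vector3.one * depth;
    }
}
```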
When I apply everything I have observed to my own project, I can get any model of hands in front of me, so I know that hand tracking is generally working... but when I switch it to no graphics model, I don't see anything. The quads are getting a picture drawn to them, but it's not flesh coloured, it's not transparent (hands not isolated), and it doesn't move/scale along with hand movement, so it's not placed in front of the camera.
So... what gives?
8 Replies
- alexcolgan (Protege): Hey, I'd like to investigate this for you. Can you let me know which versions of the following you're using:
- Oculus SDK
- Leap SDK
- Unity (Free or Pro, Mac or Windows)
- Unity assets (most recent ones are built for Leap v2.2.0; there's a version for Oculus v0.4.3 and another version for v0.4.4)
- xhonzi (Honored Guest): Wow, you replied right away and somehow I missed it. Sorry about that.
So, I will confess to using early versions of some of this stuff, but I will first point out: it works in the demo scene. So if I could get the same functionality as the demo into my game, I would be very happy. I'm working on a game that is near release and has already locked down its Unity version, so that's my alibi for not upgrading. And since I can't upgrade, I can't use the Pro version of the Leap assets, since they are version constrained. Though they do seem to work fine once I import them.
Oculus 0.4.4
Leap App 2.1.6-e2f5a4
Leap SDK 2.1.5
Unity Pro Windows 4.3.3f1
and
LeapMotionVRAssets_Free from https://developer.leapmotion.com/downloads/unity (not sure what version, but the latest as of 10 Dec).
Everything seems to work fine in the demo. There is the missing script, but I can't tell that any functionality is missing. ImageRetriever works in the demo.
ImageRetriever doesn't seem to work when I put it in my game. Not sure why. No script errors, just not the expected functionality.
- xhonzi (Honored Guest): Seems the missing script is ImageRetrieverTypes.cs. It's just a script that watches for key presses and changes some of the settings on the other ImageRetriever component.
The script (at least in some packages) is in DemoResources/Scripts, which wasn't in the VR Passthrough package I downloaded/copied from, so no big loss there.
- alexcolgan (Protege): Thanks, I've flagged the issue with our Unity team and will get back to you. In general, it's worth noting that we can't guarantee proper behavior with the OVR plugin before Unity 4.5.5 (which is when Unity extended full Oculus support).
Update: We're not able to reproduce the issue; after double-checking the assets, ImageRetrieverTypes.cs is located in Assets/LeapMotion/Scripts.
Is it possible for you to update the Leap SDK to match the assets? You should be able to update the SDK and use our Unity Free assets: https://developer.leapmotion.com/downloads/unity Again, we can't guarantee that it will work with Unity versions before 4.5.5, but we think it should still work as long as your project doesn't use Widgets.
- alexcolgan (Protege): Another general troubleshooting tip worth noting for issues like this is to ensure that Allow Images is checked in the Leap Motion settings, and to check that images are displaying in the Visualizer.
Also, be sure to set "Overlay Image" properly: if set to true, it will obscure the entire scene, while if set to false, it will be a background to all other objects in the scene.
- xhonzi (Honored Guest):
"alexcolgan" wrote:
Again, we can't guarantee that it will work with versions before Unity 4.5.5, but we think it should still work as long as your project doesn't use Widgets.
Haven't tried any widgets yet. I do understand there are no guarantees- thanks for looking into it anyhow. I will update the SDK and get back to you.
What I think I need to know is this: What are the basic steps for making Isolated Hand passthrough work? I think I'm just missing a step.
1. Add LeapMotion assets to the scene.
2. Add HandController to CenterEyeAnchor; set the graphics model to None and the physics model to RigidHand.
3. Add quads to the Right and Left cameras at 0,0,0; set them to layers that are excluded by the other eye's camera.
4. Add the LeapImageRetriever script to the quads.
5. Set the material to DistortMat with the LeapMotion/LeapUndistorted shader.
6. ... ?
7. Profit?
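As a reference point, steps 2–5 above can be sketched programmatically (a minimal sketch under my own assumptions: in practice these are editor-side settings, and the "LeftEyeOnly"/"RightEyeOnly" layer names are hypothetical and would need to exist in the project's Tag Manager):

```csharp
using UnityEngine;

// Hypothetical sketch of the setup steps above: create a passthrough quad
// per eye and hide each quad from the other eye's camera via layer culling.
public class PassthroughSetup : MonoBehaviour
{
    public Camera leftEyeCamera;    // e.g. the camera under LeftEyeAnchor
    public Camera rightEyeCamera;   // e.g. the camera under RightEyeAnchor

    void Start()
    {
        // Layer names are assumptions; define them in the Tag Manager first.
        CreateEyeQuad(leftEyeCamera, rightEyeCamera, "LeftEyeOnly");
        CreateEyeQuad(rightEyeCamera, leftEyeCamera, "RightEyeOnly");
    }

    void CreateEyeQuad(Camera owner, Camera other, string layerName)
    {
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.transform.SetParent(owner.transform, false);
        quad.transform.localPosition = Vector3.zero;   // step 3: at 0,0,0

        // Step 3: put the quad on a layer the other eye's camera culls out.
        int layer = LayerMask.NameToLayer(layerName);
        quad.layer = layer;
        other.cullingMask &= ~(1 << layer);

        // Step 5: DistortMat-style material using the LeapUndistorted shader.
        Shader shader = Shader.Find("LeapMotion/LeapUndistorted");
        quad.GetComponent<Renderer>().material = new Material(shader);
    }
}
```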
I do have Allow Images / Overlay Image set up correctly; the Isolated Hands scene works for me just fine, I just can't get it to run in my own game. I do get the robot hands if I enable them, so I know the hand controller is set up correctly.
Same Rift, same Leap, same Unity, same code...
- xhonzi (Honored Guest): Okay,
To get the current Leap VR SDK/Assets to run in old Unity, I had to comment out some of the UI code:
```csharp
// ScrolTextBase.cs — lines commented out to build under Unity 4.3
// (UnityEngine.UI and ScrollRect only exist in Unity 4.6 and later):

// line 2:
// using UnityEngine.UI;

// lines 87-90:
// if (Mathf.Abs(content.transform.parent.GetComponent<ScrollRect>().velocity.y) > 0.001f)
// {
//     content.rigidbody2D.velocity = Vector2.zero;
// }
```
And then it was running.
Next- I am sorry for possibly misleading everyone here (myself included) but the thing I'm most interested in duplicating and the thing I've copied from is not actually a scene in the Leap VR assets, but this one from git: https://github.com/zalo/LeapIsolatedHands
As seen working here: https://www.youtube.com/watch?v=c8uZHaoZl_w
So... passthrough with isolated hands. I'm not sure this git content still works with the latest Leap VR assets- since quite a bit has changed.
EDIT: Upon reflection (the mental kind, not the programming kind), I realized that not all LeapImageRetriever scripts have been created equal. The one from git was modified to do the isolation. So... I'm going to try to merge the two versions... wish me luck.
- xhonzi (Honored Guest): Okay, my merge of the latest LeapImageRetriever with the one from Isolated Hands on git had poorer performance than this more recent fork of the file I also found on git: https://github.com/leapmotion-examples/LeapIsolatedHands/blob/master/Assets/LeapMotion/Scripts/Utils/LeapImageRetriever.cs
The problem is that they are both very slow. :( I didn't record exact measurements in ms, but anytime my hands were present in the scene, my 75 fps game slowed down to 15 fps. By comparison, the robot or human hand models have very little impact.
I used the deep profiler, and it seems that the maths in the EncodeDistortion method are too cumbersome to run every frame. The BlobFind maths are also very slow whenever hands are present.
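For anyone picking this up later, the hotspots can be confirmed without the deep profiler by bracketing the suspect calls with named profiler samples. A minimal sketch (EncodeDistortion/BlobFind are the method names mentioned above; the surrounding class and the commented calls are my assumptions):

```csharp
using UnityEngine;
using UnityEngine.Profiling; // in old Unity 4.x, Profiler lives directly in UnityEngine

// Sketch: wrap the expensive image-processing calls so they show up as
// named samples in the Unity Profiler. The encodeDistortion/blobFind calls
// stand in for the methods in the isolated-hands LeapImageRetriever fork.
public class RetrieverProfiling : MonoBehaviour
{
    void Update()
    {
        Profiler.BeginSample("Leap.EncodeDistortion");
        // encodeDistortion();   // the per-frame distortion maths
        Profiler.EndSample();

        Profiler.BeginSample("Leap.BlobFind");
        // blobFind();           // the hand-isolation blob search
        Profiler.EndSample();
    }
}
```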
So... I'm abandoning this for now. Perhaps someone with more brains will be able to optimize the code.
P.S. I did rename this isolated version of ImageRetriever to ImageIsolator so that both could coexist in the same utils folder without conflicting filenames/namespaces, and so that the difference between the isolating version and the stock retriever would be more apparent.