Forum Discussion
Anonymous
6 years ago
Hand Tracking in Editor
Has anyone got the hand tracking demos to work in editor?
Is this possible or do we currently need to build an APK in order to get this to work?
16 Replies
Replies have been turned off for this discussion
- eco_editor (Honored Guest): Do you have the hand tracking demo for Unity? Can you please post a link? Thanks!
- Anonymous: If you download the latest update of the Oculus Integration from the Asset Store, you can find the example scenes in the SampleFramework folder.
The 'HandsInteractionTrainScene' should be in there.
- Hanabear (Protege): I published and ran the train scene on my Quest, but the scene asked to switch to controller interaction; otherwise it just closes the demo. Any idea?
- Anonymous: I have the same problem as Hanabear; it wants to switch back to controllers, but then of course I don't see my hands in the demo.
Edit: OK, I just found out why. Simply click the OVRCameraRig in the Hierarchy, find "Hand Tracking Support", and change it to "Controllers and Hands" (and of course hand tracking must be enabled on the Quest).
- Anonymous: If you transform the position of the whole scene by just a little bit, you will have a very creepy experience :D
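As a reference, the same "Hand Tracking Support" setting can also be flipped from an editor script instead of the Inspector. This is a minimal sketch assuming a recent Oculus Integration where `OVRProjectConfig` exposes a `handTrackingSupport` field; in older SDK versions the setting lives on the OVRManager component instead, so verify against your SDK version:

```csharp
// Editor-only sketch: set Hand Tracking Support to "Controllers and Hands".
// Assumes the Oculus Integration's OVRProjectConfig API (newer SDKs);
// the menu item name is an arbitrary choice.
using UnityEditor;
using UnityEngine;

public static class HandTrackingSetup
{
    [MenuItem("Tools/Enable Controllers And Hands")]
    public static void EnableControllersAndHands()
    {
        var config = OVRProjectConfig.GetProjectConfig();
        config.handTrackingSupport = OVRProjectConfig.HandTrackingSupport.ControllersAndHands;
        OVRProjectConfig.CommitProjectConfig(config);
        Debug.Log("Hand Tracking Support set to Controllers and Hands.");
    }
}
```

This only changes the project configuration; hand tracking must still be enabled in the Quest's own settings, as noted above.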
- dilmerv (Meta Employee):
Maddy25 said:
I have the same problem as Hanabear; it wants to switch back to controllers, but then of course I don't see my hands in the demo.
Edit: OK, I just found out why. Simply click the OVRCameraRig in the Hierarchy, find "Hand Tracking Support", and change it to "Controllers and Hands" (and of course hand tracking must be enabled on the Quest).
Thanks a lot for your comment. I did exactly what you mentioned, but every time I build to the device my hands do not show. If I generate an AndroidManifest.xml to request hand permissions, I get an error in "OculusBuildProcessor.cs" at this code:

```csharp
if (attrib.Value == firstValue)
{
    var valueSibling = attrib.NextSibling;
    valueSibling.Value = secondValue;
}
```
To overcome this I removed the line below, and the VR demo builds, but I don't see my hands:

```xml
<meta-data android:name="com.samsung.android.vr.application.mode" android:value="vr_only"/>
```
I appreciate your help, thanks.
Edit: OK, I finally figured it out. For now I changed OculusBuildProcessor.cs to check whether valueSibling was null before accessing valueSibling.Value, which fixes the null reference exception. After that I changed my bundle identifier to something completely different and changed "Hand Tracking Support" as stated in the previous post.
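The null-check fix described above can be sketched in isolation with plain System.Xml. This is not the actual OculusBuildProcessor code, just a self-contained illustration of the defensive pattern: guard the sibling lookup so a missing node returns cleanly instead of throwing a NullReferenceException. The node contents and method name here are hypothetical:

```csharp
// Sketch of the defensive fix: bail out if the sibling node is missing
// instead of dereferencing null, mirroring the OculusBuildProcessor patch.
using System;
using System.Xml;

public static class ManifestPatcher
{
    // Rewrites the value of the node following the one whose value matches firstValue.
    public static bool TryPatch(XmlNode parent, string firstValue, string secondValue)
    {
        foreach (XmlNode child in parent.ChildNodes)
        {
            if (child.Value == firstValue)
            {
                var valueSibling = child.NextSibling;
                if (valueSibling == null)   // the fix: no sibling, so don't touch it
                    return false;
                valueSibling.Value = secondValue;
                return true;
            }
        }
        return false;
    }

    public static void Main()
    {
        var doc = new XmlDocument();
        doc.LoadXml("<root>first<!--marker--></root>");
        // The text node "first" has a comment sibling whose Value gets rewritten.
        Console.WriteLine(ManifestPatcher.TryPatch(doc.DocumentElement, "first", "second"));
    }
}
```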
I also ended up making a YouTube video on the dilmervalecillos channel with all the steps as a reference; everyone is welcome to check it out.
Thanks everyone for your help.
- josecatena (Honored Guest): I have managed to publish the TrainDemoScene on my Quest with Unity 2019.3 (beta) and it works fine. Does anybody know whether it is possible to use it with the new XR Toolkit, since the Oculus Android package is going to be removed in Unity 2020.x?
- Anonymous: I was able to get it to run without errors on macOS by doing what Maddy25 suggested: look in OVRCameraRig and change the "Hand Tracking Support" option to "Controllers and Hands". Messing with the AndroidManifest.xml file always causes errors for me.
If you want to use more than just your index finger to interact with the buttons, click on InteractableToolsSDKDriver in the Hierarchy, change the sizes of the Left Hand Tools and Right Hand Tools arrays to 6, and add the other FingerTipPokeTool prefabs from the folder Assets/Oculus/SampleFramework/Core/HandsInteraction/Prefabs.
If anyone can figure out how to find/use the transform of the hands (I'm looking for a centralized location on each hand), that would be greatly appreciated.
- Anonymous:
wlewis55 said:
I was able to get it to run without errors on macOS by doing what Maddy25 suggested: look in OVRCameraRig and change the "Hand Tracking Support" option to "Controllers and Hands". Messing with the AndroidManifest.xml file always causes errors for me.
If you want to use more than just your index finger to interact with the buttons, click on InteractableToolsSDKDriver in the Hierarchy, change the sizes of the Left Hand Tools and Right Hand Tools arrays to 6, and add the other FingerTipPokeTool prefabs from the folder Assets/Oculus/SampleFramework/Core/HandsInteraction/Prefabs.
If anyone can figure out how to find/use the transform of the hands (I'm looking for a centralized location on each hand), that would be greatly appreciated.
Use LeftHandAnchor/RightHandAnchor? You can also use one of the hand bone transforms via GetComponent<OVRSkeleton>().Bones[index].Transform. If you want a single bone, you may like Bones[9], which gives the root of the middle finger.
- deprecatedcoder (Honored Guest): I don't think any of these responses actually answer the original question, which was whether hand tracking would work in the editor; I assume that means with Oculus Link while in Play Mode. Unfortunately, it seems the answer right now is no, unless someone can demonstrate otherwise. It would really be useful.
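The bone-lookup approach suggested above can be sketched as a small component. This assumes the Oculus Integration's OVRSkeleton/OVRBone API (note the capitalized Transform property on OVRBone) and uses the wrist root as the "centralized" hand position; bone IDs can differ between SDK versions, so verify against yours:

```csharp
// Sketch: read a central transform for a tracked hand via OVRSkeleton.
// Assumes the Oculus Integration API; the field is meant to reference the
// OVRSkeleton on the hand object under LeftHandAnchor/RightHandAnchor.
using UnityEngine;

public class HandCenter : MonoBehaviour
{
    [SerializeField] private OVRSkeleton handSkeleton;

    void Update()
    {
        if (handSkeleton != null && handSkeleton.IsDataValid)
        {
            // Hand_WristRoot is a reasonable "center" of the hand;
            // Bones[9] (middle-finger root) works too, as suggested above.
            var wrist = handSkeleton.Bones[(int)OVRSkeleton.BoneId.Hand_WristRoot].Transform;
            Debug.Log($"Wrist at {wrist.position}");
        }
    }
}
```

Guarding on IsDataValid matters because the bone list is only meaningful while the skeleton is receiving valid tracking data.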