05-23-2024 07:34 AM - edited 05-23-2024 07:46 AM
Hello! Can you please tell me how to set up controller-driven hands on Quest 2? Adding the OVRControllerDrivenHands component to OVRInteractionComprehensive did not help. Hands are not displayed in the example scenes either. I tried SDK versions 65 and 62.
05-25-2024 01:40 AM
Hello, I'm just exploring those features right now myself. These are my suggestions (after days of real desperation using SDK v65):
1) Start a brand-new project in editor 2022.3.29 (3D Built-In Core); do not choose the VR or MR preset.
2) Download the Meta XR All-in-One SDK package, but don't install the v65 version; choose History -> v64 instead.
3) Download the Meta XR Interaction SDK Samples (important: always install the v64 version).
4) Now import the samples from the Meta MR Utility Kit or the Meta XR Interaction SDK samples. Even the first BASE sample has a complete working hands example on Quest 2.
Other combinations (using v65, the XR setup, or starting from the MR or AR preset) each had different issues.
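If you install via UPM instead of clicking through the Package Manager UI, steps 2 and 3 amount to pinning the packages to v64 in Packages/manifest.json. A minimal sketch (the package ids and version string below are my assumption of what the Meta registry uses; verify the exact names in your own manifest after installing once through the UI):

```json
{
  "dependencies": {
    "com.meta.xr.sdk.all": "64.0.0",
    "com.meta.xr.sdk.interaction.ovr.samples": "64.0.0"
  }
}
```

Pinning exact versions this way also stops the Package Manager from silently upgrading you back to v65 later.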
Good luck
05-28-2024 10:31 AM
I was trying to follow that guide too and it wasn't working for me either, so I just used the deprecated OVRControllerHands instead, and that worked. I hope to switch over to the new OVRControllerDrivenHands once they fix it.
06-25-2024 08:00 AM
Hi. In my case the hands are not shown either when I use controllers with OVRControllerDrivenHands. However, when I build for Android I can see my hands inside the headset. It's pretty frustrating, but it works in the .apk.
06-25-2024 09:05 PM
Just got v66 of the SDK for Unity and it's still broken.
I understand it works when making a full build, but if I can't use it to debug in play mode then it's basically unusable for anything other than a quick tech demo or prototype. I wish they would at least indicate whether they are aware of the issue or plan to fix it; otherwise I'd rather cut my losses with the Meta SDK and switch to the XR Interaction Toolkit.
08-22-2024 03:08 PM
Hi all! I'm an engineer on the Interaction SDK team. We're sorry to hear this is still an issue for you; we're working to improve our docs to prevent this going forward.
In the meantime, can you confirm that the "Settings > Beta > Developer Runtime Features" setting is enabled in your Meta Quest Link app on your development PC? If not, does enabling it fix the issue for you?
08-22-2024 11:10 PM
Yes, I have Developer Runtime Features enabled and I still have this problem.
08-23-2024 07:31 AM
Okay, good, thanks for confirming! Next quick thing to try (to help narrow down the issue): import the samples from the Interaction SDK package and open the "Gesture Examples" scene, where controller-driven hands should be enabled by default. Does that scene work correctly, or do you still see the same issues as in other scenes?
08-25-2024 07:18 PM - edited 08-26-2024 10:13 AM
I tried Gesture Examples and it has the same problem: it does not show the controller-driven hands in Unity play mode. It only shows them when making a build and running it on the device.
The deprecated OVRControllerHands pulled controller input, converted it into hand input, and had the synthetic hand read from that, and that worked in Unity play mode. The new OVRControllerDrivenHands appears to tap into deeper functionality outside the SDK: you set the Controller Driven Hand Poses property on the OVRManager, and on initialization that calls into a native DLL to enable and configure the feature. For whatever reason, that process does not appear to work when running from Unity over Link.
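For reference, the OVRManager side of this can be set from script as well as in the inspector. A minimal sketch (the controllerDrivenHandPosesType field and its ControllerDrivenHandPosesType enum are what I see on my version of OVRManager; verify the names against your SDK version):

```csharp
using UnityEngine;

// Minimal sketch: request controller-driven hand poses at startup.
// Assumes OVRManager exposes controllerDrivenHandPosesType
// (present in recent Meta XR SDK versions; verify against yours).
public class EnableControllerDrivenHands : MonoBehaviour
{
    void Start()
    {
        var manager = FindObjectOfType<OVRManager>();
        if (manager == null)
        {
            Debug.LogWarning("No OVRManager found in the scene.");
            return;
        }

        // Drive the rendered hand models from controller input.
        manager.controllerDrivenHandPosesType =
            OVRManager.ControllerDrivenHandPosesType.ConformingToController;

        Debug.Log($"Controller-driven hand poses: {manager.controllerDrivenHandPosesType}");
    }
}
```

Logging the value at runtime at least confirms the Unity-side setting is applied; the failure we're seeing would then be in the native layer it hands off to.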
08-26-2024 10:46 AM - edited 08-26-2024 10:48 AM
Do any of the settings described here differ between my setup and yours? If not, we may have to dig deeper to figure out why it behaves differently on my system than on yours. Thanks again for your patience in helping us track this down!