
Set up controller driven hands v65

vitaly_zu
Honored Guest

Hello! Can you please tell me how to set up controller-driven hands on Quest 2? Adding the OVRControllerDrivenHands component to OVRInteractionComprehensive did not help. Hands are not displayed in the examples either. I tried versions 65 and 62.

30 REPLIES

maurozen.2024
Explorer

Hello, I'm exploring those functions right now too. These are my suggestions (after days of real desperation with SDK v65):

1) Start a brand-new project in Unity editor 2022.3.29 (3D Built-In Core); do not choose the VR or MR presets

2) Download the Meta XR All-in-One package, but don't install the v65 version; from the version history, choose v64

3) Download the Meta XR Interaction SDK Samples (important: always install the v64 version); if you pin packages through the Unity Package Manager, see the manifest sketch below

4) Now import the samples from the Meta MR Utility Kit or the Meta Interaction SDK samples. Even the first BASE sample has a complete working hands example on Quest 2.

Other combinations (using v65, the XR setup, or starting from the MR or AR presets) have various issues.
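If you manage packages through the Unity Package Manager rather than the Asset Store importer, pinning both packages to v64 in Packages/manifest.json would look roughly like the sketch below. The com.meta.xr.sdk.all and com.meta.xr.sdk.interaction.ovr.samples identifiers are my assumption based on recent Meta releases; verify the exact package names and versions your project actually resolves.

    {
      "dependencies": {
        "com.meta.xr.sdk.all": "64.0.0",
        "com.meta.xr.sdk.interaction.ovr.samples": "64.0.0"
      }
    }

Keep the rest of your manifest's dependencies unchanged; only the Meta entries matter here.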

Good luck

 

mcgeezax4
Protege

I was trying to follow that guide too and it wasn't working for me either, so I just used the deprecated OVRControllerHands instead and that worked.  I hope to switch over to the new OVRControllerDrivenHands once they fix it.

giorgos.ganias
Honored Guest

Hi. In my case the hands are not shown either when I use controllers with OVRControllerDrivenHands. However, when I build for Android, I can see my hands inside the headset. It's pretty frustrating, but it works in the .apk.

mcgeezax4
Protege

Just got v66 of the SDK for Unity and it's still broken.

I understand it works when making a full build, but if I can't use it to debug in play mode, then it's basically unusable for anything other than a quick tech demo or prototype. I wish they would somehow indicate whether they are even aware of the issue, or whether they plan to fix it; otherwise I'd rather cut my losses with the Meta SDK and switch to the XR Interaction Toolkit.

hvox1138
Protege

Hi all! I'm an engineer on the Interaction SDK team. We're sorry to hear this is still an issue for you; we're working to improve our docs so this won't be a problem going forward.

In the meantime, can you confirm that the "Settings > Beta > Developer Runtime Features" setting is enabled in your Meta Quest Link app on your development PC? If not, does enabling it fix the issue for you?

Yes, I have Developer Runtime Features enabled and I still have this problem.

Okay, good, thanks for confirming! Next quick thing to try (hoping to narrow down what the issue might be): if you import the samples from the Interaction SDK Package and open the "Gesture Examples" scene, controller-driven hands should be enabled by default; does that scene work correctly, or do you still experience the issues you've been seeing in other scenes?

I tried Gesture Examples and it has the same problem: it does not show the controller-driven hands in Unity play mode. It only shows them when making a build and running on the device.

The deprecated OVRControllerHands pulled controller input, converted it into hand input, and had the synthetic hand read from that, and it worked in Unity play mode. The new OVRControllerDrivenHands looks like it's tapping into deeper functionality outside the SDK: you set the Controller Driven Hand Poses property on the OVRManager, and on initialization that calls into an external DLL to enable and configure the feature. But for whatever reason that process does not appear to work when running from Unity over Link.
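To narrow down where that chain breaks in play mode, a quick diagnostic component like the sketch below can confirm whether the runtime is delivering hand poses at all. This is a hypothetical helper, not part of the SDK; it only relies on OVRInput.GetActiveController and OVRHand.IsTracked, with the OVRHand references assigned in the Inspector.

    using UnityEngine;

    // Hypothetical diagnostic: attach to any GameObject and assign the OVRHand
    // components from the OVRHandPrefab objects under Left/RightHandAnchor.
    // Logs once per second whether the runtime is delivering hand poses.
    public class ControllerDrivenHandsDebug : MonoBehaviour
    {
        public OVRHand leftHand;
        public OVRHand rightHand;
        private float _nextLog;

        void Update()
        {
            if (Time.time < _nextLog) return;
            _nextLog = Time.time + 1f;

            // Which input device the runtime currently reports as active.
            Debug.Log($"Active controller: {OVRInput.GetActiveController()}");

            // With controller-driven hands working, these should be true even
            // while holding controllers; if the feature never initializes over
            // Link, they presumably stay false.
            Debug.Log($"Hands tracked - left: {leftHand.IsTracked}, right: {rightHand.IsTracked}");
        }
    }

If the hands report as tracked in a device build but not over Link, that points at the feature never being enabled in the PC runtime rather than at the rig setup.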

Okay, thanks again! Sorry if this reply is a bit of an info dump, but to reduce back-and-forth I'm just going to drop a lot of information about what's currently working for me, so that we can hopefully sift through more quickly and figure out where the difference is coming from.
 
This morning I created a completely clean project (Unity 2022.3.9f1), installed Interaction SDK (v68), and was able to see controller-driven hands in Link (v68.0.0.515.361). The steps I followed were:
  1. Create the Unity project
  2. Switch build target to Android
  3. Install Meta XR All-in-One
  4. Install Unity XR support
  5. Enable Oculus XR plugins for both Android and Windows
  6. Run the Meta > Tools > Project Setup Tool and apply all recommended fixes
  7. Install the Interaction SDK Example Scenes
  8. Open the GestureExamples scene
  9. Start Link (with cable) from Quest 3
  10. While holding controllers, press Play in Unity
On my (Windows) machine, these steps resulted in controller-driven hands working over Link. My Quest device is on a pre-release OS build, which I don't think should affect this behavior, but if you suspect that's an issue just let me know your OS version and I'll be happy to test again to be sure.
 
Apart from that, one of my colleagues also put together a short list of things that have fixed controller-driven hands problems in the past. I think these are all already checked or not applicable here, especially in the example scenes, but in the interest of thoroughness here's the full list:
  • In the OVRManager, ensure that the "Controller Driven Hands" setting is set to "Conforming to Controller" or "Natural" (a runtime check for this item is sketched after this list).
  • In Unity's XR Plug-in Management, ensure the Oculus plug-in provider is enabled for both Android and Windows.
  • In PC Link, ensure that "Developer Runtime Features" are enabled.
  • In the OVRHand scripts on the OVRHandPrefab objects nested under "Left/RightHandAnchor," ensure that "Show State" is set to either "Always" or "Controller In Hand or No Hand."
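
For the first item, you can also confirm the setting at runtime with something like the sketch below. The controllerDrivenHandPosesType field name here reflects recent OVRManager.cs versions and may differ in older SDKs, so verify it against the OVRManager source in your project.

    using UnityEngine;

    // Minimal sketch for verifying the first checklist item at runtime. The
    // controllerDrivenHandPosesType field name is assumed from recent
    // OVRManager.cs versions; verify it against your installed SDK source.
    public class OVRManagerSettingsCheck : MonoBehaviour
    {
        void Start()
        {
            var manager = FindObjectOfType<OVRManager>();
            if (manager == null)
            {
                Debug.LogWarning("No OVRManager found in the scene.");
                return;
            }

            // Expect ConformingToController or Natural here, not None.
            Debug.Log($"Controller Driven Hand Poses: {manager.controllerDrivenHandPosesType}");
        }
    }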

Do any of the things described here seem different between my setup and yours? If not, we might have to dig deeper to figure out why it behaves differently on my system than on yours. Thanks again for your patience in helping us track this down!