
SOLVED: Hand Tracking not working in Unity Editor or Windows PC Build

wlewisJHU
Honored Guest

EDIT: This was solved in the v62 update! https://communityforums.atmeta.com/t5/Announcements/Meta-Quest-build-62-0-release-notes/ba-p/1145169 

I have been attempting to use hand tracking over either Air Link or Quest Link in a Windows PC build. After setting up a project and playing a sample scene, the tracked hands are not visible. Hand tracking works on the device in the Quest OS and during Quest Link itself. When the Unity app is running and my palms face the headset, the Oculus menu buttons are visible, but the hand mesh is not.

 

Steps to reproduce:

  1. Create a new Unity project.
  2. Install Oculus Integration and XR Plugin Management (select Oculus as the provider).
  3. Open any scene that supports hand tracking (I am mainly interested in the Interaction SDK).
  4. Hands will not be visible. Depending on the scene, the hand mesh can be seen stuck at its initial position.

 

Tested on multiple computers (Windows 10 & 11) and multiple devices (Quest 2 & Pro). Both Quest devices are on v47. I have tested this with Oculus Integration v46 and v47.
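For anyone trying to narrow this down: a quick way to check whether any hand data is actually arriving over Link is to log the tracking state every frame. This is just a debugging sketch built on the Oculus Integration's `OVRHand` component (property names are from the v46/v47 Integration; adjust for your version):

```csharp
using UnityEngine;

// Attach to any GameObject in the scene; drag the left/right OVRHand
// components from your OVRCameraRig into these fields in the Inspector.
public class HandTrackingDebug : MonoBehaviour
{
    public OVRHand leftHand;
    public OVRHand rightHand;

    void Update()
    {
        // IsTracked reports whether the runtime is delivering hand poses;
        // HandConfidence reports how reliable that data currently is.
        Debug.Log($"L tracked={leftHand.IsTracked} conf={leftHand.HandConfidence} | " +
                  $"R tracked={rightHand.IsTracked} conf={rightHand.HandConfidence}");
    }
}
```

If `IsTracked` stays false in the editor but is true in a standalone build of the same scene, the problem is on the Link/backend side rather than in the scene setup.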

15 REPLIES

roydmagnuson
Protege

Same! Any luck on this? 

meek0ception
Explorer

Hi,

The Oculus menu button being visible is not an indication of hand tracking working inside your Unity app; it's a system-wide feature your device provides on its own.

The OpenXR backend is known to cause issues with hand tracking, so try the legacy backend and see whether it works then. Head over to the Oculus menu: Oculus > Tools > OVR Utilities Plugin > Set OVRPlugin to Legacy LibOVR+VRAPI.

You may also want to make sure that hand tracking support is enabled in the OVRManager of your OVR rig:

OVRManager component > Quest Features > set Hand Tracking Support to Controllers and Hands.
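If you'd rather set that option from code, the same project setting can be applied with a small editor script. This is a sketch assuming the `OVRProjectConfig` API that ships with recent Oculus Integration versions (the menu item name is my own invention):

```csharp
using UnityEngine;
using UnityEditor;

// Editor-only utility: applies the same Quest Features setting the
// OVRManager Inspector UI persists, via OVRProjectConfig.
public static class HandTrackingSetup
{
    [MenuItem("Tools/Set Hand Tracking Support To Controllers And Hands")]
    public static void Apply()
    {
        var config = OVRProjectConfig.GetProjectConfig();
        config.handTrackingSupport = OVRProjectConfig.HandTrackingSupport.ControllersAndHands;
        OVRProjectConfig.CommitProjectConfig(config);
        Debug.Log("Hand Tracking Support is now Controllers And Hands.");
    }
}
```

Place it in an `Editor` folder so it only compiles in the editor.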

Let me know if this helps. Good luck!

AWFlat
Honored Guest

As far as I know, hand tracking is only supported on standalone builds as of now, unless something has changed very recently.

I had the same problem as everyone here. Going back to the legacy backend worked. Not ideal, but at least I can see my hands again!

MakamiCollege
Honored Guest

Same here! Reverting to the legacy backend worked, but it's certainly not an ideal solution. It really sucks that when you google the issue, a ton of tutorial videos from 2020/2021 show up that clearly show hand tracking working in the Unity editor over Oculus Link, yet it doesn't work at all with the OpenXR backend enabled.

The hand-tracking documentation even explicitly states that "We support the use of hand tracking on Windows through the Unity editor, when using Meta Quest headset and Meta Quest Link."

Set Up Hand Tracking | Oculus Developers

keli95566
Explorer

This issue is still there with SDK v53.2

And in this SDK version, there is no way to revert to the legacy OVR plugin...

Does anyone have a solution, please? It is a real struggle to develop a PCVR app.

Same here. I recreated the project, reinstalled Unity, and reinstalled the Oculus app, Visual Studio, and its dependencies, but no luck. No error message whatsoever, and the hands just don't show.

Finally I switched to another PC and it started to work again. The fact that I don't know the reason makes me fear it might stop working again at any time. Truly unstable; as a developer, I can't say I trust this platform.

Fahruz
Expert Protege

Same for me. I can see the hands when building and running the app on the headset, but not when running it from the Unity editor via Quest Link.

sonostrano
Honored Guest

After banging my head against it for months, I almost accidentally found a solution that works (at least for me):
You need to assign OVRHands to a Layer in the Inspector.
Hope that helps!
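In case anyone wants to apply this fix from a script rather than clicking through the Inspector, here is a minimal sketch. The layer name and the field wiring are placeholders (point the fields at your left/right hand objects, e.g. the OVRHandPrefab instances under the hand anchors), and the layer you pick must be one your camera actually renders:

```csharp
using UnityEngine;

// Moves the hand objects (and all their children) onto a named layer
// at startup, mirroring the manual Inspector fix described above.
public class AssignHandLayer : MonoBehaviour
{
    public GameObject leftHand;   // e.g. left OVRHandPrefab instance
    public GameObject rightHand;  // e.g. right OVRHandPrefab instance
    public string layerName = "Default";  // placeholder; any camera-rendered layer

    void Start()
    {
        int layer = LayerMask.NameToLayer(layerName);
        if (layer < 0)
        {
            Debug.LogError($"Layer '{layerName}' does not exist in this project.");
            return;
        }
        SetLayerRecursively(leftHand, layer);
        SetLayerRecursively(rightHand, layer);
    }

    static void SetLayerRecursively(GameObject go, int layer)
    {
        go.layer = layer;
        foreach (Transform child in go.transform)
            SetLayerRecursively(child.gameObject, layer);
    }
}
```

The likely explanation for why this fixes the symptom: a mesh on a layer the camera doesn't render is silently culled, which matches "hands invisible, no errors".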