SOLVED: Hand Tracking not working in Unity Editor or Windows PC Build

Honored Guest

EDIT: This was solved in the v62 update! 

I have been attempting to use Hand Tracking over either Air Link or Quest Link in a Windows PC build. After setting up a project and playing a sample scene, the tracked hands are not visible. Hand Tracking works on the device in the Quest OS and during Quest Link. When the Unity app is running and my palms face the headset, the Oculus menu buttons are visible, but not the hand mesh.


Steps to reproduce:

  1. Create new Unity project.
  2. Install Oculus Integration and XR Plugin Management (select Oculus as Provider)
  3. Open any Hand Tracking supported scene (I am mainly interested in the Interaction SDK).
  4. Hands will not be visible. Depending on the scene, the hand mesh can be seen stuck at its initial position.
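In case it helps anyone narrow this down, here is a minimal diagnostic sketch (assuming the Oculus Integration's OVRHand component is on the hand objects, e.g. OVRHandPrefab) that logs whether hand tracking data is reaching the app over Link at all:

```csharp
using UnityEngine;

// Attach to a GameObject that has an OVRHand component (e.g. OVRHandPrefab).
// Logs the tracking state once per second, so you can tell whether Link is
// delivering hand data to the Editor even when the mesh isn't rendering.
public class HandTrackingProbe : MonoBehaviour
{
    private OVRHand hand;
    private float nextLogTime;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        if (hand == null || Time.time < nextLogTime) return;
        nextLogTime = Time.time + 1f;

        Debug.Log($"IsTracked={hand.IsTracked} " +
                  $"DataValid={hand.IsDataValid} " +
                  $"Confidence={hand.GetTrackingConfidence()}");
    }
}
```

If `IsTracked` stays false while the Quest Link menu responds to pinches, the data is stopping at the runtime rather than in the scene setup.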


Tested on multiple computers (Windows 10 & 11), and multiple devices (Quest 2 & Pro). Both Quest devices are on v47. I have tested this with Oculus Integration v46 and v47.


Honored Guest

I also struggled with the same issue for a while, but using the packages distributed by Unity turned out to be the easiest solution.
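For anyone trying that route, a rough sketch of reading hand state through Unity's own XR Hands package (`com.unity.xr.hands` is my assumption of which Unity-distributed package is meant here) looks like this:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Finds the running XRHandSubsystem (created by the active XR loader)
// and logs whether each hand is tracked on every hand update.
public class XRHandsProbe : MonoBehaviour
{
    private XRHandSubsystem subsystem;

    void Update()
    {
        if (subsystem != null && subsystem.running) return;

        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
        {
            subsystem = subsystems[0];
            subsystem.updatedHands += OnUpdatedHands;
        }
    }

    void OnUpdatedHands(XRHandSubsystem sub,
                        XRHandSubsystem.UpdateSuccessFlags flags,
                        XRHandSubsystem.UpdateType type)
    {
        Debug.Log($"Left tracked: {sub.leftHand.isTracked}, " +
                  $"Right tracked: {sub.rightHand.isTracked}");
    }
}
```

Note that XR Hands needs a provider that supports it (OpenXR with the Hand Tracking Subsystem feature enabled), which is a different path from the Oculus Integration's OVRHand.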

Honored Guest

Hi @drawlewis, these steps might help:

  1. Ensure that "Oculus" is selected as a plug-in provider under Project Settings -> XR Plug-in Management.
  2. From the menu, select Oculus -> Tools -> Project Setup Tool, then select "Apply All" under the recommended settings.
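If those settings look right but hands still don't appear, a quick runtime check (a sketch assuming the XR Plug-in Management package is installed) can confirm which XR loader actually started:

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Logs the active XR loader at startup. If the Oculus provider from
// XR Plug-in Management initialized correctly, you should see its
// loader name; otherwise a warning is logged.
public class ActiveLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
            Debug.LogWarning("No active XR loader - check XR Plug-in Management settings.");
        else
            Debug.Log($"Active XR loader: {manager.activeLoader.name}");
    }
}
```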

Can you let me know if that solved it for you? It seems you're not the only one who reported this. I'd also recommend following all of the steps here:

Hi. Can you provide the step-by-step process for how you applied the XR Hands? I'm having the same issue.

Can you explain more specifically? I can't see my hands when playing in the Unity editor. Please help bro...

Hi @willmanning, thank you for providing your solution here, but unfortunately it didn't work for me...