10-16-2023 03:35 PM
Hi,
I have spent the last two years developing for the Quest 2, and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss.
How is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused as to how Meta expects developers to take full advantage of their (really good) hand tracking tech when there are constant inconsistencies and you end up fumbling around, trying seven different versions of all the little SDKs, components, etc.
Can someone please advise me on how to set up a simple Unity scene using the standard Oculus Integration, where I can just click "Play" in the editor and get hand tracking working over my Link cable?
So far I have gone through five different Unity versions from 2021-2023, even more different Oculus Integration SDK versions (v50-57), and three different headsets (Quest 1, 2 and 3).
Nothing worked. The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity. Three things:
1. This option is no longer available in the later Oculus Integration versions.
2. Selecting it explicitly disables the possibility of building for the Quest 2 and 3, so you would have to switch back and forth between the old LibOVR+VRAPI backend and the newer OpenXR backend just to get hand tracking working in the editor. Really?
3. We as developers cannot reasonably be expected to stick to this legacy API, as none of the newer mixed reality features, like scene understanding, spatial anchors, etc., are supported in the old version.
Hence I want to ask my question one more time: how does Meta expect people to develop for their platform?
Please let me know if you have an answer to this dilemma, I am grateful for any pointers!
Note: I am explicitly talking about hand tracking through the Unity Editor using Link; in standalone Android builds it works fine and it's amazing to use!
Note 2: This is not a new problem; it's documented across many forum posts, both here and on the Unity forums. I have also found fully fledged projects that warn of this exact error, e.g. https://github.com/kuff/medslr#how-to-launch-the-project.
Note 3 - further information: The legacy backend of OVRPlugin was removed in v51, which is stated here: https://developer.oculus.com/downloads/package/unity-integration/50.0 under "OVRPlugin".
Best,
Max
10-17-2023 09:20 AM
Some more information. I collected the Quest logs through MQDH when running the same scene with the OVRPlugin set to use OpenXR (hand tracking doesn't work) and the legacy LibOVR API (hand tracking works). I have attached the screenshots here. Notice how, when using OpenXR, the ClientInputSettings emits the message "HANDTRACKING: enabled input settings, setting hand mode control settings to ControllersOnly". I believe that this might be the underlying issue. Note that I have set my OVR integration in Unity to "Hands Only", so Unity has nothing to do with that message. It seems that when using the OpenXR backend, some call somewhere explicitly sets the control mode to "ControllersOnly", which then disables hand tracking. Please let me know if you require further information.
OpenXR (hand tracking doesn't work)
LibOVR (hand tracking works)
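Since digging the relevant lines out of a full MQDH or logcat dump is tedious, here is a minimal sketch (Python; the function name and the sample excerpt are my own, modelled on the message quoted above, not a real dump) that filters a saved log file's text for exactly this kind of input-mode message:

```python
# Scan saved headset log text (exported from MQDH, or via `adb logcat -d`)
# for the input-mode messages that show whether hands were forced off.
import re

def find_input_mode_lines(log_text):
    """Return every log line mentioning HANDTRACKING or ControllersOnly."""
    pattern = re.compile(r"HANDTRACKING|ControllersOnly", re.IGNORECASE)
    return [line for line in log_text.splitlines() if pattern.search(line)]

# Illustrative excerpt based on the message quoted above (not a real dump).
sample_log = (
    "10-17 09:01:02 ClientInputSettings: HANDTRACKING: enabled input settings, "
    "setting hand mode control settings to ControllersOnly\n"
    "10-17 09:01:03 VrRuntimeService: frame timing ok\n"
)

for line in find_input_mode_lines(sample_log):
    print(line)
```

If the filtered output contains "ControllersOnly" even though the Unity project is configured for "Hands Only", that matches the symptom described above.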
10-17-2023 05:55 PM
I found a solution. It was my own fault for not reading the guides properly: I had been running Windows 11, which apparently isn't supported. After a fresh install of Windows 10 on my machine, it worked. @meta, please make this a bigger note in your guides so that people aren't left scratching their heads. Also, as of September 2023, Windows 11 has a 23% market share and more and more people are using it; it would be great to see support for it from your side.
I hope that the information I gathered will be useful for someone dealing with this.
Best,
Max
10-22-2023 01:06 PM
Does this mean that Windows 10 does not work?
10-23-2023 04:29 AM
No, the other way around - Windows 11 doesn't work, Windows 10 works.
11-04-2023 11:01 AM
Hand tracking is not working in the UE editor either. Do I have to completely install a different operating system just to get hands working over Link on the PC? Could you link to the exact document where it says Windows 11 won't work? Thanks
01-12-2024 06:54 AM
I need to add to my previous response. After some time, hand tracking in the Unity editor stopped working again for me (under Windows 10). This time it was a completely different issue.
At some point I installed a different piece of software that came with its own OpenXR backend. Apparently installing that software overwrote a key in the registry, under "Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Khronos\OpenXR\1\ApiLayers\Implicit", pointing it to its own OpenXR JSON configuration. So if your hand tracking does not work in the Unity editor for the most obscure reason and you have really checked everything else, make sure that in the Windows registry, under that same path, the entries still point to the Oculus OpenXR JSON configuration and not to the other software's.
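To audit that key without clicking through regedit, here is a small sketch (Python; the parsing helper and the sample Oculus manifest path are my own assumptions, not taken from an actual machine). It reads the text of a `reg export "HKLM\SOFTWARE\Khronos\OpenXR\1\ApiLayers\Implicit" layers.reg` dump; per the OpenXR loader's conventions, backslashes in .reg files are doubled and a DWORD value of 0 means the layer is enabled:

```python
# List the implicit OpenXR API layers from an exported registry key
# (export first with: reg export "HKLM\SOFTWARE\Khronos\OpenXR\1\ApiLayers\Implicit" layers.reg).
import re

def list_implicit_layers(reg_dump):
    """Return (manifest_path, enabled) pairs from the text of a .reg export.

    In .reg files backslashes are doubled, and a DWORD value of 0 means
    the layer is enabled (any other value disables it).
    """
    layers = []
    for m in re.finditer(r'"(?P<path>[^"]+\.json)"=dword:(?P<val>[0-9a-fA-F]+)',
                         reg_dump):
        path = m.group("path").replace("\\\\", "\\")
        layers.append((path, int(m.group("val"), 16) == 0))
    return layers

# Illustrative export text -- the Oculus manifest path below is a typical
# install location, not necessarily yours.
sample = (
    '[HKEY_LOCAL_MACHINE\\SOFTWARE\\Khronos\\OpenXR\\1\\ApiLayers\\Implicit]\n'
    '"C:\\\\Program Files\\\\Oculus\\\\Support\\\\oculus-runtime\\\\oculus_openxr_64.json"=dword:00000000\n'
    '"C:\\\\SomeOtherSDK\\\\their_layer.json"=dword:00000000\n'
)

for path, enabled in list_implicit_layers(sample):
    print(("enabled: " if enabled else "disabled: ") + path)
```

Any unexpected third-party manifest that shows up as enabled here is a candidate for the conflict described above.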
05-29-2024 12:40 PM
Thank you so much for this 🙏
07-29-2024 03:31 AM
Thank you from the bottom of my heart for your very detailed post with this solution. We struggled with this for a year, and it now turns out that another hand tracking solution had indeed overwritten the Oculus Windows registry key. Changed this variable and, poof, hands are rendered again from the Unity editor. Absolutely amazing!