10-16-2023 01:26 PM
Hi,
I have spent the last two years developing for the Quest 2, and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss.
How is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused about how Meta expects developers to take full advantage of their (really good) hand tracking tech when there are constant inconsistencies and endless fumbling around, trying seven different versions of all the little SDKs, components, etc.
Can someone please advise me on how to achieve a simple Unity scene using the standard Oculus Integration, where I can just click "Play" in the editor and get hand tracking working over my Link cable?
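For context, this is roughly the smoke-test script I use to check whether hands are actually being tracked in the editor (a sketch only; `OVRHand` and its `IsTracked`/`HandConfidence` properties come from the standard Oculus Integration, while the script and field names here are my own and the hand references need to be wired up to the hand anchors of an OVRCameraRig):

```csharp
using UnityEngine;

// Minimal hand tracking smoke test: attach to any GameObject in a scene
// that contains an OVRCameraRig with OVRHand components on its hand anchors,
// then press Play with Link connected and watch the Console.
public class HandTrackingCheck : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;
    [SerializeField] private OVRHand rightHand;

    void Update()
    {
        // Over Link, IsTracked stays false when the runtime has silently
        // fallen back to controllers-only input.
        Debug.Log($"Left tracked: {leftHand.IsTracked} ({leftHand.HandConfidence}), " +
                  $"Right tracked: {rightHand.IsTracked} ({rightHand.HandConfidence})");
    }
}
```

In a standalone build this logs `True` with high confidence as soon as the hands are visible; the problem described below is that the same scene logs `False` when run through the editor over Link.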
So far I have gone through five different Unity versions from 2021-2023, even more different Oculus Integration SDK versions (v50-57), and three different headsets (Quest 1, 2 and 3).
Nothing worked. The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity. Three things. First, this option is no longer available in later Oculus Integration versions. Second, selecting it explicitly disables building for the Quest 2 and 3, so you'd have to switch back and forth between the old LibOVR+VRAPI and the newer OpenXR backend just to get hand tracking working in the editor. Really? Third, we as developers cannot reasonably be expected to stick to this legacy API, as none of the newer mixed reality features, like scene understanding and spatial anchors, are supported in it. Hence I want to ask my question one more time: how does Meta expect people to develop for their platform?
Please let me know if you have an answer to this dilemma; I'm grateful for any pointers!
Note: I am explicitly talking about hand tracking through the Unity Editor using Link. In standalone Android builds it works fine and is amazing to use!
Best,
Max
10-16-2023 01:43 PM
Note: This is not a new problem; it's documented across many forum posts both here and on the Unity forums. I have also found fully-fledged projects that warn of this exact error, e.g. https://github.com/kuff/medslr#how-to-launch-the-project.
10-16-2023 01:49 PM
Further information: The legacy backend of OVRPlugin was removed in v51, which is stated here: https://developer.oculus.com/downloads/package/unity-integration/50.0 under "OVRPlugin".
10-17-2023 09:21 AM
Some more information. I collected the headset logs through MQDH when running the same scene with OVRPlugin set to use OpenXR (hand tracking doesn't work) and the legacy LibOVR API (hand tracking works). I have attached the screenshots here. Notice how, when using OpenXR, ClientInputSettings emits the message "HANDTRACKING: enabled input settings, setting hand mode control settings to ControllersOnly". I believe this might be the underlying issue. Note that I have set my OVR integration in Unity to "Hands Only", so Unity has nothing to do with that message. It seems that when using the OpenXR backend, some call somewhere explicitly sets the control mode to "ControllersOnly", which then disables hand tracking. Please let me know if you require further information.
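In case it helps anyone compare the PC side against the headset logs: a small sketch that periodically logs whether the plugin reports hand tracking as enabled (assuming `OVRPlugin.GetHandTrackingEnabled()` is available in your Oculus Integration version; the script name is mine):

```csharp
using UnityEngine;

// Logs the plugin-level hand tracking state once per second, so it can be
// lined up against the ClientInputSettings messages captured via MQDH.
public class HandTrackingStateLogger : MonoBehaviour
{
    void Start() => InvokeRepeating(nameof(LogState), 0f, 1f);

    void LogState()
    {
        Debug.Log($"OVRPlugin hand tracking enabled: {OVRPlugin.GetHandTrackingEnabled()}");
    }
}
```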
OpenXR (hand tracking doesn't work)
Legacy LibOVR (hand tracking works)
10-17-2023 05:54 PM
I found a solution. It was my own fault for not reading the guides properly: hand tracking over Link requires Windows 10. I did a fresh install of Windows 10 on my machine and it now works, which is great! @meta, please make this a more prominent note in your guides so that people aren't left scratching their heads. Also, as of September 2023 Windows 11 has a 23% market share and more and more people are using it, so it would be great to see support for it from your side.
Best,
Max
10-18-2023 03:35 PM
Thanks for posting this. I have two PCs, both on Windows 11, and I'm going to try reverting/formatting my home PC to see if this works. I've been seriously banging my head against the wall not having hands working.
10-19-2023 05:54 AM
Hi maxgraf96,
Thanks for providing the solution, but could you please say more about which guides you mean?
Another question: did you actually reinstall Windows 10 on your computer?
Thanks for answering!
10-19-2023 08:17 AM - edited 10-19-2023 08:18 AM
Hi shihyuma,
I'm talking about this page (https://developer.oculus.com/documentation/unity/book-unity-gsg/?locale=en_GB), which lists Windows 10 as a requirement, but to a hasty reader that could also read as "Windows 10 or higher". Imo it would be better to explicitly warn people that Windows 11 does not currently support certain features like hand tracking via Link, especially considering that many new consumer PCs and laptops now ship with Windows 11 by default.
Best,
Max
Edit: Yes, I did a fresh install of Windows 10 on my machine to check if it would work.
10-19-2023 08:34 AM
Thanks so much, trying this now.
10-23-2023 08:27 AM - edited 10-23-2023 08:53 AM
So, if I want proper Link hand tracking, I either have to downgrade to Windows 10 or dual-boot it.
I can do the latter more easily, but even then, the effort isn't trivial.
Perhaps a handful of developers can brush it off; yet I can't help but think about how counterintuitive this solution sounds to me. Maybe Meta should fix the SDK to be more compatible with Windows 11?
(Maybe they should fix the entire Oculus desktop experience.)