12-01-2022 08:38 PM
Hi,
I have an Oculus Quest Pro and am working on a Unity project that needs hand tracking and controller tracking for a physical object, but I can't enable hand and controller tracking at the same time. Is this possible? Or is there another way to track a physical object using Oculus hardware?
06-28-2023 09:24 AM
Nope
10-10-2023 06:10 AM
Is this feature now available in the new Quest 3? Does anyone know?
11-28-2023 02:34 AM
Well, the Unity SDK hasn't changed.
02-15-2024 07:56 AM
Hi there! As far as I understand, the current hardware (the cameras) needs to change its exposure mode depending on what it is tracking, which is why simultaneous positional tracking of hands and controllers is not supported (please correct me if I'm wrong).
But is there a way to prevent the Quest from switching to controller tracking on button input in general?
In my setup I want to use a controller as a remote to control what the hand-tracked user is seeing, kind of like a showmaster. I'm wondering whether this is an SDK limitation or whether I'm doing something wrong.
I use Unity, OpenXR, and the Hand Tracking Rig with the XR Input Modality Controller Driver disabled/removed and Input Tracking deactivated. Yet whenever I press a button on my controllers, the hands stop being tracked.
11-02-2024 06:04 PM
https://www.youtube.com/watch?v=EFlEnfOG6FU
I tested this; it works.
(But it doesn't work if you use full body tracking.)
11-03-2024 10:02 AM - edited 11-03-2024 10:13 AM
Not sure if all you need is the Multimodal feature (https://developers.meta.com/horizon/documentation/unity/unity-multimodal/#setup), but I noticed in another thread that the documentation on this hasn't been updated.
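For anyone landing here later, the Multimodal setup linked above roughly comes down to enabling the feature on the OVRManager component and (optionally) toggling it at runtime. A minimal Unity C# sketch, assuming a recent Meta XR Core SDK; the `OVRPlugin` call name follows the linked docs but should be treated as an assumption and checked against your SDK version:

```csharp
using UnityEngine;

// Hedged sketch, not a definitive implementation: turns on
// simultaneous hands + controllers ("Multimodal") tracking at
// runtime. Assumes a recent Meta XR Core SDK where the feature
// is also exposed on the OVRManager component in the Inspector
// ("Launch simultaneous hands and controllers mode on startup").
public class MultimodalToggle : MonoBehaviour
{
    void Start()
    {
        // Assumption: this entry point matches the linked
        // Multimodal docs; verify the exact API name for your
        // SDK version if this does not compile.
        OVRPlugin.SetSimultaneousHandsAndControllersEnabled(true);
    }
}
```

If this works on your SDK version, `OVRHand` components should keep reporting hand poses while controller button input continues to arrive, which is what the "showmaster" setup described above needs.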