How to enable Hand Tracking and Controllers at the same time for a Unity Project

SriAmin
Explorer

I was playing around with the Oculus Integration Package for Unity, along with Hand Tracking. I was wondering if it is possible to have both Hand Tracking and the Controllers enabled during the execution of the project.

I was thinking of using hand tracking to let the player grab objects around them, and I wanted to use the controller as well: more specifically, I want its location translated in real time into the Unity environment, so that I, as the player, can walk up and pick up the object that represents my controller. Essentially, I want to use the location sensors in the controller and utilize those values within my project, while also using hand tracking.

I know that when I switch to hand tracking the controllers turn off, and vice versa, so I'm a little confused about whether it's possible. I talked to Oculus Support staff and they said it is possible, but I was wondering if anyone here has tried it and found a solution.

3 REPLIES

buymeacoffee
Honored Guest

Did you try changing the Hand Tracking Support in the OVRManager script to Controllers at runtime? Looks like it is an enum. 
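Worth noting: in the Oculus Integration, that Hand Tracking Support option is stored on OVRProjectConfig (OVRManager's inspector just writes through to it), and it is a build-time setting rather than something you flip at runtime. Setting it to Controllers And Hands is what allows the built app to accept either input. A minimal editor-side sketch, assuming the integration's OVRProjectConfig API (the menu path here is made up):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Editor-only sketch: set Hand Tracking Support to "Controllers And Hands"
// so the built app can receive either input (still one at a time on device).
public static class HandTrackingConfigMenu
{
    [MenuItem("Tools/Enable Controllers And Hands")] // hypothetical menu path
    public static void EnableControllersAndHands()
    {
        var config = OVRProjectConfig.GetProjectConfig();
        config.handTrackingSupport = OVRProjectConfig.HandTrackingSupport.ControllersAndHands;
        OVRProjectConfig.CommitProjectConfig(config);
        Debug.Log("Hand Tracking Support set to Controllers And Hands.");
    }
}
#endif
```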

Kendel8
Explorer

The Quest can operate in controller mode or hand-tracking mode, but not both at the same time. What you want to do should be achievable by placing controller models in the world at the last known location when tracking switches to hands. You'll have to keep track of real-world vs. VR position/orientation changes as the player moves, performs locomotion, re-centers, etc., and update the in-world controller placeholders accordingly.
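A rough sketch of that caching idea, assuming the Oculus Integration's OVRInput pose queries. The single right controller and the field names (trackingSpace, placeholder) are illustrative, and re-centering is only handled as a comment:

```csharp
using UnityEngine;

// Sketch: cache the right controller's tracking-space pose while controllers
// are active, then park a placeholder model at that pose once the headset
// switches to hand tracking.
public class ControllerPlaceholder : MonoBehaviour
{
    public Transform trackingSpace; // the OVRCameraRig's TrackingSpace
    public Transform placeholder;   // in-world controller model

    private Vector3 lastLocalPos;
    private Quaternion lastLocalRot;

    void Update()
    {
        bool usingHands = OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        if (!usingHands)
        {
            // Controller poses are reported in tracking space.
            lastLocalPos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
            lastLocalRot = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);
            placeholder.gameObject.SetActive(false);
        }
        else
        {
            // Re-applying the tracking-space transform each frame keeps the
            // placeholder at the controller's real-world spot through
            // locomotion. A re-center invalidates the cached pose, so you'd
            // want to listen for that and hide or refresh the placeholder.
            placeholder.SetPositionAndRotation(
                trackingSpace.TransformPoint(lastLocalPos),
                trackingSpace.rotation * lastLocalRot);
            placeholder.gameObject.SetActive(true);
        }
    }
}
```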

SriAmin
Explorer

I was somewhat able to switch between the controls. Essentially, I have an OVRHand prefab and an OVRController prefab; I start with the OVRHand disabled, and when I want the user to switch controls, I disable the OVRController prefab and enable the OVRHand prefab, which switches the controls over.
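A minimal sketch of that toggle, with hypothetical field names for the hand and controller objects. It can also be made automatic by checking which input the headset currently reports as active:

```csharp
using UnityEngine;

// Sketch of the toggling approach described above. "handObjects" and
// "controllerObjects" are hypothetical references to the OVRHand and
// controller prefab instances under the camera rig's hand anchors.
public class InputSwitcher : MonoBehaviour
{
    public GameObject[] handObjects;
    public GameObject[] controllerObjects;

    void Update()
    {
        // The headset reports hands as the active controller while hand
        // tracking is running, so the swap can happen automatically.
        bool handsActive = OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        foreach (var h in handObjects)
            if (h.activeSelf != handsActive) h.SetActive(handsActive);
        foreach (var c in controllerObjects)
            if (c.activeSelf == handsActive) c.SetActive(!handsActive);
    }
}
```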