03-03-2022 09:04 AM
I was playing around with the Oculus Integration Package for Unity along with Hand Tracking. I was wondering if it is possible to have both Hand Tracking and the Controllers enabled at the same time while the project is running.
I was thinking of using hand tracking for the player to grab objects around them, while also using the controller's tracked location: I want its position translated into the Unity environment in real time, so that I, as the player, can walk up to and pick up the object that represents my controller. Essentially, I want to read the controller's tracking data and use those values within my project while hand tracking is active.
I know that when I switch to hand tracking the controllers turn off and vice versa, so I'm a little confused about whether it's possible. I talked to an Oculus Support staff member and they said it is possible, but I was wondering if anyone here has tried it and found a solution.
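For anyone trying something similar, here's a minimal sketch of how you might poll a controller's tracked pose through the Oculus Integration's `OVRInput` API and mirror it onto a scene object. This assumes `OVRManager`'s Hand Tracking Support is set to "Controllers and Hands" in the inspector, and that `controllerProxy` is a hypothetical Transform you assign for the in-scene controller object; note that the Quest runtime may pause controller tracking once you set the controllers down and hands take over, so treat this as a starting point rather than a confirmed solution.

```csharp
using UnityEngine;

// Sketch only: requires the Oculus Integration package (OVRInput / OVRManager).
// Assumes OVRManager's Hand Tracking Support is set to "Controllers and Hands".
public class ControllerLocator : MonoBehaviour
{
    // Hypothetical: the scene object that should follow the physical controller.
    public Transform controllerProxy;

    void Update()
    {
        // Poll the right Touch controller's tracked pose each frame.
        // Caveat: the runtime can stop updating a controller's pose once it is
        // set down and hand tracking becomes the active input, so these values
        // may freeze at the last tracked position.
        Vector3 pos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        Quaternion rot = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);

        // Poses are in tracking space, so parent controllerProxy under the
        // OVRCameraRig's TrackingSpace transform for correct placement.
        controllerProxy.localPosition = pos;
        controllerProxy.localRotation = rot;
    }
}
```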