Any link to a FUNCTIONING example of a Unity project that ACTUALLY switches hands to controllers?

DHARMAKAYA
Protege

Hi, 

 

Can anyone here post a link to a Unity project where the OVRPlayerController (or similar) ACTUALLY PROPERLY switches from controllers (with hands animated on the controllers, like in Oculus Home) to hand tracking and back?

 

I've already tried many different ways to make that work over the past 2 or 3 days.

 

Thanks in advance for posting a link to a project that actually functions properly that way in Unity 2020 LTS (preferably, though 2019 LTS is tolerable too).

2 REPLIES

Anonymous
Not applicable

The "hands on controller" rendering is provided by the Oculus Avatar framework, while the "hand tracking" rendering is provided by the core VR framework. The avatar framework has been in a weird state forever: very few applications use it, and even fewer Quest apps do. That's especially true now that Oculus is transitioning to its new avatar system.

(https://www.theverge.com/2021/4/23/22398060/oculus-new-avatars-editor-features-vr-virtual-reality-fa...). It would take me a while of experimenting to get that working, especially since the avatar framework has no concept of controllers being idle (hidden).

 

What I have gotten to work is using the OVRHand prefabs along with the OVRControllerPrefab; I believe that works out of the box. The controllers appear while you're holding them, and if you put them down for a few seconds the hands appear (assuming your OVRManager is set to "Controllers and Hands" and the Quest is set to auto-switch between hands and controllers in its device settings).
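If you need to drive your own models rather than the prefabs' built-in ones, the auto-switch state is queryable from script. A minimal sketch, assuming the Oculus Integration package's OVRInput API (the `OVRInput.Controller.Hands` value exists in the versions I've used, but check your SDK version); the serialized GameObject fields are placeholders you'd wire up in the Inspector:

```csharp
using UnityEngine;

// Sketch: toggle controller vs. hand visuals based on which input the
// runtime currently reports as active.
public class InputModeSwitcher : MonoBehaviour
{
    [SerializeField] private GameObject leftControllerModel;
    [SerializeField] private GameObject rightControllerModel;
    [SerializeField] private GameObject leftHandModel;
    [SerializeField] private GameObject rightHandModel;

    void Update()
    {
        // OVRInput reports Controller.Hands once the headset has
        // auto-switched to hand tracking (requires "Controllers and Hands"
        // in OVRManager and auto-switching enabled on the device).
        bool handsActive =
            OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        leftHandModel.SetActive(handsActive);
        rightHandModel.SetActive(handsActive);
        leftControllerModel.SetActive(!handsActive);
        rightControllerModel.SetActive(!handsActive);
    }
}
```

Polling in Update is crude but matches how the runtime flips between modes with no event callback in older SDK versions.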

 

If anyone has gotten hand tracking to work in concert with Oculus avatar rendering, it's probably the Microsoft Mixed Reality Toolkit. (https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/mrtk-getting-started)

 

If you want hands to render both while you're holding the controller and while using hand tracking, you could look at alternative implementations like the open-source HandPosing demo: https://github.com/MephestoKhaan/HandPosing_demo

or the Virtual Reality Interaction Framework (https://assetstore.unity.com/packages/templates/systems/vr-interaction-framework-161066)

DHARMAKAYA
Protege

Thanks for replying and for that info. I added a LocalAvatar and got it working with an OVRCameraRig, and I also got the left and right CustomHand prefabs to function when added to an OVRPlayerController for use with hand tracking... but NOT both, with the controllers and hand tracking switching back and forth as they should somehow do.

 

When using the custom hands without the avatar on an OVRPlayerController, I parented each hand to its hand anchor and each controller to its controller anchor... but the hand mesh still shows (static and very incorrectly posed) when a controller is picked up.
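To be clear about the behavior I'm after, it's roughly this per hand (just a sketch, assuming OVRHand's `IsTracked` property from the Oculus Integration package; the serialized fields are placeholders for my custom hand mesh and controller model):

```csharp
using UnityEngine;

// Sketch: hide the static hand mesh as soon as a controller is picked up,
// and hide the controller model while hand tracking is active.
public class HandMeshToggle : MonoBehaviour
{
    [SerializeField] private OVRHand hand;           // OVRHand on the hand anchor
    [SerializeField] private Renderer handRenderer;  // the custom hand's mesh
    [SerializeField] private GameObject controllerModel;

    void Update()
    {
        // OVRHand.IsTracked goes false when the runtime switches back to
        // controllers, so the hand mesh should disappear here instead of
        // lingering on the anchor.
        bool tracked = hand != null && hand.IsTracked;
        handRenderer.enabled = tracked;
        controllerModel.SetActive(!tracked);
    }
}
```

That's the switching I can't get the OVRPlayerController setup to do on its own.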

 

Recommendations? Thanks in advance!