Custom hands following Touch controllers

Glightgames
Protege
Hey everyone,

How are people making their own hand models appear where the Touch controllers are?
I'm unable to find any example scripts or scenes for Touch within Unity, so I've just given it a go after looking at the documentation.

I was hoping something like this would work:
m_LeftHand.transform.position = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);

It sort of works, but the hands appear in a fixed area that isn't relative to the Rift headset: I'm standing at one end of the room and my hands are at the other end.

If you could shed light on manipulating hands following the touch controllers, I'd appreciate it.
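For what it's worth, `OVRInput.GetLocalControllerPosition` returns a pose in tracking space, so assigning it to the world-space `transform.position` will land the hands wherever tracking space happens to sit. A minimal sketch of one likely fix, assuming the hand object is a child of the rig's TrackingSpace transform, is to apply the pose as a *local* transform instead:

```csharp
using UnityEngine;

// Sketch: attach to a hand model parented under the rig's TrackingSpace.
// The tracking-space pose is applied locally, so the hand stays relative
// to the headset rather than to the world origin.
public class FollowTouchController : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.LTouch;

    void Update()
    {
        transform.localPosition = OVRInput.GetLocalControllerPosition(controller);
        transform.localRotation = OVRInput.GetLocalControllerRotation(controller);
    }
}
```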


Glightgames
Protege
Following this up, I've now come across this demo:
http://puu.sh/q9Lnm/6aa7ae9cbe.jpg

In an attempt to tear it apart and create our own hands, I can see animations and layers for grabbing, pointing and thumbs-up.

Am I right in thinking that the demo uses these layers to switch between the animations and isn't referencing the skeleton or IK poses in any way?

Any feedback on this would be great. I feel like I'm getting somewhere now, but I have lots of questions and it's very hard to find such simple answers. :neutral:

SkinnyJeanAssas
Honored Guest
Was really hoping this would have a reply by now.  :'(

MikeF
Trustee
The easiest way to have custom hands is to sync their position with the left/right hand anchors under the OVRCameraRig supplied with the Utilities pack.

All the animations are controlled through layers via the hands' animation controller. Capacitive inputs are either 1 or 0, so fingerpoint and thumbs-up are just additive layers. Triggers are analog and have a float range of 0 to 1; the grip trigger blends the grip animation layer from open palm to fist over that range.

As far as I've seen there's no direct reference to the skeleton, and definitely no IK, as it wouldn't apply here. If you open up the .fbx in a 3D package you should be able to see the animations. So if you want a custom hand, just create a rig similar to the one supplied with the examples, set up your own animations, then create a controller that behaves in a similar way to the examples.
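The layer-blending scheme Mike describes can be sketched roughly like this. The `OVRInput` calls are real API; the layer names ("Point Layer", "Thumb Layer", "Flex Layer") are assumptions standing in for whatever the animation controller actually defines:

```csharp
using UnityEngine;

// Sketch: drive additive animation layers from Touch input each frame.
// Capacitive touches act as binary switches; the grip trigger's analog
// value blends open palm -> fist continuously.
public class HandAnimDriver : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.LTouch;
    private Animator anim;
    private int pointLayer, thumbLayer, gripLayer;

    void Start()
    {
        anim = GetComponent<Animator>();
        // Hypothetical layer names; match them to your own controller.
        pointLayer = anim.GetLayerIndex("Point Layer");
        thumbLayer = anim.GetLayerIndex("Thumb Layer");
        gripLayer  = anim.GetLayerIndex("Flex Layer");
    }

    void Update()
    {
        // Finger lifted off the trigger/thumb rest = point / thumbs-up pose.
        bool pointing = !OVRInput.Get(OVRInput.Touch.PrimaryIndexTrigger, controller);
        bool thumbsUp = !OVRInput.Get(OVRInput.Touch.PrimaryThumbRest, controller);
        anim.SetLayerWeight(pointLayer, pointing ? 1f : 0f);
        anim.SetLayerWeight(thumbLayer, thumbsUp ? 1f : 0f);

        // Grip trigger is analog (0..1) and blends the grip layer over that range.
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        anim.SetLayerWeight(gripLayer, grip);
    }
}
```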

Glightgames
Protege
Thank you so much for the very detailed explanation, Mike. I've been looking over the hands and can see everything you've explained, so our hands are being created now.

Look forward to showing you what we create! 🙂

MikeF
Trustee
No worries, let me know if you get lost. Glad to help

cybereality
Grand Champion
The headset position and controller positions should be in the same coordinate space. However, if those objects are children of different parents, things can get confusing. Try making sure both the hand models and the camera are children of the same object.
AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
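One way to guarantee the shared coordinate space cybereality describes is to parent the hand model directly under the rig's tracked anchor. A minimal sketch, assuming the standard OVRCameraRig prefab hierarchy from the Utilities pack:

```csharp
using UnityEngine;

// Sketch: put the hand model and the camera in one coordinate space by
// parenting the hand under the rig's tracked anchor.
public class ParentHandToAnchor : MonoBehaviour
{
    void Start()
    {
        // Path follows the OVRCameraRig prefab hierarchy.
        Transform anchor = GameObject.Find(
            "OVRCameraRig/TrackingSpace/LeftHandAnchor").transform;
        // false = keep the local pose, so the model sits exactly on the anchor
        transform.SetParent(anchor, false);
    }
}
```

With this setup no per-frame position script is needed at all; the anchor already receives the tracked controller pose.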

Glightgames
Protege
Thanks, Cyber. Regarding scene startup, I've noticed the Rift picks the camera height based on my actual height in real life.

Is there a way to disable this, or to define how tall the character is in-game?

Thanks.

cybereality
Grand Champion
Try changing "Tracking Origin Type" of OVRManager to "Eye Level" or "Floor Level". Eye level is mostly for seated games and Floor level is mostly for standing games.
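The same setting can also be made from code. A small sketch, assuming an OVRManager component exists in the scene (it ships on OVRCameraRig):

```csharp
using UnityEngine;

// Sketch: choose the tracking origin at startup instead of in the inspector.
public class SetTrackingOrigin : MonoBehaviour
{
    void Start()
    {
        // EyeLevel for seated play; FloorLevel for standing play.
        OVRManager.instance.trackingOriginType =
            OVRManager.TrackingOrigin.FloorLevel;
    }
}
```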

pjenness
Rising Star


Try changing "Tracking Origin Type" of OVRManager to "Eye Level" or "Floor Level". Eye level is mostly for seated games and Floor level is mostly for standing games.


Is Unity/OVR happy to switch this toggle in-game? That way, if you play seated, or want to sit down after standing, you can.
My experience combines seated and walking/standing sections, so switching in-game could be useful. Would it require a view reset each time?

Cheers

-P
Drift VFX | Visual, Virtual, Vertical | Want 970GTX on Macbook for good FPS? https://forums.oculus.com/viewtopic.php?f=26&t=17349