Forum Discussion

joseph99999
Honored Guest
27 days ago

Game design for post-stroke patients: need help hacking the Interaction SDK

Hi everyone, I'm currently working on a Unity game designed for post-stroke patients with hemiplegia undergoing motor rehabilitation. I want to use Hand Tracking to let players grab different objects.

Problem is: the impaired hand often has very reduced finger mobility, so a grabbing motion is out of the question. So I would like to know how to:

1- Trigger fake hand motions to simulate grabbing on one hand with a simple event (and ignore actual hand motion except for position)

2- Even better: simulate a grab from a different object that I could control independently for animations, etc. (I also need this because eventually the game will have to work with a motorized rehabilitation glove that will be tracked via Passthrough, since default hand tracking doesn't track it at all.)

I'm currently using the basic interaction building block and the sample-scene prefabs for the objects (chess piece prefab: Touch Hand Interactable + Grabbable components).
If you have any leads on how to approach this problem, I'll be very grateful!

Joseph, student gamedev

3 Replies

  • Howdy joseph99999!

    That's an interesting concept, and we can definitely help you get started with implementing that functionality in your app. Based on what you said, we can break this out into two concepts:

    Concept 1: Programmatic Grab with SyntheticHand (using hand tracking).

    Concept 2: Custom Hand Data Source (with motorized glove).

    For concept 1, you could try using the SyntheticHand component in the ISDK rig, creating a custom hand pose, and calling HandGrabInteractor.ForceSelect from your event source. To release the pose, call HandGrabInteractor.ForceRelease. In this concept, the SyntheticHand overrides the finger joint rotations to provide visual feedback. A custom hand pose probably suits your first idea ("Trigger fake hand motions") best, since you mentioned the users would have reduced hand mobility: you could author a pose that works well for them based on their mobility levels.
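    To make concept 1 concrete, here's a minimal sketch of an event-driven grab. ForceSelect and ForceRelease are the ISDK calls mentioned above, but the class name, serialized field names, and the idea of wiring the public methods to a UnityEvent are all illustrative assumptions, not the SDK's prescribed pattern:

    ```csharp
    using Oculus.Interaction;
    using Oculus.Interaction.HandGrab;
    using UnityEngine;

    // Hypothetical trigger script: drives a grab/release from any external
    // event (UI button, therapist input, glove signal) instead of relying
    // on the patient's real finger motion.
    public class EventDrivenGrab : MonoBehaviour
    {
        [SerializeField] private HandGrabInteractor _handGrabInteractor;
        [SerializeField] private HandGrabInteractable _targetInteractable;

        // Hook this up to your event source (e.g., a UnityEvent or UI button).
        public void TriggerGrab()
        {
            // Forces the interactor to select the interactable, which also
            // drives the SyntheticHand into the authored grab pose.
            _handGrabInteractor.ForceSelect(_targetInteractable);
        }

        // Releases the object and lets the hand return to its tracked state.
        public void TriggerRelease()
        {
            _handGrabInteractor.ForceRelease();
        }
    }
    ```

    Since hand position is still tracked normally, only the select/release decision is overridden here; the object follows the impaired hand while grabbed.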

    For concept 2, you could create a custom class that implements the IHand interface. This lets you feed in position data from the Passthrough marker tracking on the glove, control the finger states programmatically based on glove sensor data, and assign your custom hand to the HandGrabInteractor instead of the default tracked hand. It's important to note that the HandGrabInteractor works against the IHand interface, not a specific implementation, meaning you can swap in your own hand data source.
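    The following sketch shows the two pieces of data such a glove-driven source has to supply. Note this is deliberately NOT a full IHand implementation: IHand has additional members (joint poses, pinch queries, data versioning; see the Hand Representation docs), and everything here, from the class name to the 0.6 grab threshold, is an assumption for illustration:

    ```csharp
    using UnityEngine;

    // Hypothetical glove-driven hand data provider. Wrap or extend this
    // into a real IHand implementation once you've reviewed the full
    // interface in the Hand Representation documentation.
    public class GloveHandSource : MonoBehaviour
    {
        // Transform placed by your Passthrough marker tracking.
        [SerializeField] private Transform _gloveMarkerTransform;

        // Normalized 0..1 flexion values read from the glove's sensors.
        // How you obtain these depends entirely on your glove hardware.
        private readonly float[] _fingerFlexion = new float[5];

        // Equivalent of IHand.GetRootPose: where the hand is in space.
        public bool TryGetRootPose(out Pose pose)
        {
            if (_gloveMarkerTransform == null)
            {
                pose = Pose.identity;
                return false;
            }
            pose = new Pose(_gloveMarkerTransform.position,
                            _gloveMarkerTransform.rotation);
            return true;
        }

        // Derive grab intent from glove sensors instead of tracked joints.
        public bool IsGrabbing()
        {
            float sum = 0f;
            foreach (float f in _fingerFlexion) sum += f;
            return sum / _fingerFlexion.Length > 0.6f; // tunable threshold
        }

        // Called by whatever reads your glove's sensor stream.
        public void SetFingerFlexion(int finger, float value)
        {
            _fingerFlexion[finger] = Mathf.Clamp01(value);
        }
    }
    ```

    The key design point is the one from the reply above: because the interactor only consumes the interface, the rest of the interaction stack doesn't need to know the data comes from a glove rather than camera-based hand tracking.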

    I'll link some of our documentation below that should help you get started. I've separated the links by concept as best I can, though there is some overlap between them. Hopefully this helps get you started on your vision, and if you run into any trouble, please feel free to reply back. I'd be happy to help!

    General:

    Interaction SDK Overview

    Concept 1:

    Hand Grab Interactions

    Create a Hand Grab Pose

    Hand Visual

    Grabbable

    Concept 2:

    Hand Tracking Overview

    Input Data Overview

    Hand Representation

    -G

  • Thank you so much, your answer is super helpful!

    We were wondering about the tracking method for concept 2 (tracking a motorized glove that looks like this). What do you think would be the most reliable approach?

    To be clear, we only want to track position and orientation; the hand pose is not important.

    Is using Passthrough QR code tracking via a library like this one a good option? It feels like detecting movement when the glove's QR code is not perpendicular to the camera may be challenging.

    Another method we considered was an additional tracking device like a Vive tracker, but it's too heavy to be worn on the player's impaired hand. We also thought about dismantling a controller and redistributing its tracking LEDs around the glove to "fake" controller tracking, but that's probably a long shot.

    Finally, we could develop a custom Passthrough AI model to detect the glove and its orientation, similar to what's shown in this video.

    What do you think ?

  • Hey there!

    Great question! You're right that QR tracking would be difficult when the code isn't facing the camera, and you also bring up a good point that the Vive tracker may be too cumbersome for your purposes. Some developers have reported success with a multi-QR-code setup, but this is still an experimental feature and you may run into some challenges.

    Since you mentioned you only need position and orientation, and not necessarily hand pose, you might explore combining different tracking methods. IMU sensors are good at tracking fast orientation changes without any line-of-sight requirement, while visual markers can provide absolute position data whenever they're visible. Some projects combine the two approaches to get the benefits of both.
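    A very simple version of that combination is a complementary filter: integrate the IMU every frame, and gently correct toward the marker's absolute pose whenever the marker is detected. This is a generic fusion sketch, not an SDK feature; the class name, the correction weight, and the method signatures are all assumptions:

    ```csharp
    using UnityEngine;

    // Illustrative complementary-filter fusion: the IMU supplies fast,
    // drift-prone orientation updates; the visual marker supplies
    // absolute (but intermittent) position and orientation.
    public class GlovePoseFusion : MonoBehaviour
    {
        [Range(0f, 1f)]
        [SerializeField] private float _markerCorrectionWeight = 0.05f;

        private Quaternion _orientation = Quaternion.identity;
        private Vector3 _position = Vector3.zero;

        // Call every frame with the IMU's angular velocity
        // (radians/sec, in the sensor's local frame).
        public void IntegrateImu(Vector3 angularVelocity, float dt)
        {
            Vector3 deltaDegrees = angularVelocity * Mathf.Rad2Deg * dt;
            _orientation *= Quaternion.Euler(deltaDegrees);
        }

        // Call whenever the marker is detected in Passthrough.
        public void CorrectWithMarker(Pose markerPose)
        {
            // Position comes only from the marker; double-integrating
            // IMU acceleration drifts far too quickly to be useful here.
            _position = markerPose.position;

            // Blend orientation gently toward the marker's absolute value
            // so small detection jitter doesn't cause visible snapping.
            _orientation = Quaternion.Slerp(
                _orientation, markerPose.rotation, _markerCorrectionWeight);
        }

        // Last fused pose; position holds its last known value while
        // the marker is out of view.
        public Pose CurrentPose => new Pose(_position, _orientation);
    }
    ```

    You'd feed CurrentPose into whatever hand data source you build for concept 2; the marker-correction weight trades off smoothness against how quickly IMU drift is cancelled.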

    I hope I was able to give you some ideas for implementing your vision, and please feel free to let me know if you need any more help!

    -G
