
Proper way to replicate Oculus Quest hand tracking

SpicyDev
Honored Guest

Hello,

I'm on Unreal 4.26 (Launcher version) and I'm trying to replicate hand tracking for a multiplayer experience. My pawn has a body and two motion controllers, which are being replicated nicely. Then I added the Oculus hand components for the left and right hand, and they work only for my locally controlled pawn. The other players can't see my gestures; the only thing being replicated is the hands' position, not the animations/gestures. Checking "Component Replicates" didn't help at all. Any ideas?
 
My pawn:

[screenshot of the pawn's component hierarchy]


Mystfit
Protege
I've run into this issue myself and gotten around it in a sort-of roundabout way. The hand components don't replicate joint positions across the network as the OculusHandComponent is designed to only represent a hand attached to your current headset. 

One solution could be as follows (this method will only work with a custom hand mesh, not the built-in one that the OculusHandComponent fetches from the Oculus runtime); a rough C++ sketch follows the list:
  1. The only pawn that should have an OculusHandComponent is the autonomous proxy pawn. Make sure the server and the simulated proxy pawns only have PoseableMeshComponents instead. You'll need to add these components dynamically during BeginPlay.
  2. Using a looping timer, iterate over each joint in the OculusHandComponent and save the rotations into an array of FVector_NetQuantize10 structs. Avoid sending a whole pose every frame, as this will throttle your network.
  3. Replicate the array using your favorite method. I use a 'Run On Server' custom event which then multicasts the array out to all other clients.
  4. Apply the rotations to a PoseableMeshComponent on the other clients. You might have to correct for some wrist rotation offsets that the OculusHandComponent likes to add.
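Here's a minimal, untested C++ sketch of steps 2-4. AHandTrackedPawn, HandSource, RemoteHand and BoneNames are placeholder names, not engine API, and it assumes the local hand can be read through the UPoseableMeshComponent interface (GetBoneRotationByName); RemoteHand is the plain PoseableMeshComponent you spawn in BeginPlay per step 1:

```cpp
// Hypothetical sketch of steps 2-4. AHandTrackedPawn, HandSource, RemoteHand
// and BoneNames are placeholders, not engine API.

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/PoseableMeshComponent.h"
#include "Engine/NetSerialization.h"
#include "TimerManager.h"
#include "HandTrackedPawn.generated.h"

UCLASS()
class AHandTrackedPawn : public APawn
{
    GENERATED_BODY()

public:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Step 2: sample the hand pose on a looping timer (here ~10 Hz)
        // instead of every frame, to keep bandwidth down.
        if (IsLocallyControlled())
        {
            GetWorldTimerManager().SetTimer(
                PoseTimerHandle, this, &AHandTrackedPawn::SampleHandPose,
                0.1f, /*bLoop=*/true);
        }
    }

protected:
    void SampleHandPose()
    {
        TArray<FVector_NetQuantize10> Rotations;
        Rotations.Reserve(BoneNames.Num());
        for (const FName& Bone : BoneNames)
        {
            // Pack each joint rotation (pitch/yaw/roll) into a quantized vector.
            const FRotator R =
                HandSource->GetBoneRotationByName(Bone, EBoneSpaces::ComponentSpace);
            Rotations.Add(FVector_NetQuantize10(R.Pitch, R.Yaw, R.Roll));
        }
        // Step 3: send to the server, which fans the pose out to everyone.
        ServerSendHandPose(Rotations);
    }

    UFUNCTION(Server, Reliable)
    void ServerSendHandPose(const TArray<FVector_NetQuantize10>& Rotations);
    void ServerSendHandPose_Implementation(const TArray<FVector_NetQuantize10>& Rotations)
    {
        MulticastHandPose(Rotations);
    }

    UFUNCTION(NetMulticast, Unreliable)
    void MulticastHandPose(const TArray<FVector_NetQuantize10>& Rotations);
    void MulticastHandPose_Implementation(const TArray<FVector_NetQuantize10>& Rotations)
    {
        // Step 4: everyone except the owning client applies the pose to the
        // plain PoseableMeshComponent added during BeginPlay (step 1).
        if (IsLocallyControlled() || !RemoteHand || Rotations.Num() != BoneNames.Num())
        {
            return;
        }
        for (int32 i = 0; i < BoneNames.Num(); ++i)
        {
            const FVector& V = Rotations[i];
            RemoteHand->SetBoneRotationByName(
                BoneNames[i], FRotator(V.X, V.Y, V.Z), EBoneSpaces::ComponentSpace);
        }
    }

    UPROPERTY() UPoseableMeshComponent* HandSource = nullptr; // local OculusHandComponent
    UPROPERTY() UPoseableMeshComponent* RemoteHand = nullptr; // spawned on server/proxies
    TArray<FName> BoneNames; // joint names shared by both meshes
    FTimerHandle PoseTimerHandle;
};
```

An unreliable multicast at ~10 Hz keeps the bandwidth cost low; a dropped packet just means a proxy holds its last pose until the next sample arrives.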
For my project, I've worked around this a bit differently; a code sketch of the pack/unpack ends follows the list.
  • Local client: I attach my OculusHandComponent on the autonomous pawn, then copy the pose onto a SkeletalMeshComponent using a modified CopyPoseFromMesh in an animation blueprint. I've had to modify the copy node to correct for different joint names and to allow copying poses from PoseableMeshes instead of just SkeletalMeshes. This lets me use animation blueprints to do some fun things with the hand, like using the pose driver to detect gestures, or the physical animation system to drive a ragdoll with my hands.
  • Local client: Use the Snapshot Pose node to capture a FPoseSnapshot struct.
  • Local client: Iterate over the local transforms in the pose snapshot to convert the bone rotations to FVector_NetQuantize10 structs.
  • Local client: Replicate array of vectors to the server.
  • Server: Multicast array of vectors to clients.
  • Remote client: Using a captured pose snapshot taken during BeginPlay, iterate over the bones and apply the received rotations.
  • Remote client: Apply the new pose snapshot to our SkeletalMeshComponent on our simulated proxy pawn. I smoothly blend between each received pose by flip-flopping between the last pose and the new pose using a "Blend poses by int" node in the animation graph.
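For reference, here's a minimal sketch of the pack/unpack ends of that pipeline, reusing the same server-RPC-plus-multicast transport as the first sketch. PackSnapshot and UnpackSnapshot are hypothetical helper names, and it assumes the animation blueprint reads ReceivedPose through a Pose Snapshot node and does the blending ("Blend poses by int") itself:

```cpp
// Hypothetical helpers for the pose-snapshot variant. PackSnapshot and
// UnpackSnapshot are placeholder names; the transport (Server RPC, then
// multicast) is the same as in the first sketch.

#include "Animation/PoseSnapshot.h"
#include "Components/SkeletalMeshComponent.h"
#include "Engine/NetSerialization.h"

// Local client: flatten the snapshot's local-space bone rotations for the wire.
TArray<FVector_NetQuantize10> PackSnapshot(USkeletalMeshComponent* HandMesh)
{
    FPoseSnapshot Snapshot;
    HandMesh->SnapshotPose(Snapshot); // C++ equivalent of the Snapshot Pose node

    TArray<FVector_NetQuantize10> Packed;
    Packed.Reserve(Snapshot.LocalTransforms.Num());
    for (const FTransform& T : Snapshot.LocalTransforms)
    {
        const FRotator R = T.GetRotation().Rotator();
        Packed.Add(FVector_NetQuantize10(R.Pitch, R.Yaw, R.Roll));
    }
    return Packed;
}

// Remote client: start from the reference snapshot captured during BeginPlay,
// overwrite only the rotations, and let the anim graph blend toward the result.
void UnpackSnapshot(const TArray<FVector_NetQuantize10>& Packed,
                    const FPoseSnapshot& ReferencePose,
                    FPoseSnapshot& ReceivedPose)
{
    ReceivedPose = ReferencePose;
    const int32 Num = FMath::Min(Packed.Num(), ReceivedPose.LocalTransforms.Num());
    for (int32 i = 0; i < Num; ++i)
    {
        const FVector& V = Packed[i];
        ReceivedPose.LocalTransforms[i].SetRotation(FQuat(FRotator(V.X, V.Y, V.Z)));
    }
}
```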
A bit later on, I'm going to clean up my code further and look into releasing the hand portion as a plugin to try and simplify this process.

Hey! I'm very interested in having a similar setup.
Have you been able to release any tutorial/demo/plugin, or is there any other way for me to have a look at your code?
Awesome work mate!

Many thanks!

Thank you for sharing.

Before I take a swing at this, maybe you can share your simplified process?