
Need Support: Replicating Full Body Tracking

Aeromixx
Expert Protege

Ok, so Full Body Tracking in Multiplayer is a huge must.

I have been attempting to crack it, and I have, but with a major caveat: it lags hardcore because it floods the network (it's a lot of data).

I will share here how I successfully got it to replicate in the first place, and I am hoping someone can help me drive it home by sharing a way to optimize it, or by offering alternative solutions.

This is all Blueprint, so maybe C++ can help? I also hear of a concept called Delta Compression, where only changes are replicated. I believe Unreal has this by default, but I am unsure, and I don't know whether it's in use here or requires a special setup.
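To illustrate the delta-compression idea in plain C++ (this is a minimal sketch of the general technique, not Unreal's actual delta serialization; all struct and function names are made up): compare each bone against the last state the remote peer saw, and send only the bones that changed beyond a threshold.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// One bone's unquantized transform: position + rotation quaternion.
struct BoneXform {
    float px, py, pz;      // position
    float qx, qy, qz, qw;  // rotation
};

// A single changed bone: its index plus its new transform.
struct BoneDelta {
    uint16_t boneIndex;
    BoneXform xform;
};

static bool Changed(const BoneXform& a, const BoneXform& b, float eps) {
    return std::fabs(a.px - b.px) > eps || std::fabs(a.py - b.py) > eps ||
           std::fabs(a.pz - b.pz) > eps || std::fabs(a.qx - b.qx) > eps ||
           std::fabs(a.qy - b.qy) > eps || std::fabs(a.qz - b.qz) > eps ||
           std::fabs(a.qw - b.qw) > eps;
}

// Returns only the bones that moved since `lastSent`; an idle avatar
// produces an almost-empty delta instead of a full-skeleton snapshot.
std::vector<BoneDelta> MakeDelta(const std::vector<BoneXform>& current,
                                 const std::vector<BoneXform>& lastSent,
                                 float eps = 0.001f) {
    std::vector<BoneDelta> out;
    for (uint16_t i = 0; i < current.size(); ++i)
        if (Changed(current[i], lastSent[i], eps))
            out.push_back({i, current[i]});
    return out;
}
```

The receiver applies each delta on top of its last full pose, so only moving bones cost bandwidth each frame.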

Lastly, I theorize a way to maybe separate the body, hand, and face tracking into separate Live Link poses, so we could individually control whether the hands or face replicate based on distance. I am sure the body replicating alone will lag much less than replicating the hands and face alongside it every frame. Essentially, the fingers and faces of other players don't need to replicate if they are far enough away, if that helps performance.

Still, if a player is up close and everything is active, it would lag as shown here, so an overall optimization is needed. Lastly, this is 2 players. What if we wanted 10? 100? 1000?!
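The distance-based idea above could be sketched as a small LOD selector: given another player's distance, decide which tracking streams to replicate and how often. This is a hypothetical illustration; the thresholds and rates are made-up values, not anything from the MetaXR plugin.

```cpp
#include <cassert>

// Which streams to replicate for one remote player, and at what rate.
struct PoseLOD {
    bool replicateBody;
    bool replicateHands;
    bool replicateFace;
    float updateHz;  // how often to send this player's pose
};

// Pick a replication tier by distance. Close players get everything;
// far players get a slow body-only stream. Thresholds are assumptions.
PoseLOD SelectPoseLOD(float distanceMeters) {
    if (distanceMeters < 3.0f)  return {true, true,  true,  30.0f};  // close: full detail
    if (distanceMeters < 10.0f) return {true, true,  false, 15.0f};  // mid: drop face
    if (distanceMeters < 30.0f) return {true, false, false, 10.0f};  // far: body only
    return {true, false, false, 2.0f};                               // very far: slow body
}
```

With separate Live Link poses per stream, each client would evaluate this per remote player and simply skip sending (or applying) the hands and face streams for the tiers that exclude them.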

Here is my setup so far:

Within the Avatar actor itself:

Aeromixx_0-1713210711231.png

First: my game is multiplayer and uses Linux dedicated servers, as they are faster and cheaper than Windows ones. The MetaXR plugin does not work on Linux, so this step works around crash errors on the Linux build: if the server makes any reference to the plugin, which has no Linux binaries, it throws an exception and crashes.


Within the avatar, as part of the Begin Play events, I check if the game is running on Linux or a dedicated server. If it's not, I spawn a new class I made called BP_BodyTrackingActor. This actor handles everything for body tracking, separated out purely to avoid crashing Linux. Everything within this actor can be done directly in the avatar IF your game is single-player, uses Windows dedicated servers, or Meta eventually adds Linux support to the MetaXR plugin.

Event Begin Play

Aeromixx_1-1713211307365.png

First, I do the same Linux check just in case; you can skip that. Then I get a reference to the owner, which is the Avatar that spawns it. A Delay here often helps. Then I follow the same chain of Blueprints from the UnrealMovementSample provided by Meta, with a fail check at each step: if a step fails, try again until it works. The chain works without this, but for the sake of multiplayer and slower hardware, I do it to ensure no step is ever skipped, to prevent bugs.

1. Take the MetaXR Preset and Apply to Client
2. Call On Feet Ready to be Grounded in the Anim Instance
3. Call a replicated function in the AnimInstance, setting FBSDisabled to 0

Aeromixx_2-1713211669095.png

Set these events to Reliable in the Details panel, and ensure your variables are set to Replicated in the Details panel! (the two little white spheres in the top-right corner)
4. 0.25 Second Delay

5. Request Body Tracking Fidelity to High (with a Fail Check)

6. Stop Body Tracking. I'm not sure why, but this is how the sample works, and it works. (with a Fail Check)

7. Start Body Tracking by Joint Set to Full Body (with a Fail Check) 

8. Call the Open Gate function for the Tick events within the BP_FullBodyTrackingActor (or your Avatar, if single-player etc. as explained above)
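The fail-check pattern used throughout the steps above (if a step fails, try again until it works) can be sketched generically in C++. In Blueprint this is a Branch looping back through a Delay; here it's a plain retry loop, with a made-up bound on attempts so a permanently failing step can't loop forever.

```cpp
#include <cassert>
#include <functional>

// Run `step` until it reports success, up to maxAttempts times.
// Returns true if the step ever succeeded. In a real game you would
// space the attempts out with a delay/timer rather than retry instantly.
bool RetryUntilSuccess(const std::function<bool()>& step, int maxAttempts) {
    for (int attempt = 0; attempt < maxAttempts; ++attempt)
        if (step())
            return true;  // step succeeded; move on to the next node
    return false;         // give up; log an error in-game
}
```

Each of the numbered init steps (apply preset, request fidelity, stop/start tracking) would be one `step` callable.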

Event Tick:

Aeromixx_0-1713212142292.png

1. First you see the Gate, with the OpenGate Function at the end of the BeginPlay events.
2. Linux Check again (optional)
3. Evaluate Live Link Frame with the Body subject selected, set the result as a variable.
4. Get the Hips transform to see if tracking is even active; if so, set the position in the avatar, replicated. The replicated data all lives within the avatar itself, NOT this Blueprint, because this actor does NOT exist on the server. It's purely a helper for the game client to relay the info from the headset to the avatar, which is necessary to bypass the Linux issues.
Event Tick (Cont'd):

Aeromixx_1-1713212364702.png

5. Just like the sample, get the Left and Right Ankle locations and replicate their positions in the avatar as well.
Replication events for the Hips and the Left and Right Ankles:

Aeromixx_2-1713212578108.png
Don't set these to Reliable. Bandwidth-wise, we can afford for these to drop frames, and making them Reliable floods the network that much more.


6. Update Movement Delta is specific to my project: it determines how much the player is moving, to blend animation states between FBT and the rest of my game's animation system. You can mostly ignore it.

7. Meta IK Snapshot Pose within the Avatar. This is where the magic happens!

Within the Avatar (if the AnimInstance is valid; that check is specific to my game), I get the Mesh and use what's called Snapshot Pose. From this node, drag off the Snapshot and set it as a variable.

Then replicate that Snapshot Pose. Make sure your variable is set to Replicated with RepNotify.

Aeromixx_6-1713213655113.png

Within the variable's function (mine is: OnRep_Active Meta IKPose Snapshot)

Aeromixx_7-1713213697466.png

Set the Pose to the Pose variable within the AnimInstance (you'll have to create this variable in the AnimInstance).

This is the heavy part: the snapshot is the entirety of your avatar's current pose, every bone, sent as one massive chunk over the network, every frame. It's heavy and inefficient, but it works! The whole point of this post is to potentially discover a more effective way, or maybe seeing this approach work at all will stir up a different solution in someone's mind.
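Some back-of-envelope math shows why this floods the network. The numbers here are assumptions for illustration: roughly 70 bones for a full body with fingers, an unquantized transform of 10 floats (3 position, 4 quaternion, 3 scale) = 40 bytes per bone, sent every frame at 72 Hz.

```cpp
#include <cassert>

// Raw pose payload per player per second, before any packet overhead.
double PoseBytesPerSecond(int bones, int bytesPerBone, double sendHz) {
    return static_cast<double>(bones) * bytesPerBone * sendHz;
}
// 70 bones * 40 bytes * 72 Hz = 201,600 B/s, roughly 197 KB/s per player.
// Each client also *receives* every other player's pose, so per-client cost
// grows linearly with player count, and total traffic roughly quadratically.
```

Against a typical game-server budget of a few KB/s per connection, a couple hundred KB/s per player explains both the rubber-banding and the `FBitWriter overflowed` disconnects described below.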

Here are my experiences with setting the IK Pose replication server event to Reliable.
If you set it to Reliable, the replication runs at a solid framerate!
However, it's so much data that there is no extra room for anything else, not even player movement.

If it's Reliable, your players will typically lag, like in Call of Duty when it keeps pushing you back from where you're trying to go. The IK pose overloads the network, leaving no bandwidth even for basic commands like jumping.

*ALSO*, on the Quest (standalone) it will flood the network buffer and cause a disconnect from the server: [FBitWriter overflowed! (and Client Disconnected)]
(This may be Android-related; Windows machines seem to hold up, just with heavy lag.)

If it's NOT set to Reliable, your characters can move! Although, as shown in the video, the replicated Full Body Tracking runs at a slow FPS. Without Reliable, I also notice that the Quest (standalone) player's movement is not replicated on the Windows PC, or updates only once every ~10 seconds, even though the FBT replicates. The Windows client's movement replicates fine on the Quest (standalone). Likely more network-flood-related issues, or Android-specific ones. So many variables and testing!
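One common way to live with unreliable, low-rate pose updates is to smooth them on the receiving side: keep the last two received values and interpolate between them, so even a 10 Hz stream renders smoothly at the headset's framerate. A minimal sketch for a single float channel (a full version would lerp positions and slerp quaternions for every bone; the type and names here are made up):

```cpp
#include <cassert>

// Holds the two most recent received values for one pose channel and
// blends between them while waiting for the next network update.
struct InterpChannel {
    float prev = 0.0f;
    float next = 0.0f;

    // Called whenever a new replicated value arrives (possibly out of
    // step with the render framerate).
    void OnReceive(float value) { prev = next; next = value; }

    // alpha in [0,1]: fraction of the way from the older update to the
    // newer one, driven by time since the last receive.
    float Sample(float alpha) const { return prev + (next - prev) * alpha; }
};
```

This trades a small amount of added latency (you render slightly behind the latest update) for motion that no longer stutters at the network send rate.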

With this replicated IK pose, the rest of the work is happening within the AnimInstance.

1. Within Event Blueprint Update Animation, check whether the Player Pawn and the owning Pawn of the AnimInstance are the same, and set the result as a bool, LocalPlayer. In multiplayer, this determines: is this your AnimInstance, or another player's?
Aeromixx_4-1713213456157.png

In the AnimGraph, modify the example to add this Blend Poses by bool node, driven by LocalPlayer.
(Note: for this to work, you MUST click the Pose Snapshot node and change its Mode in the Details panel to Snapshot Pin.)

Aeromixx_5-1713213605882.png

Now, if LocalPlayer is true, it's your avatar, and it will track as normal, just like single-player.
If LocalPlayer is false, it belongs to another player, and instead of the MetaXR Body / your headset's Live Link source, it will use the Pose that the other user is replicating to their avatar via the methods above.

Note: anything AnimInstance-related is from the UnrealMovementSample; any variables, functions, and whatnot you may be missing come from that sample project, which this is adapted from. You can find the incredible single-player tracking sample here: https://github.com/oculus-samples/Unreal-Movement


This essentially works, but it's an unfinished solution. I am pretty stumped, but I feel I got far. If anyone can help figure this out, that would be incredible!!! I really want this in my game; it's just too good.

VRChat has large-scale IK replication, so it has to be possible, though once again, the difference may be the hands and face, as they increase the number of bones by a great deal.
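One common trick behind large-scale pose replication (an assumption about how apps like VRChat keep bandwidth down, not a documented fact about them) is quantization: pack each pose component into a fixed range before sending. Here a quaternion component in [-1, 1] is packed into 16 bits, halving a 4-byte float with error well under 0.0001:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Map a float in [-1, 1] onto the full uint16_t range.
uint16_t QuantizeUnit(float v) {
    float clamped = std::fmax(-1.0f, std::fmin(1.0f, v));
    return static_cast<uint16_t>((clamped + 1.0f) * 0.5f * 65535.0f + 0.5f);
}

// Inverse mapping on the receiving side.
float DequantizeUnit(uint16_t q) {
    return static_cast<float>(q) / 65535.0f * 2.0f - 1.0f;
}
```

Applied to every bone, this alone halves the snapshot size; combined with dropping scale, delta compression, and distance-based LOD, the per-frame cost shrinks dramatically.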

Last major note: all of this is between multiple devices on the same LAN. Performance will likely be even worse over the internet, and once again, this is only 2 players so far!

Here is a video of the current state it's in: working, but certainly not fully usable, and in progress: https://youtu.be/1-BLMJIbpvY

Any help would be greatly appreciated!!!

Thank you,

-Brandon