
Welcome to the early access group for the Oculus Expressive Avatars update

Heroic Explorer
We appreciate your discretion during this early access period.
Please don't discuss these updates publicly or share any images or video of expressive Avatars in action.


Welcome to the early access beta for the expressive avatars update. New micro-expressions and more options for customization enhance the way Oculus Avatars facilitate meaningful social presence and interactions in your apps and across platforms. New system highlights include:

  • Eye gaze simulation based on observed human behaviors such as micro-saccades, smooth pursuit and ballistic gaze shifting, including the ability to specify gaze targets in your scene.
  • Simulated mouth movement builds on top of OVRLipSync and introduces differential blending between individual muscle movements for a cleaner read of what someone is saying. Facial micro-expressions give the avatar a more nuanced and believable performance.
  • New parameters for customization (eye, lip, lash, and brow color) and an updated approach to translucency make it simpler to set up avatars to render correctly in your app, and give users greater freedom for personal expression.
  • For more information on what's coming, check out our OC5 talk on the technologies and learnings behind this update.
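
To give a feel for the eye-gaze behavior described above, here is a minimal sketch of target selection plus a ballistic (saccadic) gaze shift. This is purely illustrative: the function names, the saliency-weighted pick, and the cubic ease-out are our own assumptions, not the actual SDK API.

```python
import random

def pick_gaze_target(salient_targets, fov_fallback, rng=random.Random()):
    """Pick the next gaze target.

    salient_targets: list of (position, saliency) pairs supplied by the app
    fov_fallback: positions spread across the headset wearer's field of view,
                  used when the app registers no gaze targets
    """
    if salient_targets:
        positions, weights = zip(*salient_targets)
        return rng.choices(positions, weights=weights, k=1)[0]
    # No salient targets: wander over a sensible distribution in the FOV.
    return rng.choice(fov_fallback)

def ballistic_shift(current, target, progress):
    """Ease-out interpolation approximating a ballistic gaze shift:
    fast at onset, decelerating as the eye lands on the target."""
    t = 1.0 - (1.0 - progress) ** 3  # cubic ease-out, t in [0, 1]
    return tuple(c + (g - c) * t for c, g in zip(current, target))
```

A driver would call `pick_gaze_target` occasionally (mimicking focus shifts) and advance `ballistic_shift` each frame; micro-saccades and smooth pursuit would layer on top of this.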

Key considerations for this update

  • The update introduces blendshape-driven facial expressions; these blendshape weights are networked in the same way as the avatar packet data for pose updates.
  • We've updated the shader to make use of new mask channels in the Roughness texture (Green and Blue channels), in order to support parameter-driven colors for lips, eyes, brow and lashes. This masking is also used to drive the reflection calculations on the eye, simulating a lens.
  • The shader has also been updated to use an alpha-to-coverage approach for opacity, which runs entirely on the geometry queue. This should be cheaper and easier to implement vs. the prior version, as draw ordering was sometimes problematic. It also unlocks translucent eyewear options!
  • The eye behavior is based on a gaze-target model that can be populated with salient objects in the scene (e.g. other people, a TV screen). In the absence of gaze targets, the eyes move between a distribution of sensible targets in the field of view of a person wearing a VR headset, with movement simulation that matches the behavior of a person looking at a scene and occasionally shifting their focus.
  • We've slightly increased the vert count of the body asset to add the mouth and eye geometry. However, we expect that, with some people now choosing not to wear eyewear, the average vert count for an avatar will remain more or less unchanged.
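
To make the networking point above concrete: one common way to ship blendshape weights alongside a pose packet is to quantize each [0, 1] weight to a single byte. The sketch below is a hypothetical illustration of that idea; the real avatar packet layout is not public, and `pack_expression` / `unpack_expression` are invented names.

```python
import struct

def pack_expression(blend_weights):
    """Quantize [0, 1] blendshape weights to one byte each, prefixed by a
    little-endian uint16 count, so they can ride along with pose packet data."""
    quantized = bytes(min(255, max(0, round(w * 255))) for w in blend_weights)
    return struct.pack("<H", len(quantized)) + quantized

def unpack_expression(payload):
    """Inverse of pack_expression: recover weights with <= 1/255 error."""
    (count,) = struct.unpack_from("<H", payload, 0)
    return [b / 255 for b in payload[2:2 + count]]
```

One byte per weight keeps the added per-frame bandwidth to a few dozen bytes, which is well within the error a viewer can perceive on a facial pose.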
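
The alpha-to-coverage change mentioned above can be illustrated on the CPU: instead of blending, the fragment's alpha is converted into the number of covered MSAA samples, so draw order stops mattering. This is a generic sketch of the technique (a simple 4-sample mask builder), not the shader code itself.

```python
def coverage_mask(alpha, samples=4):
    """Convert fragment alpha to an MSAA sample-coverage bitmask, the core idea
    of alpha-to-coverage: opacity becomes the fraction of covered samples,
    letting translucency run on the geometry queue without draw-order sorting."""
    covered = round(max(0.0, min(1.0, alpha)) * samples)
    return (1 << covered) - 1  # e.g. alpha 0.5, 4 samples -> 0b0011
```

In practice the GPU also dithers the mask per pixel to get more apparent opacity levels, but the mapping above is the essence of why translucent eyewear becomes cheap.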


  • The expressive update is scheduled to launch in March, in conjunction with the update of our Avatar Editors on Oculus Go and Rift.
  • Between now and then we'll regularly update this group with latest builds of the Unity and UE integrations of Avatars, along with release notes.
  • The core SDK components have now landed on PC and will land shortly on Go / Gear, meaning that you should be able to use the integrations provided without needing to sideload a DLL locally.
  • That said, when we make small adjustments to the system avatar .DLL / .SO files to tune behaviors based on your feedback, you'll need to sideload those to get the latest and greatest.

We'll be kicking things off with the Unity integration; I'll post an update to this group for each build.

UE integrations will begin shortly thereafter.
At this time we won't be updating the Native sample; however, we plan to do so over the next few weeks.


Expert Protege
Some love soon for Unreal :wink: ?

Heroic Explorer
@beaulima9933 -- :smile:

Expert Protege

A quick update on the expressive Avatars
I've put them in my project on the latest 4.21 Oculus branch.
Works perfectly in single player.
I wonder why the OvrAvatarManager is now a tickable object, since it introduces
a lot of nightmares when porting for multiplayer.
Anyway, I successfully got multiplayer working (using the same network replication code we use in TribeXR), but it's
laggy as hell. I'll check my app against the TribeXR daily build tomorrow, but I'd be surprised if it works better. Anyway, Dan wants to promote expressive avatars, so I/we'll check and resolve the issue very soon, I suppose.

So that's it. I suspect the absence of a 4.22 integration is due to the ExpAvatars integration/optimization?