
Unreal Engine - Avatar Expressive Update

Ross_Beef
Heroic Explorer
CONFIDENTIAL
We appreciate your discretion during this early access period.
Please don't discuss these updates or publicly share any images or video of expressive avatars in action.
We'd love your feedback. Please keep all comments and discussion of bugs or work in progress with expressive avatars in the Avatar SDK Developer Preview discussion, and avoid posting to the broader public avatar discussion thread.

Hey folks,



As promised, we've made the PC UE4 update for Expressive Avatars available. The Quest update is coming shortly, but we wanted to get the PC functionality out the door.

Release Notes

SDK
  • Updated avatar meshes to include added eye and mouth geometry, plus blendshapes for expression and speech
  • Updated avatar textures for skin, including mask information in the Green and Blue channels of the roughness texture
  • Added runtime parameters to the avatar spec for eye color, lip color, brow color and lash color
  • Added simulation models for eye gaze, mouth movement and general facial animation
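To make the texture packing above concrete, here is a small standalone sketch of reading those channels from an 8-bit RGBA texel. The notes only state that the Green and Blue channels carry mask information; the assumption that Red holds the roughness value (and the `Texel` struct itself) is illustrative, not confirmed by the SDK docs:

```cpp
#include <cstdint>

// Hypothetical 8-bit RGBA texel from the avatar's roughness texture.
struct Texel { uint8_t r, g, b, a; };

// Convert an 8-bit channel to a 0..1 float.
inline float ChannelToFloat(uint8_t c) { return c / 255.0f; }

// Per the release notes, Green and Blue carry mask information.
// Red holding the roughness value is an assumption for illustration.
float SampleRoughness(const Texel& t) { return ChannelToFloat(t.r); }
float SampleMaskG(const Texel& t)     { return ChannelToFloat(t.g); }
float SampleMaskB(const Texel& t)     { return ChannelToFloat(t.b); }
```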

UE
  • Added support for different alpha-handling methods. We recommend the Masked material: it performs better and blends into the scene better than the previous translucency effect.
  • Added support for Expressive avatars, including integration with Lipsync, gaze targets etc.
  • Added ability to set static or moving gaze targets

Known issues


  • This is PC only; brave is the developer who tries to make this work on Quest. (As above, Quest support is coming soon.)
  • Currently supports UE 4.21 only
  • This build is tested against the 1.36 software update. While expressive avatar support also shipped in the Oculus runtime with update 1.35, some functionality was not enabled. If you haven't received the 1.36 update, consider opting into the Public Test Channel.

Instructions



  • LocalAvatar.h/cpp shows how to wire up an expressive avatar (see code block, below)
  • Build and run AvatarSamples. In the editor you will see a number of new options exposed to configure the avatar (image #1, below). Ensure that 'enable expressive' is set.

User IDs

Please conduct initial testing using one of the following user IDs to retrieve an avatar with an updated face mesh, expressive blendshapes, and no eyewear.
  • 10150030458727564
  • 10150030458738922
  • 10150030458747067
  • 10150030458756715
  • 10150030458762178
  • 10150030458769900
  • 10150030458775732
  • 10150030458785587
  • 10150030458806683
  • 10150030458820129
  • 10150030458827644
  • 10150030458843421
Going forward, all users will automatically be backfilled with an updated face mesh that takes advantage of both gaze and facial expressions. They'll have the option to remove their current eyewear, exposing the eyes, as we launch this update in the avatar editors on PC and Go.
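For reference, the sample's BeginPlay converts the user ID string to a uint64 with FCString::Strtoui64; outside the engine, the standard strtoull does the same job. A minimal standalone sketch (the ParseUserId helper is illustrative, not part of the SDK):

```cpp
#include <cstdint>
#include <cstdlib>

// Parse a decimal Oculus user ID string into a 64-bit unsigned integer,
// mirroring what the sample does with FCString::Strtoui64.
uint64_t ParseUserId(const char* idString)
{
    return std::strtoull(idString, nullptr, 10); // base-10 decimal ID
}
```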

Gaze Targets


We've added the ability to tag objects of interest in the scene. To do so, attach a UOvrAvatarGazeTarget to an object. This is exposed through the editor for convenience (see image #2, below). There are four types of tags to apply:

  • Avatar Head - Automatically attached to avatars in the scene.
  • Avatar Hands - Automatically attached to the avatar's hands.
  • Object - A moving object in the scene.
  • Static Object - A stationary object in the scene.

Unless otherwise specified in code using:
void SetGazeTransform(USceneComponent* sceneComp)
...the transform will point to the root scene component of the object it is attached to. It may be desirable to tune this on larger objects to refine the point of interest.
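As a sketch, a point of interest on a large object might be tagged from an actor's constructor like this. AInterestActor and the component names are illustrative; UOvrAvatarGazeTarget and SetGazeTransform come from the SDK as described above:

```cpp
// Sketch only: tagging a specific point on a large object as a gaze target.
// Assumes PointOfInterest and GazeTarget are declared in the actor's header.
AInterestActor::AInterestActor()
{
	RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

	// A child component marking the exact point the avatar should look at,
	// useful when the root transform is too coarse on a larger object.
	PointOfInterest = CreateDefaultSubobject<USceneComponent>(TEXT("PointOfInterest"));
	PointOfInterest->SetupAttachment(RootComponent);

	GazeTarget = CreateDefaultSubobject<UOvrAvatarGazeTarget>(TEXT("GazeTarget"));
	GazeTarget->SetGazeTransform(PointOfInterest);
}
```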






Image #1: UE Avatar options


LocalAvatar.h/cpp

void ALocalAvatar::LipSyncVismesReady()
{
	if (UseCannedLipSyncPlayback)
	{
		AvatarComponent->UpdateVisemeValues(PlayBackLipSyncComponent->GetVisemes());
	}
	else
	{
		AvatarComponent->UpdateVisemeValues(LipSyncComponent->GetVisemes());
	}
}

void ALocalAvatar::PreInitializeComponents()
{
	Super::PreInitializeComponents();

	if (UseCannedLipSyncPlayback)
	{
		FString playbackAssetPath = TEXT("/Game/Audio/vox_lp_01_LipSyncSequence");
		auto sequence = LoadObject<UOVRLipSyncFrameSequence>(nullptr, *playbackAssetPath, nullptr, LOAD_None, nullptr);
		PlayBackLipSyncComponent->Sequence = sequence;

		FString AudioClip = TEXT("/Game/Audio/vox_lp_01");
		auto SoundWave = LoadObject<USoundWave>(nullptr, *AudioClip, nullptr, LOAD_None, nullptr);
		if (SoundWave)
		{
			SoundWave->bLooping = 1;
			AudioComponent->Sound = SoundWave;
		}
	}
#if PLATFORM_WINDOWS
	else
	{
		auto SilenceDetectionThresholdCVar = IConsoleManager::Get().FindConsoleVariable(TEXT("voice.SilenceDetectionThreshold"));
		SilenceDetectionThresholdCVar->Set(0.f);
	}
#endif

	// TODO SW: Fetch Player Height from Oculus Platform?
	BaseEyeHeight = 170.f;

	AvatarComponent->SetVisibilityType(
		AvatarVisibilityType == AvatarVisibility::FirstPerson
			? ovrAvatarVisibilityFlag_FirstPerson
			: ovrAvatarVisibilityFlag_ThirdPerson);
	AvatarComponent->SetPlayerHeightOffset(BaseEyeHeight / 100.f);
	AvatarComponent->SetExpressiveCapability(EnableExpressive);
	AvatarComponent->SetBodyCapability(EnableBody);
	AvatarComponent->SetHandsCapability(EnableHands);
	AvatarComponent->SetBaseCapability(EnableBase);
	AvatarComponent->SetBodyMaterial(GetOvrAvatarMaterialFromType(BodyMaterial));
	AvatarComponent->SetHandMaterial(GetOvrAvatarMaterialFromType(HandsMaterial));
}

ALocalAvatar::ALocalAvatar()
{
	RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("LocalAvatarRoot"));
	PrimaryActorTick.bCanEverTick = true;

	AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("LocalAvatar"));
	PlayBackLipSyncComponent = CreateDefaultSubobject<UOVRLipSyncPlaybackActorComponent>(TEXT("CannedLipSync"));
	AudioComponent = CreateDefaultSubobject<UAudioComponent>(TEXT("LocalAvatarAudio"));
	LipSyncComponent = CreateDefaultSubobject<UOVRLipSyncActorComponent>(TEXT("LocalLipSync"));
}

void ALocalAvatar::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
	LipSyncComponent->OnVisemesReady.RemoveDynamic(this, &ALocalAvatar::LipSyncVismesReady);
	PlayBackLipSyncComponent->OnVisemesReady.RemoveDynamic(this, &ALocalAvatar::LipSyncVismesReady);

	if (!UseCannedLipSyncPlayback)
	{
		LipSyncComponent->Stop();
	}
}

void ALocalAvatar::BeginPlay()
{
	Super::BeginPlay();

	uint64 UserID = FCString::Strtoui64(*OculusUserId, NULL, 10);

#if PLATFORM_ANDROID
	ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
	if (AvatarComponent)
	{
		AvatarComponent->RequestAvatar(UserID, lod, UseCombinedMesh);
	}
#else
	ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;
	IOnlineIdentityPtr IdentityInterface = Online::GetIdentityInterface();
	if (IdentityInterface.IsValid())
	{
		OnLoginCompleteDelegateHandle = IdentityInterface->AddOnLoginCompleteDelegate_Handle(0, FOnLoginCompleteDelegate::CreateUObject(this, &ALocalAvatar::OnLoginComplete));
		IdentityInterface->AutoLogin(0);
	}
#endif

	if (UseCannedLipSyncPlayback)
	{
		PlayBackLipSyncComponent->OnVisemesReady.AddDynamic(this, &ALocalAvatar::LipSyncVismesReady);
	}
	else
	{
		LipSyncComponent->OnVisemesReady.AddDynamic(this, &ALocalAvatar::LipSyncVismesReady);
		LipSyncComponent->Start();
	}
}

Image #2: Gaze target configuration

2 Replies

Ross_Beef
Heroic Explorer
(realized I forgot to add the info on setting Gaze targets. Added!)

Ross_Beef
Heroic Explorer
Adding the latest version, which has support for UE 4.22 and Quest!
https://www.dropbox.com/sh/mxw4u3ww60jcrnq/AAAAykuBnsZSJBB572OIFrRha?dl=0