Meta Avatars SDK (Feedback/Issues)

mouse_bear
Retired Support

Do you have any feedback and/or issues regarding the Meta Avatars SDK? Use this place to discuss, as we'll have members of the engineering team reviewing this thread!

 

Read the blog on the Meta Avatars SDK here: https://developer.oculus.com/blog/meta-avatars-sdk-now-available/

 

Refer to the Meta Avatars SDK documentation here: https://developer.oculus.com/documentation/unity/meta-avatars-overview/

113 REPLIES

I noticed this on v20 as well. Does rolling back to v18 fix things?

torsten.knodt
Explorer

I found two bugs in the Meta Avatar 2 SDK.

  • Shader error in 'Avatar/Horizon': Couldn't open include file 'AvatarCommonSurfaceFIelds.cginc'. at Assets/Oculus/Avatar2/Example/Common/Shaders/Horizon/AvatarCommon/AvatarCommonLighting.cginc(11)
  • Shader error in 'Avatar/Horizon': Couldn't open include file '../FBToneMapping.cginc'. at Assets/Oculus/Avatar2/Example/Common/Shaders/Horizon/Horizon/VertexGI/Interface.cginc(15)
Both are caused by the case of the include path in the .cginc file not matching the case of the actual filename.

Feedback / feature requests:
  • All-in-one example with other SDKs, especially the Interaction SDK
  • Configuration of bot avatars in the Unity inspector (not via a number, but more like the avatar editor on the Quest)

Hi, do you know if there is any documented info on how to animate an avatar properly? The only demo that exists is NetworkLoopbackExample, and it is based on streaming data, which the documentation describes as "not recommended"... I did find a way to record and play back this streaming data, but the animation gets stuck, because the recorded data also includes a local time somewhere among the bytes... The only workaround for this is to ping-pong the animation so the delta time doesn't get too big and the animation keeps playing. I described this in this post
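For reference, my record/playback loop is roughly the sketch below. It assumes the RecordStreamData/ApplyStreamData methods used by NetworkLoopbackExample; the exact names and signatures may differ between SDK versions, so verify against your copy of OvrAvatarEntity.

    using System.Collections.Generic;
    using Oculus.Avatar2;
    using UnityEngine;

    // Rough sketch: record one streaming packet per frame from a local avatar
    // and replay the recording on a second avatar entity.
    public class AvatarStreamRecorder : MonoBehaviour
    {
        public OvrAvatarEntity sourceAvatar;   // avatar driven by the local user
        public OvrAvatarEntity playbackAvatar; // avatar that replays the recording
        public bool record;
        public bool play;

        private readonly List<byte[]> _recordedPackets = new List<byte[]>();
        private int _playbackIndex;

        void LateUpdate()
        {
            if (record)
            {
                // Capture one packet per frame at medium LOD.
                _recordedPackets.Add(sourceAvatar.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium));
            }
            else if (play && _recordedPackets.Count > 0)
            {
                // Replay packets in order; this is where playback stalls,
                // apparently because of the time value embedded in each packet.
                playbackAvatar.ApplyStreamData(_recordedPackets[_playbackIndex]);
                _playbackIndex = (_playbackIndex + 1) % _recordedPackets.Count;
            }
        }
    }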

There is an option to remove these timings from the data stream, but there is no point in editing raw byte data without knowledge of the underlying structure. So either we have to be able to deserialize the data in order to change it, or obtain secret knowledge of how to actually animate the avatars.

There ARE methods to do that; they are called ovrAvatar2Animation_SampleAnimationClip, but they are not documented. ovrAvatar2Animation_SetMood is straightforward and works nicely, but the others require pointers to an animationAsset...

I'll write a ticket to Oculus support asking them to put us in contact with the developers for any info that can help us with this, and I'll update my post or reply here.

Follow up!
I miraculously solved the mystery (although it's still a workaround), and this time with no drawbacks:
zeroing out the two values in the avatar data that are apparently responsible for ovrAvatar2StreamingPlaybackState.remoteTime.
These two values are at indexes 12 and 13 (in all low/medium/high data types).
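Roughly, the workaround looks like this, building on the sourceAvatar/playbackAvatar entities from my sketch further up the thread (offsets 12 and 13 are just what I observed in the packets, not a documented part of the stream format, so they may change between SDK versions):

    // Clear the two bytes that appear to drive remoteTime before replaying
    // the packet. Offsets 12 and 13 are an observation, not documented.
    byte[] packet = sourceAvatar.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);
    if (packet != null && packet.Length > 13)
    {
        packet[12] = 0;
        packet[13] = 0;
    }
    playbackAvatar.ApplyStreamData(packet);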

Are these each 1 byte in length?

Yup, these are values from 0 to 255.

Hey,

did you find any solution? I'm facing the same issue!

CHIDWI
Honored Guest

Hello! I'm working in Unity 2021 with the Oculus SDK and Meta Avatars SDK on a multiplayer game, and there is a mirror object in the scene. There was a problem where, with the avatar in FirstPerson mode, the head was not displayed in the mirror; that was solved by switching to ThirdPerson, but now I've noticed that avatars with headwear see part of the cap or hat from the main camera. I tried to solve this by creating a copy of the avatar (for the mirror) and adjusting the layers, but then you can see the avatars overlapping and a slight desynchronization of movements. Are there any ways to solve this? Is it possible to enable full-body rendering for a separate camera while first-person mode is enabled? And will the problem with partially visible hats in first-person view (while the script enables third person) be fixed in the future?
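For context, the layer setup I tried for the mirror copy is roughly the sketch below. The layer names are just the ones I've assumed in my project and both layers need to exist in Tags & Layers; this handles the per-camera culling, but not the hat clipping in first person.

    using UnityEngine;

    // Sketch of the layer/culling-mask split between the main camera and a
    // mirror camera. "FirstPersonAvatar" and "MirrorAvatar" are placeholder
    // layer names that must exist in the project.
    public class MirrorAvatarLayers : MonoBehaviour
    {
        public Camera mainCamera;            // the player's eye camera
        public Camera mirrorCamera;          // camera rendering the mirror texture
        public GameObject firstPersonAvatar; // avatar shown to the local player
        public GameObject mirrorAvatar;      // full third-person copy for the mirror

        void Start()
        {
            int fpLayer = LayerMask.NameToLayer("FirstPersonAvatar");
            int mirrorLayer = LayerMask.NameToLayer("MirrorAvatar");

            SetLayerRecursively(firstPersonAvatar, fpLayer);
            SetLayerRecursively(mirrorAvatar, mirrorLayer);

            // Main camera never renders the mirror copy; the mirror camera
            // never renders the first-person avatar, so the mirror shows the
            // full body while the player sees the first-person version.
            mainCamera.cullingMask &= ~(1 << mirrorLayer);
            mirrorCamera.cullingMask &= ~(1 << fpLayer);
        }

        static void SetLayerRecursively(GameObject go, int layer)
        {
            go.layer = layer;
            foreach (Transform child in go.transform)
                SetLayerRecursively(child.gameObject, layer);
        }
    }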

jasonflow
Honored Guest

Avatar SDK with Quest for Business: how to use Shared Mode?

Our application expects that users can walk into a conference room, pick up a headset, and get into a shared experience as quickly as possible, with some colocated in MR and some remote. Maybe they add a participant name, but there is no expectation that they will already have a Meta account. The Shared Mode of Oculus for Business is good, but the Avatar SDK requires a Meta ID to function. Is there any way to support generic Meta Avatars for Shared Mode usage?

joebrown.868167
Honored Guest
