
Expressive Update - Launch, updated Unity integration

Ross_Beef
Heroic Explorer
CONFIDENTIAL
We appreciate your discretion during this early access period.
Please don't discuss these updates publicly or share any images or video of expressive avatars in action.
We'd love your feedback. Please keep all comments and discussion of bugs or work in progress with expressive avatars to the Avatar SDK Developer Preview discussion, and avoid posting to the broader public avatar discussion thread.





Hey all, thank you for your ongoing participation in the developer preview.

I wanted to keep everyone updated with the latest developments (and a Unity package) and chat through launch timings.




Launch timings

  • We’re launching expressive next week (April 3rd at 9am). At that point we'll enable all users to access the updated avatar editors on Go and Rift, where they'll be able to remove their eyewear and augment their avatar with eye color, brows, lipstick, and lash coloring.
  • We've already backfilled all users to an expressive avatar. This means that as soon as you publish an app requesting expressive avatars, their mouths will start working. Prior to 4/3, however, users won't be able to remove their eyewear.
  • Please note that we've made a change to how gaze targets work (below). You'll want to verify your implementation using our Test IDs (which don't have eyewear).
  • Reminder that the launch date, and assets, are all confidential until we go live.

Media coverage
  • Over the next few days we'll be proactively briefing media outlets about the upcoming update and readying our blog posts and other comms.
  • If you're confident that your app will be updated by 4/3, please message me directly to let me know, as we'll look to mention live apps that users can take their expressive avatars to as of the launch date. We'll also be mentioning a subset of apps that will update shortly after -- so let me know either way!
  • We'll be publishing a blog post specific to this update. We'll also prepare some assets and relevant copy to share, if you'd like to participate on your social media channels.

Playtesting
  • The avatars team is happy to playtest your app and provide feedback on your gaze modeling implementation or any custom shader work you've done (acknowledging that we now do some fun things with masking). Please message me and we can coordinate adding emails to your release channels on the Oculus Store.







Latest drop (1.36 SDK) 



LINK: https://www.dropbox.com/sh/qvegtvnvqbrbec4/AAC-Z4FbfAYNK0lNW5BSbC0Ma?dl=0




Release notes




  • This is the actual 1.36 SDK package, which goes live roughly around the same time as the public SDK release (it will also be available via the Unity Asset Store, albeit a little too late for you to take the updates for our live date).
  • You won't need to copy over the Avatars DLL, as the 1.36 Oculus runtime should now have rolled out to your Rift.
  • The runtime for Quest should be 3.60 – you should have this version by now if you're getting the developer updates.
  • For Steam specifically, you can just copy the DLL from your runtime (search your PC for libovravatar.dll and OvrAvatarAssets.zip). Let me know if you can't find these and I can upload them for you.



 



Change list




  • We’ve updated gaze targets to move away from a tag system (based on feedback) and toward component-based gaze tagging. You can now assign targets using the Unity Inspector (pictured below), or via code (see the sketch after this change list):
    • GazeTarget newGazeTarget = gameObject.AddComponent<GazeTarget>();
    • newGazeTarget.Type = ovrAvatarGazeTargetType.AvatarHead;


  • The system allows you to specify 4 levels of gaze saliency (in descending order): Avatar Head, Avatar Hand, Object, Stationary Object.

  • We’ve also fixed some issues that were affecting how you implement avatars:
    • Resolved a broken behavior-simulation state when no mic permissions are granted on Quest (though you should still ensure you’re requesting mic permissions! A sketch of the permission request follows this change list.)
    • Resolved an issue wherein IL2CPP was throwing errors
    • Resolved an issue that threw debug errors when ‘Third Person’ was set to True

  • Added the ability to configure the avatar to use the transparent render queue (image below)
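
For anyone assigning gaze targets at runtime rather than in the Inspector, here's a minimal sketch built around the two lines above. GazeTarget and ovrAvatarGazeTargetType come from the 1.36 integration; the wrapper class and its field name are just illustrative, not part of the SDK.

    using UnityEngine;

    // Minimal sketch: tags this GameObject as a gaze target when the scene starts.
    public class TagAsGazeTarget : MonoBehaviour
    {
        // Saliency level for this object (descending order: Avatar Head,
        // Avatar Hand, Object, Stationary Object).
        public ovrAvatarGazeTargetType targetType = ovrAvatarGazeTargetType.AvatarHead;

        void Start()
        {
            // Reuse a GazeTarget added in the Inspector, otherwise add one via code.
            GazeTarget gazeTarget = GetComponent<GazeTarget>();
            if (gazeTarget == null)
            {
                gazeTarget = gameObject.AddComponent<GazeTarget>();
            }
            gazeTarget.Type = targetType;
        }
    }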
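
On the mic-permission note above: Quest needs the Android microphone permission granted at runtime before voice-driven mouth movement gets any audio. A minimal sketch using Unity's standard Android permission API (the component name is just an example):

    using UnityEngine;
    #if UNITY_ANDROID
    using UnityEngine.Android;
    #endif

    // Minimal sketch: requests the microphone permission at startup.
    public class RequestMicPermission : MonoBehaviour
    {
        void Start()
        {
    #if UNITY_ANDROID
            if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            {
                Permission.RequestUserPermission(Permission.Microphone);
            }
    #endif
        }
    }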



 



Some implementation gotchas we’ve come across




  • Unity seems to default to enabling OpenGL 3.0 / Vulkan support, which isn’t supported on Go / Quest. You’ll want to disable this.
  • For Android, ensure you’re setting ASTC texture compression (a small editor sketch covering both of these settings follows below).
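
If you'd rather pin those build settings from an editor script than rely on remembering the checkboxes, something like this works. The class name and menu path are just examples, and the explicit graphics-API list simply mirrors the note above; adjust it to whatever your target needs.

    #if UNITY_EDITOR
    using UnityEditor;
    using UnityEngine.Rendering;

    // Illustrative editor helper that applies the two settings above.
    public static class MobileBuildSettings
    {
        [MenuItem("Tools/Apply Go-Quest Build Settings")]
        public static void Apply()
        {
            // Use ASTC texture compression for Android builds.
            EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;

            // Turn off "Auto Graphics API" and set the list explicitly so Unity
            // doesn't quietly enable APIs you don't want on Go / Quest.
            PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
            PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
                new[] { GraphicsDeviceType.OpenGLES2 });
        }
    }
    #endif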





Image: Opacity option


Image: Gaze Targets

MikeF
Trustee
Quick question regarding launch. You mention that existing avatars have already been updated, but that doesn't appear to be the case: some avatars are working, others are missing eyebrows and eyes, and others have no eyes or mouths. Is the update still in progress?

Also, should we keep "Enable Expressive" set to true in the Unity integration, or will having it turned off still allow expressive avatars once the update goes live?