Avatars SDK - Latest 11/04/2016

Ross_Beef
Heroic Explorer
The third drop of the native Avatar SDK is now available!
As always, please let us know if you run into any issues so that we can work quickly to resolve them.

New features in this release


DLL and assets distributed with the Oculus runtime


You may notice that this distribution no longer contains a “DLL” directory with the libovravatar shared library and the prepackaged assets. That’s because they’re now distributed as part of the SI 1.10 release: instead of shipping the DLL with your application, the Oculus runtime delivers it to anyone who has the runtime installed.



*IMPORTANT* You should no longer distribute old DLLs with your application. If you do, your application won’t pick up patches made to the DLL via updates to the Oculus runtime. This means your app will eventually become incompatible with the assets distributed via normal Oculus system software updates.



SI 1.10 goes out to users later this month, but you should already be part of our ALPHA release channel, which now has the libovravatar DLL. You don’t need to do anything special to target this version, as these paths are automatically included in the system DLL path by the Oculus runtime.



Access to the Avatar Editor


We’re rolling out access to the Avatar Editor through Home.



Above your profile in Home, you should now see a floating bust of your avatar. If you highlight your avatar, an “Edit Avatar” button should roll out, which you can use to launch the editor.



When you configure your avatar and exit, you should see it update in Home. You should also now be able to query the avatar from the SDK using your user ID.


Avatar app ID initialization


You are now required to initialize your app with a valid app ID in order to retrieve avatar specifications for specific user IDs. If you don’t initialize with a valid ID, your app will only receive the default fallback avatar.



You can do this by passing your app ID to ovrAvatar_Initialize in the native SDK. In Unity, the package adds a new “Oculus Avatars” menu item to the toolbar with an “Edit Configuration” option that lets you specify your app’s ID.



Base cone implementation


Avatars now feature an optional “base cone” used to give a hint of body presence and indicate to other users where the avatar is positioned in the scene. From a third-person view it looks like a tall cone rising to where the torso fades in; in first person, it appears as a small disc.



The base cone will automatically animate to track the estimated body position for the avatar at runtime. If you want to lock the base cone in place, you can use the ovrAvatar_SetCustomBasePosition function and provide an (x, y, z) coordinate in tracker space for the avatar.



The cone is enabled by default if you initialize your avatars with ovrAvatarCapabilities_All, but if you don’t want it to appear in your apps, you can initialize with a different capabilities mask and exclude ovrAvatarCapability_Base.




Voice implementation


In the last update, we added support for “projector” renderers. This was in preparation for enabling our voice visualization method, which is a projected effect on the mouth area. The voice component contains a transform specifying the projection volume along with a dynamically modified material that’s modulated with the voice effect.



This feature is enabled by default, but will only be animated if you provide microphone data to the interface. You can do this using ovrAvatarPose_UpdateVoiceVisualization in the native SDK, or by calling OvrAvatar.UpdateVoiceVisualization in Unity and providing the voice samples. Low-latency samples can be retrieved from the Rift microphone using the ovr_Microphone* APIs provided by the Oculus Platform SDK. A sample implementation of this is available in the native Mirror sample.



Like the base cone, you can disable voice visualization by excluding the ovrAvatarCapability_Voice flag from your capabilities mask when initializing an avatar.



Self-occluding assets and sorting


A new visibility flag, ovrAvatarVisibilityFlag_SelfOccluding, has been added. Most avatar parts are transparent, either due to fade textures on the body and hands or due to transparency effects on parts like the new base cone. Some of these parts should be self-occluding (for instance, you should not be able to see your fingertips through the back surface of your hand), while others shouldn’t be (the base cone is transparent and double-sided). This flag identifies which render parts are which.



The Unity integration now ships with two shaders: one that is self-occluding (implemented with a prepass render to the depth buffer) and one that is not. The native Mirror sample uses a single shader but performs the depth prepass by issuing an additional depth-only draw call.



Additionally, in the Unity integration, avatar render parts are now assigned an explicit sort order to correctly order transparent objects instead of relying on centroid distance checks.



Custom grip poses in Unity


The Unity package now contains a new sample scene called “GripPoses,” which includes a debug visualizer for the hand’s skeleton and can be live-updated to pose the hand with a custom grip. You can also drag the posed hands into your asset browser to create a prefab for use in your game.




Active capabilities


Sometimes you want to selectively enable and disable avatar capabilities at runtime. For instance, you may want to render a base cone for an avatar when it’s standing but disable it when it’s seated. You can now call ovrAvatar_SetActiveCapabilities to toggle features at runtime.



What’s still to come


The following features are either not implemented or partially implemented:

· Semantic information about hands (grip volume sizes, point vectors, etc.) is not yet implemented.

· Packet recording/playback is not yet implemented (Unity users can still take advantage of the Unity stub implementation for now).



Most apps planning to ship for Touch launch have already done their own implementations of these features. If you are reliant on the Avatar SDK providing these features for launch, please get in touch with us as soon as possible so that we can help figure out a development strategy, since these are landing late in the development process.

StormyDoesVR
Heroic Explorer
Hi! I was wondering how to find a specific user's ID number to put into Unity? Per what you said up there I have an app ID in Unity but am having trouble finding the second part to make my custom avatar show up.
------------------------------
PC specs - GTX 1070 (MSI AERO); 16GB Corsair Vengeance DDR3; intel i5-4590, MSI motherboard (don't remember but it was a combo deal) and an EVGA Supernova 750W semi-modular.

pjenness
Rising Star

Rave185 said:

Hi! I was wondering how to find a specific user's ID number to put into Unity? Per what you said up there I have an app ID in Unity but am having trouble finding the second part to make my custom avatar show up.


Does this help:
https://forums.oculus.com/developer/discussion/comment/477074/#Comment_477074

-P

Drift VFX Visual, Virtual , Vertical Want 970GTX on Macbook for good FPS? https://forums.oculus.com/viewtopic.php?f=26&t=17349