10-22-2016 02:04 PM
The second drop of the native Avatar SDK is now available!
As always, please let us know if you run into any issues so that we can work quickly to resolve them.
[Please note that this is not yet ready for redistribution and is covered under our existing NDAs.]
We encountered a delay getting an early build of the Avatar Editor out to developers, so to help you test your rendering implementations, we’ve temporarily enabled some test IDs that cycle through the various avatar materials. If you request a specification for a userID from 0 to 89, you will receive an avatar specification with a different material applied.
The Mirror app allows you to cycle through these test IDs using the left and right arrow keys.
(Please note that this debugging feature will be removed in a future version, once the avatar editor is available and lets you customize avatars on your own.)
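Here’s a minimal sketch of cycling through the test IDs in native code. Only ovrAvatar_RequestAvatarSpecification is the actual SDK call; the wrapper functions and key bindings are hypothetical:

    #include <OVR_Avatar.h>
    #include <cstdint>

    // Hypothetical wrappers around the real specification request.
    // Test IDs 0-89 each return a spec with a different material.
    static uint64_t g_testUserID = 0;

    void RequestTestAvatar(uint64_t testID)
    {
        g_testUserID = testID % 90;   // stay inside the 0-89 test range
        ovrAvatar_RequestAvatarSpecification(g_testUserID);
    }

    // Bind these to the left/right arrow keys to mimic the Mirror app.
    void NextTestAvatar() { RequestTestAvatar(g_testUserID + 1); }
    void PrevTestAvatar() { RequestTestAvatar(g_testUserID + 89); } // wraps: -1 mod 90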
Touch controllers are now supported natively. The SDK now takes a more complete input structure for the touch controllers, including button states and surface contact status, and uses these inputs to articulate the controllers as well as pose the hands dynamically based on the button presses, joystick movements, and finger proximity.
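As a simplified sketch of what feeding that input structure looks like per frame (the ovrAvatarHandInputState field names and the ovrAvatarPose_UpdateHands call are assumed from this drop’s headers; the Mirror sample shows the complete version, including the button/touch mask mapping omitted here):

    #include <OVR_CAPI.h>
    #include <OVR_Avatar.h>

    // Translate LibOVR Touch state into the avatar hand input structure.
    void UpdateAvatarHands(ovrSession session, ovrAvatar* avatar,
                           const ovrAvatarTransform& leftXf,
                           const ovrAvatarTransform& rightXf)
    {
        ovrInputState touch;
        ovr_GetInputState(session, ovrControllerType_Touch, &touch);

        ovrAvatarHandInputState left = {};
        left.transform    = leftXf;  // tracked controller pose
        left.indexTrigger = touch.IndexTrigger[ovrHand_Left];
        left.handTrigger  = touch.HandTrigger[ovrHand_Left];
        left.joystickX    = touch.Thumbstick[ovrHand_Left].x;
        left.joystickY    = touch.Thumbstick[ovrHand_Left].y;
        left.isActive     = true;
        // buttonMask/touchMask: translate the ovrButton_*/ovrTouch_* bits
        // into the avatar SDK's button and surface-contact flags here.

        ovrAvatarHandInputState right = {};
        right.transform    = rightXf;
        right.indexTrigger = touch.IndexTrigger[ovrHand_Right];
        right.handTrigger  = touch.HandTrigger[ovrHand_Right];
        right.joystickX    = touch.Thumbstick[ovrHand_Right].x;
        right.joystickY    = touch.Thumbstick[ovrHand_Right].y;
        right.isActive     = true;

        ovrAvatarPose_UpdateHands(avatar, left, right);
    }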
Custom grip poses are now supported. You can call ovrAvatar_SetLeftHandGesture / ovrAvatar_SetRightHandGesture with one of the ovrAvatarHandGesture enum values to force one of the default grip poses. You can also call ovrAvatar_SetLeftHandCustomGesture / ovrAvatar_SetRightHandCustomGesture and provide your own skeletal poses.
The Mirror sample provides example usage: you can now press the F key to “freeze” the hand poses, which copies the current hand poses directly into the Set*HandCustomGesture calls.
Calling Set*HandGesture with ovrAvatarHandGesture_Default will return the hands to their normal animation behavior.
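Putting those together, a hedged sketch for the left hand (the GripSphere enum value and the joint-count/transform-array signature of the custom-gesture call are assumed; check them against this drop’s headers):

    #include <OVR_Avatar.h>
    #include <cstdint>
    #include <vector>

    // Hypothetical helper: captures the current skeletal pose of the hand.
    std::vector<ovrAvatarTransform> CaptureCurrentHandPose(ovrAvatar* avatar);

    void DemoGestures(ovrAvatar* avatar)
    {
        // Force one of the default grip poses (enum value assumed):
        ovrAvatar_SetLeftHandGesture(avatar, ovrAvatarHandGesture_GripSphere);

        // Or supply your own skeletal pose (signature assumed):
        std::vector<ovrAvatarTransform> joints = CaptureCurrentHandPose(avatar);
        ovrAvatar_SetLeftHandCustomGesture(avatar,
                                           (uint32_t)joints.size(),
                                           joints.data());

        // Return the hand to its normal animation behavior:
        ovrAvatar_SetLeftHandGesture(avatar, ovrAvatarHandGesture_Default);
    }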
Note: the 'Reference' folder contains .FBX files for the hands. These are pre-registration, so the bone placement will be offset from the actual hands. In a future build we'll provide the post-registration hands as imported into the SDK.
There is a new render part called SkinnedMeshRenderPBS that provides a skinned mesh and textures for albedo and surface properties. The packing of the surface texture follows Unity’s standard shader conventions.
This render part is used for the controllers, with textures matching the real-world surface properties of the shipping Touch controllers. A reference shader is provided with the native sample, but because the native sample doesn’t have a physically based lighting model, it just displays the albedo texture. The Unity package now includes a reference shader that shows how to hook these render parts up to Unity’s physically-based lighting model.
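In a native render loop, handling the new part type looks roughly like this (the getter and field names are assumed from this drop’s headers, and BindTexture/DrawSkinnedMesh are stand-ins for your own renderer):

    #include <OVR_Avatar.h>

    // Hypothetical renderer hooks.
    void BindTexture(int slot, ovrAvatarAssetID textureID);
    void DrawSkinnedMesh(const ovrAvatarRenderPart_SkinnedMeshRenderPBS* pbs);

    void RenderPart(const ovrAvatarRenderPart* part)
    {
        switch (ovrAvatarRenderPart_GetType(part))
        {
        case ovrAvatarRenderPartType_SkinnedMeshRenderPBS:
        {
            const ovrAvatarRenderPart_SkinnedMeshRenderPBS* pbs =
                ovrAvatarRenderPart_GetSkinnedMeshRenderPBS(part);
            BindTexture(0, pbs->albedoTextureAssetID);   // base color
            BindTexture(1, pbs->surfaceTextureAssetID);  // packed PBS inputs
            DrawSkinnedMesh(pbs);
            break;
        }
        default:
            break;
        }
    }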
There is an additional new render part named ovrAvatarRenderPart_ProjectorRender. This render part provides a projection transform and material, and references another render part for the material to be projected onto. Though temporarily disabled, this is the technique we will be using for applying voice visualization effects on the avatar, and possibly to enable future stamp-based customization options.
Reference implementations of the projection technique are provided in both the Mirror sample and the Unity integration.
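In the native render loop, the projector case extends the same switch shown in the PBS sketch above (the indices and field names are assumed, and DrawProjected is a stand-in for your projection pass):

    // Added to the render-part switch from the PBS example:
    case ovrAvatarRenderPartType_ProjectorRender:
    {
        const ovrAvatarRenderPart_ProjectorRender* proj =
            ovrAvatarRenderPart_GetProjectorRender(part);
        // The projector references another render part by component and
        // render-part index; re-draw that target mesh with the projector's
        // material, using proj->localTransform as the projection matrix.
        const ovrAvatarComponent* target = GetComponent(proj->componentIndex); // hypothetical lookup
        const ovrAvatarRenderPart* targetPart = target->renderParts[proj->renderPartIndex];
        DrawProjected(targetPart, &proj->localTransform, &proj->materialState); // hypothetical pass
        break;
    }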
The ovrAvatarMaterialState object has been expanded with new features.
First, each texture field now has a companion Vector4 ScaleOffset value, used to adjust the tiling and offset of that texture at runtime.
Second, there is now a base mask type on the root object that modulates the overall alpha value of the avatar. This allows for material styles with transparency effects based on the object geometry, and enables a new set of “holographic” material types.
Third, a new “Pulse” mask type has been added. This mask creates “waves” that pulse along an axis over time, adding secondary motion and life to several of the materials.
All of these new features are implemented in the native and Unity reference shaders.
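For the ScaleOffset values specifically, the reference shaders apply the familiar tiling/offset convention: xy scales the UVs and zw offsets them (assumed to match Unity’s standard _ST packing). A minimal CPU-side illustration:

    // How a texture's Vector4 ScaleOffset remaps UVs: xy is the tiling
    // scale, zw is the offset (assumed to follow Unity's _ST convention).
    struct Vec2 { float x, y; };
    struct Vec4 { float x, y, z, w; };

    Vec2 ApplyScaleOffset(Vec2 uv, Vec4 so)
    {
        return { uv.x * so.x + so.z,
                 uv.y * so.y + so.w };
    }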
In addition to ovrAvatarBodyComponent and ovrAvatarHandComponent, you can now query explicitly for the left and right ovrAvatarControllerComponent and the ovrAvatarBaseComponent. These will contain the “semantic” state of each of these components.
The base component currently has no visual representation but will be connected to an optional visual marker for the avatar’s spawn position in an upcoming release.
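Querying them follows the same pattern as the existing body and hand component getters (the getter and field names below are assumed; confirm them against the headers in this drop):

    #include <OVR_Avatar.h>

    void ReadSemanticState(ovrAvatar* avatar)
    {
        // Getter names assumed to follow the body/hand component pattern.
        const ovrAvatarControllerComponent* leftController =
            ovrAvatarPose_GetLeftControllerComponent(avatar);
        const ovrAvatarControllerComponent* rightController =
            ovrAvatarPose_GetRightControllerComponent(avatar);
        const ovrAvatarBaseComponent* base =
            ovrAvatarPose_GetBaseComponent(avatar);

        if (base)
        {
            // No visuals yet; the base pose marks the avatar's spawn point.
            ovrAvatarTransform spawn = base->basePose; // field name assumed
            (void)spawn;
        }
        (void)leftController; (void)rightController;
    }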
The following features are either not implemented or partially implemented: