When are hands loaded from OVR Avatar? Is there an event I can subscribe to?

DatDudeAnt
Explorer

So I'm developing a VR game with a full-body avatar for the Oculus Quest. I've already developed the player rig for SteamVR and would like to convert it to Oculus. I've managed to get the arms and legs working, but now I need to rig the fingers properly. For SteamVR, I can listen for RenderModelLoaded, which tells me when the hands are there and available. When those hands spawn in, I "link" the movement of the SteamVR fingers to the fingers of the full-body avatar, so when using the Index controllers, the player has finger tracking. Following the same architecture, I'd like to use the hands that spawn in from the OVR Avatar, but I don't know if there's a function I can listen for or an event I can subscribe to that says "Hey, the hands have loaded; now rig the avatar."


I'm curious how others are currently detecting when the hands have loaded. I've also thought about creating a set of custom hands that get loaded in, and I'm curious how others have approached that method as well.

 


Any assistance is greatly appreciated!


6 REPLIES

julienkay
Adventurer

I was facing the same issue and found absolutely no better way than to implement such an event myself by accessing the private assetsFinishedLoading field of an OvrAvatar via Reflection.

 

Code goes something like this (note that this checks whether the whole avatar has finished loading, not just the hands):

using System.Reflection;
using UnityEngine;

public class AvatarLoadWatcher : MonoBehaviour {
    // Cached handle to the private OvrAvatar.assetsFinishedLoading field.
    private static readonly FieldInfo AssetsFinishedLoading =
        typeof(OvrAvatar).GetField("assetsFinishedLoading",
            BindingFlags.NonPublic | BindingFlags.Instance);

    public OvrAvatar avatar;   // the OvrAvatar instance to watch
    private bool loaded;

    private void Update() {
        if (loaded) return;
        loaded = (bool)AssetsFinishedLoading.GetValue(avatar);
        if (loaded) {
            // Invoke your AvatarLoaded event here
        }
    }
}

You could also just make the field public, I guess, but you'd need to redo that every time you upgrade the integration. Pick your poison.

 

I don't have any advice on your second question unfortunately.

 

DatDudeAnt
Explorer

@julienkay Thank you for your response. I had considered making a child class that inherits from OVRAvatar and then creating an event that gets fired at the end of the base function AssetLoadedCallback() or CombinedMeshLoadedCallback(), but I wasn't sure whether those were the exact points where the hands get loaded.

 

Have you ever looked at those functions? From your perspective, would this approach be a viable option?

 

I didn't know assetsFinishedLoading was a field I could look at. I appreciate your insight and will definitely try your approach.


julienkay
Adventurer


@DatDudeAnt wrote:

I didn't know assetsFinishedLoading was a field I could look at. I appreciate your insight and will definitely try your approach.


Just to be clear, I usually only suggest using Reflection as an absolute last-ditch effort.

 

Both AssetLoadedCallback() and CombinedMeshLoadedCallback() seem to be related to 'internal' loading and are invoked before the actual Unity GameObjects are created, which I think is not what you want.

 

Of course, there's also the whole "new Avatar system is right around the corner" problem. I'm not sure I would put a lot of time into writing complex code for the current system.

Anonymous
Not applicable

I agree with @julienkay about the "new Avatar system." There may have been tweaks to the native OVRAvatar library, but the Unity C# interface and shader code have been stagnant for at least a year. The fact that avatars still don't support URP and lack these basic loading events is an indication of that.

 

The hacky solution I used was a coroutine that spun until the transform hierarchy under the avatar prefab changed, roughly like the sketch below. I've stopped using OVRAvatar entirely, though.
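For reference, the polling pattern was something like this. The class name, the avatarRoot field, and the OnAvatarSpawned callback are placeholders of mine, not part of the Oculus Integration:

using System.Collections;
using UnityEngine;

public class AvatarHierarchyWatcher : MonoBehaviour {
    public Transform avatarRoot;            // root of the OvrAvatar instance
    public System.Action OnAvatarSpawned;   // hypothetical callback to fire once

    private IEnumerator Start() {
        // Count every transform under the root (inactive ones included).
        int initialCount = avatarRoot.GetComponentsInChildren<Transform>(true).Length;
        // Spin until the avatar adds its runtime-loaded transforms (hands, bones, etc.).
        while (avatarRoot.GetComponentsInChildren<Transform>(true).Length == initialCount) {
            yield return null;   // check again next frame
        }
        OnAvatarSpawned?.Invoke();
    }
}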

 

I'd recommend looking at other solutions for animating/posing hand meshes:

https://github.com/MephestoKhaan/HandPosing_Demo

https://assetstore.unity.com/packages/templates/systems/vr-interaction-framework-161066

 

(There's also a CustomHands scene in Oculus/SampleFramework of the integration asset.)

DatDudeAnt
Explorer

@Anonymous Thanks for the feedback. I wasn't sure if I was going to get any kind of response, so I appreciate the information.

 

Are there any reference articles mentioning the new Avatar system? Since posting my question, this is the first time I've heard about it. I did see this: https://developer.oculus.com/downloads/package/oculus-avatar-sdk/

but all it mentions is that developers should upgrade when Avatar 2.0 is released.

 

I have considered creating custom hands, like the ones in the Oculus/SampleFramework, but when I saw the hands were available via OVR Avatar, I liked how closely it matched SteamVR. For SteamVR, the hands for the Interaction System are spawned in; once they're spawned and ready, I "rig" the full-body player avatar to those hands. I wanted to keep the architecture of my code the same, so if I had an event I could listen for in OVR Avatar, it wouldn't be a problem. Although there is a new avatar system coming, I probably won't be using it the way it was intended. In my case, I'd just be getting the real-time animation data from the OVR and SteamVR hands and mapping it to the full-body player, along the lines of the sketch below. The hands would be hidden from view, but they're essentially there.
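To illustrate the kind of mapping I mean, a rough sketch; the bone arrays are placeholders you'd populate by matching the two hierarchies yourself:

using UnityEngine;

public class FingerRetargeter : MonoBehaviour {
    public Transform[] sourceFingerBones;   // finger bones on the (hidden) OVR/SteamVR hand
    public Transform[] targetFingerBones;   // matching finger bones on the full-body avatar

    private void LateUpdate() {
        // Copy each tracked finger bone's local rotation onto the avatar's bone,
        // after all animation has run for the frame.
        for (int i = 0; i < sourceFingerBones.Length; i++) {
            targetFingerBones[i].localRotation = sourceFingerBones[i].localRotation;
        }
    }
}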

 

I have strongly considered going the custom hands route via the Oculus/SampleFramework method. The only difference is that the hands are already in the scene versus being spawned in, which breaks from the current SteamVR implementation. Not that I'm stuck with that method; I'm just looking to keep things consistent and write clean code.

 

Once again thanks for your response. 

Anonymous
Not applicable
(Accepted Solution)

Here's an article about the new avatars:

https://www.theverge.com/2021/4/23/22398060/oculus-new-avatars-editor-features-vr-virtual-reality-fa...

They're already available in PokerStars VR (one of the only shipped applications that used the original OVRAvatars). I don't know the timeline for the general rollout.

 

It shouldn't be difficult to instantiate the custom hands at runtime using Resources.Load. From there, you could do the same thing you're doing with the SteamVR/OVRAvatar hands: let the bones from the hands puppet your avatar's hands.
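A rough sketch of that idea, assuming the hand prefabs live under a Resources folder; the paths and anchor fields here are made up for the example:

using UnityEngine;

public class CustomHandSpawner : MonoBehaviour {
    public Transform leftAnchor;    // e.g. the rig's left hand anchor
    public Transform rightAnchor;

    private void Start() {
        // Hypothetical prefab paths; adjust to wherever you place the hands.
        var leftPrefab = Resources.Load<GameObject>("Hands/CustomHandLeft");
        var rightPrefab = Resources.Load<GameObject>("Hands/CustomHandRight");
        Instantiate(leftPrefab, leftAnchor, false);
        Instantiate(rightPrefab, rightAnchor, false);
        // From here, link the spawned hands' finger bones to the body avatar,
        // just like the SteamVR RenderModelLoaded flow.
    }
}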

 

It's great that the OVRAvatar system provides well-registered and animated hands *almost* for free, but I spent ages fighting with OVRAvatars to do something similar to what you're doing. It required hacks and patches to the OVRAvatar C# code that had to be reapplied every time a new Oculus Integration came out. It ended up not being worth it.