Integrating Meta Avatars with Application Spacewarp
I am working on an application that requires both Application Spacewarp and the Meta Avatars 2 SDK. In testing, I have determined that the Meta Avatar shaders are not generating the motion vectors Application Spacewarp needs to correctly render its synthesized frames. The rendering artifact that clued me into the issue was the opaque Meta Avatars stuttering; according to Meta's Application Spacewarp sample repo (https://github.com/oculus-samples/unity-appspacewarp), this is a sign that a shader is not generating motion vectors.

Has anyone worked with Application Spacewarp and the Meta Avatars SDK 2 yet who could advise me on how to address this issue? Any and all help would be greatly appreciated!

I do have a possible initial lead: in the Meta Avatars SDK 2 there is a Recommended folder with an "app_specific" subfolder containing two files, app-declarations and app_functions. These seem to exist so that app-specific functionality can be added to the Meta Avatar shader code. I am going to try experimenting with them, but I don't have much experience with this kind of thing, so if the right people find this, any guidance on that point would also be appreciated!
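For reference, turning Application Spacewarp on at runtime is a separate step from the shader work. Below is a minimal sketch, assuming the Oculus Integration's OVRManager.SetSpaceWarp API; the motion-vector fix itself has to happen in the avatar shader code, not in C#:

```csharp
using UnityEngine;

// Minimal sketch, assuming OVRManager.SetSpaceWarp from the Oculus
// Integration. This only enables Application Spacewarp at runtime; it does
// not add motion vectors to the avatar shaders. With it enabled, objects
// whose shaders write no motion vectors stutter exactly as described above.
public class AppSpaceWarpToggle : MonoBehaviour
{
    void Start()
    {
        OVRManager.SetSpaceWarp(true);
    }
}
```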
Meta Avatar Joint Transforms Are In Incorrect Position After Calling LoadUser() Again

I think I've found a bug in the Meta Avatars SDK version 17.2. After calling LoadUser() on an avatar entity that has already been loaded before (which may happen if you want to update an avatar's look), the transforms Joint Head, Joint LeftHandWrist and Joint RightHandWrist stop being in the correct positions and simply fix themselves at (0, 0, 0). Here are the steps to reproduce it:

1. In a blank scene, add an AvatarSdkManagerHorizon object and an empty GameObject with a SampleAvatarEntity component.
2. Set the SampleAvatarEntity's BodyTracking input to be the AvatarSdkManagerHorizon's SampleInputManager.
3. Add some code to the SampleAvatarEntity which will enable you to call LoadUser() at runtime (see the sketch after this list).
4. Ensure you have UseStandalonePlatform checked in your OculusPlatformSettings so that your own avatar loads.
5. Connect your headset with Quest Link and run the scene to let your avatar load.
6. In the hierarchy, see how Joint Head is in the correct place.
7. Now manually call LoadUser() and see how Joint Head is no longer in the correct place.
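A minimal sketch of the runtime LoadUser() hook from step 3, assuming LoadUser() is visible to subclasses of SampleAvatarEntity; the class names and key binding are illustrative, not part of the SDK:

```csharp
using UnityEngine;
// If your SDK version places SampleAvatarEntity in a namespace, add the
// corresponding using directive; in the samples it is in the global namespace.

// Hypothetical subclass exposing the (assumed protected) LoadUser()
// so a debug driver can re-trigger it at runtime.
public class ReloadableAvatarEntity : SampleAvatarEntity
{
    public void ReloadUser() => LoadUser();
}

// Debug driver: press R while the scene runs, then watch Joint Head,
// Joint LeftHandWrist and Joint RightHandWrist in the hierarchy.
public class AvatarReloadDriver : MonoBehaviour
{
    [SerializeField] private ReloadableAvatarEntity entity;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.R))
        {
            entity.ReloadUser();
        }
    }
}
```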
Meta Avatar Critical Joint Transforms go to zero position after calling LoadUser()

Our application depends upon attaching objects to each avatar's hands and head. Unfortunately, this means we can't use LoadUser() to update an avatar's appearance if they changed it, because there seems to be a bug where the critical joint transforms just fall to the local zero position when LoadUser() is called, so we no longer know where the user's wrists and head are. Here are the steps to reproduce:
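As a stopgap for the attachment problem specifically, one hedged workaround sketch: re-resolve the joint transforms by hierarchy name after every LoadUser() call instead of caching references across loads. The joint names are the ones reported in these threads; when exactly to call Reattach() after a load is an assumption left to the application:

```csharp
using UnityEngine;

// Hedged workaround sketch: instead of holding joint references across
// LoadUser() calls (which reportedly leaves them at local zero), look the
// joints up again by name after each load and re-parent attachments.
public class AvatarAttachmentRebinder : MonoBehaviour
{
    [SerializeField] private Transform avatarRoot;
    [SerializeField] private Transform headAttachment;

    // Call this after every LoadUser() so the attachment follows the
    // freshly created joint rather than a stale, zeroed-out reference.
    public void Reattach()
    {
        Transform head = FindDeep(avatarRoot, "Joint Head");
        if (head != null)
        {
            headAttachment.SetParent(head, worldPositionStays: false);
            headAttachment.localPosition = Vector3.zero;
            headAttachment.localRotation = Quaternion.identity;
        }
    }

    // Depth-first search for a descendant transform by exact name.
    private static Transform FindDeep(Transform node, string name)
    {
        if (node.name == name) return node;
        foreach (Transform child in node)
        {
            Transform found = FindDeep(child, name);
            if (found != null) return found;
        }
        return null;
    }
}
```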
Meta Avatar Multiple precompiled assemblies error in Oculus Integration v44 (new issues in 2022)

I have a Unity project in version 2021.1.3.11f1, using Oculus Integration version 44 from the Unity Asset Store. I used the integration to enable SpaceWarp, and the next step for my project was to import Meta Avatars, but I ran into an issue that seems to have been covered here previously. When bringing in the Meta Avatars SDK version 17.2, I get precompiled assembly errors. A previous post on the Meta forums had a solution, which was to exclude the Newtonsoft.Json.dll that came with the Avatars package, but that was with Oculus Integration v35 and it seems to no longer work (https://forums.oculusvr.com/t5/Get-Help/Meta-Avatar-Error-Multiple-precompiled-assemblies/td-p/929944). When attempting that solution with the newest versions of everything, I have to restart Unity in safe mode and get three errors: the namespace 'OvrAvatarTrackingSkeleton' could not be found in two scripts, and 'OvrSpan<>' could not be found in a third. To my knowledge I am using the newest version of each component here, so I can't understand how they would not work with each other. If it helps, I only have the Oculus Integration Spatializer and VR components installed from the Unity package.
Optimizing avatar instantiation. High cost of creating & uploading textures to GPU

Hello, I am trying to optimize instantiating avatars at runtime using Unity 2021.3.5f1. I am using GPU skinning for improved runtime performance. Currently, when I create a new avatar, the frame time spikes for a few frames while textures are created: three textures are created over three frames. The three functions that run are:

- OvrAvatarGpuSkinnedPrimitive.CreateMorphTargetSourceTex
- OvrAvatarGpuSkinnedPrimitive.CreateJointsTex
- OvrAvatarGpuSkinnedPrimitive.CreateNeutralPoseTex

I have tried reducing the texture precision to reduce the memory created, but this has little effect. I also looked into pooling the textures, but their dimensions are based entirely on the specific avatar being loaded, so this was not possible since I do not know the avatars ahead of time. Does anyone have any insights or tips on how I can reduce or mitigate this cost? Thank you
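One mitigation, sketched below under the assumption that the per-avatar spike itself cannot be avoided: queue avatar loads and space them out so the three texture builds never stack up for multiple avatars in the same frame window. The Action-based trigger is a hypothetical stand-in for whatever calls LoadUser in your project:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Hedged mitigation sketch: this does not remove the cost of
// CreateMorphTargetSourceTex / CreateJointsTex / CreateNeutralPoseTex; it
// only keeps multiple avatars from paying it in the same few frames.
public class AvatarLoadQueue : MonoBehaviour
{
    [SerializeField] private float secondsBetweenLoads = 0.5f;
    private readonly Queue<Action> pending = new Queue<Action>();

    // Enqueue whatever triggers one avatar load (hypothetical stand-in).
    public void Enqueue(Action loadAvatar) => pending.Enqueue(loadAvatar);

    private IEnumerator Start()
    {
        while (true)
        {
            if (pending.Count > 0)
            {
                pending.Dequeue().Invoke();
                // Give the GPU-skinning texture creation breathing room
                // before the next load begins.
                yield return new WaitForSeconds(secondsBetweenLoads);
            }
            else
            {
                yield return null;
            }
        }
    }
}
```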
How to detect if a user doesn't have a Meta Avatar or if they have been assigned a default avatar?

I am working with the Meta Avatars SDK and can't find any way to detect whether a user doesn't already have a Meta Avatar, and, as part of this, whether the user has simply been assigned a default avatar. I have the ability to let the user open the customise-avatar menu via a deeplink call, but no way to detect whether I should first tell the user to create an avatar if they don't have one.
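For what it's worth, recent Meta Avatars SDK versions appear to expose a "has avatar" query on OvrAvatarManager. A hedged sketch, assuming UserHasAvatarAsync and its result codes exist under these names in your SDK version; verify against your copy before relying on it:

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Hedged sketch: queries whether the given user has a custom Meta Avatar.
// OvrAvatarManager.UserHasAvatarAsync and HasAvatarRequestResultCode are
// assumed from recent SDK versions; check the names against your SDK.
public class AvatarPresenceCheck : MonoBehaviour
{
    public async void CheckForAvatar(ulong userId)
    {
        var result = await OvrAvatarManager.Instance.UserHasAvatarAsync(userId);
        if (result == OvrAvatarManager.HasAvatarRequestResultCode.HasNoAvatar)
        {
            // No custom avatar: this is the point to prompt creation,
            // e.g. via the avatar-editor deeplink mentioned above.
            Debug.Log("User has no Meta Avatar yet.");
        }
    }
}
```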
Photon Voice And Meta Avatars 2

I'm wondering whether any of you have managed to resolve a conflict between how Photon Voice and the new Meta Avatars access the microphone. When the PhotonVoiceNetwork is initialised, it will eventually call Microphone.Start() in order to get at the microphone's input. However, in doing so it creates a new AudioClip, which means the audio clip that the Meta Avatar's LipSyncInput was depending on no longer matches the audio coming through the microphone. If I then call LipSyncInput.StartMicrophone(), it will also eventually call Microphone.Start(), which creates a new AudioClip, making the audio clip that Photon's Recorder was depending on no longer match the audio coming through the microphone. In short, I can get either lip syncing using my mic input OR Photon using my mic input, but not both at the same time.
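One direction, sketched under heavy assumptions: make a single component own the Microphone.Start() call and have both consumers read samples from the shared clip. Wiring Photon's Recorder (e.g. via its factory input) and the avatar LipSyncInput to such a feed means replacing their built-in microphone handling, which is the hard part this sketch leaves open:

```csharp
using UnityEngine;

// Hedged workaround sketch: a single owner of the microphone AudioClip.
// Both voice transmission and lip sync would read from this one clip
// instead of each calling Microphone.Start() themselves. Assumes a mono
// microphone clip; a null device name means the default microphone.
public class SharedMicrophoneFeed : MonoBehaviour
{
    public AudioClip MicClip { get; private set; }
    private int readHead;

    void Start()
    {
        // Open the default mic exactly once: a looping 1-second clip at 48 kHz.
        MicClip = Microphone.Start(null, true, 1, 48000);
    }

    // Returns all samples recorded since the last call, handling the
    // wrap-around of the looping clip manually.
    public float[] ReadNewSamples()
    {
        int writeHead = Microphone.GetPosition(null);
        int available = (writeHead - readHead + MicClip.samples) % MicClip.samples;
        var buffer = new float[available];
        if (available == 0) return buffer;

        if (readHead + available <= MicClip.samples)
        {
            MicClip.GetData(buffer, readHead);
        }
        else
        {
            // The unread region spans the end of the looping clip.
            int firstChunk = MicClip.samples - readHead;
            var head = new float[firstChunk];
            var tail = new float[available - firstChunk];
            MicClip.GetData(head, readHead);
            MicClip.GetData(tail, 0);
            head.CopyTo(buffer, 0);
            tail.CopyTo(buffer, firstChunk);
        }
        readHead = writeHead;
        return buffer;
    }
}
```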
Control movement of avatar using GPS data

I have an event which is both in Horizon Worlds and at a physical venue. If I create a duplicate of the venue in Horizon Worlds, could I track visitors' phone GPS locations (or give them trackable wristbands) and then communicate this location data into Horizon to move their avatars around the 3D map?
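The coordinate math for the venue mapping, at least, is straightforward. A hedged sketch using an equirectangular approximation (accurate at venue scale), with no claim about how the result would actually be fed into Horizon Worlds, since that is the open question here:

```csharp
using System;

// Hedged sketch: converts a GPS fix into venue-local east/north offsets in
// meters, relative to an anchor point surveyed at the physical venue.
// Nothing here is a Horizon Worlds API; it is only the mapping math.
public static class GpsToVenue
{
    private const double EarthRadiusMeters = 6371000.0;
    private const double Deg2Rad = Math.PI / 180.0;

    // Returns (east, north) in meters from the venue anchor point.
    public static (double east, double north) ToLocalMeters(
        double lat, double lon, double anchorLat, double anchorLon)
    {
        double north = (lat - anchorLat) * Deg2Rad * EarthRadiusMeters;
        double east = (lon - anchorLon) * Deg2Rad * EarthRadiusMeters
                      * Math.Cos(anchorLat * Deg2Rad);
        return (east, north);
    }
}
```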