Need help using preset avatars from the Meta Avatars SDK
Device: Quest 3

I have to run an experiment for my project that requires me to assign a participant an avatar from a list of avatars. Once the participant selects an avatar from the list, the full-body avatar should appear in front of them from a third-person perspective (3PP), and they should also be able to see the avatar's hands superimposed on their own (1PP, this is important). Both the 3PP and 1PP avatars should mimic the user's hand and head movements.

So far, the existing MirrorScene (https://developers.meta.com/horizon/documentation/unity/meta-avatars-samples#mirror-scene) is working for me with Passthrough and a Grab cube interaction. I can see the avatar in 1PP and 3PP. I need to know how I can swap out this avatar for another; I believe the current avatar is a fallback avatar.

NOTE: I want to use the existing presets available for Quest. There are 33 preset avatars labelled 0_quest, 1_quest, ..., 32_quest. I will only be using a handful of avatars from these presets. I am NOT using avatars from users' Meta profiles.

P.S.: I gave Mixamo a shot, but it's kinda time-consuming to make each joint work as I want, and making the Meta XR Interaction SDK work with Mixamo models is a pain.

TL;DR: I want to swap the avatar in the MirrorScene with other presets available locally. How do I do it? The first-person perspective (1PP) avatar is crucial. I can't find many resources to guide me with this: https://developers.meta.com/horizon/documentation/unity/meta-avatars-overview

Just FYI, I am familiar with coding but have limited knowledge of Unity and Quest.
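A hedged sketch of how swapping to a local preset might look, modeled on the SampleAvatarEntity pattern from the Meta Avatars SDK samples. The method names used here (Teardown, CreateEntity, LoadAssetsFromZipSource) and the "N_quest" zip-relative path format are assumptions taken from the sample code and may differ between SDK versions — verify against SampleAvatarEntity.cs in your project before relying on this:

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Hypothetical sketch: load a local Quest preset avatar by index.
// API names assumed from the Meta Avatars SDK samples; check your
// SDK version's SampleAvatarEntity.cs for the exact calls.
public class PresetAvatarSwitcher : OvrAvatarEntity
{
    [SerializeField] private int presetIndex = 0; // 0..32 -> "0_quest".."32_quest"

    protected override void Awake()
    {
        base.Awake();
        LoadPreset(presetIndex);
    }

    public void LoadPreset(int index)
    {
        Teardown();     // unload whatever avatar is currently attached
        CreateEntity(); // re-create the native avatar entity

        // Presets ship inside the PresetAvatars_Quest zip under the SDK's
        // sample/streaming assets; the loader resolves "<index>_quest"
        // inside that archive (path format assumed here).
        LoadAssetsFromZipSource(new[] { $"{index}_quest" });
    }
}
```

Replacing the avatar entity component in the MirrorScene with a subclass like this (rather than editing the sample in place) keeps the scene's 1PP/3PP tracking setup intact while letting you change only the asset source.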
Unity + OpenXR + OVRSpatialAnchor + Passthrough = Wobbly models

Previously we were using the Oculus XR Plugin with OVRSpatialAnchors and passthrough, and all was working as intended. We needed to switch from the Oculus XR Plugin to OpenXR (for a separate package) and managed to get everything working, except that when we put an OVRSpatialAnchor on a model with passthrough active, the model appears very wobbly compared to its intended stable state. Once the spatial anchor has been removed from the model it appears much more stable, but obviously no longer has the other characteristics of a spatial anchor that we need. Does anyone know how to fix this? I saw a post with a similar issue and I updated the AndroidManifest.xml via the Oculus tools menu, but that did not seem to fix it.

Potentially relevant details:
- Unity 2020.3.36
- MRTK v2.7.2
- OpenXR plugin v1.5.3
- Oculus XR plugin v1.12.0
- Oculus Integration v53.1
- Minimum Android API Level: Android 10 (API level 29)
- OpenXR settings:
  - Render mode: Single Pass Instanced/Multi-view
  - Depth Submission Mode: Depth 16 Bit
  - Interaction Profiles: Oculus Touch Controller Profile
  - OpenXR Feature Groups: Hand Tracking, MetaXR Feature, Oculus Quest Support

I have tried turning off passthrough and saw that the models became stable again, but that is not acceptable. I tried updating the AndroidManifest.xml as stated in the linked post, but that did not change anything. My code hasn't changed going from the Oculus XR Plugin to the OpenXR Plugin; it can be summed up as just adding the OVRSpatialAnchor component: model.AddComponent<OVRSpatialAnchor>();
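For reference, one thing worth checking with that one-liner: anchor creation is asynchronous, so a model can jitter while the runtime is still localizing the anchor. A minimal sketch of attaching the anchor and waiting until the runtime reports it as created before trusting its pose, assuming the Oculus Integration OVRSpatialAnchor API (the Created and Uuid members):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: attach an OVRSpatialAnchor and wait until the runtime reports
// it as created before relying on the anchored pose.
public class AnchorAttacher : MonoBehaviour
{
    public IEnumerator AttachAnchor(GameObject model)
    {
        var anchor = model.AddComponent<OVRSpatialAnchor>();

        // Creation is asynchronous; Created stays false until the runtime
        // has finished setting up the anchor.
        while (anchor != null && !anchor.Created)
            yield return null;

        if (anchor != null)
            Debug.Log($"Anchor created: {anchor.Uuid}");
    }
}
```

This does not by itself fix wobble caused by the passthrough/OpenXR combination, but it rules out "pose read before the anchor was ready" as a contributing factor.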
PC VR - Passthrough AirLink - OpenXR

Hi there, I don't have much experience with OpenXR. I was wondering, just briefly: is it possible to use passthrough via the OpenXR extension (XR_FB_passthrough) when using a Quest 2 Air-Linked to a PC? So the app will be built and running on the PC, and the client HMD is the Quest 2. My issue is that since the app won't be running natively on the Quest 2, can it still access the necessary cameras via Air Link? Or is it only possible to use this when developing directly for mobile/Android apps running on the Quest 2 itself? Thank you.
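One way to answer this empirically is to ask the active runtime whether it exposes the extension at all. A sketch assuming Unity's OpenXR plugin, whose OpenXRRuntime.IsExtensionEnabled reports what the connected runtime (here, the Oculus PC runtime over Air Link) actually enabled:

```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;

// Sketch: log whether the active OpenXR runtime reports XR_FB_passthrough
// as enabled. Over Link/Air Link this reflects the PC runtime, not the
// headset's native runtime.
public class PassthroughExtensionCheck : MonoBehaviour
{
    private void Start()
    {
        bool enabled = OpenXRRuntime.IsExtensionEnabled("XR_FB_passthrough");
        Debug.Log($"XR_FB_passthrough enabled by runtime: {enabled}");
    }
}
```

If the PC runtime does not list the extension, no amount of app-side code will bring the passthrough feed across the Link connection.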
Unity Meta OpenXR does not work well with Oculus Link

We are working on a Unity MR application for the Quest 3 using the new OpenXR Meta Feature Group (Unity OpenXR: Meta | Unity OpenXR Meta | 1.0.0 (unity3d.com)); we use the XR Interaction Toolkit to create an app with hand-tracking and passthrough. We made Android builds, and those work fine. We are however running into 2 major issues:
1. When using Android as the build platform, Oculus Link is not activated, so we cannot test from the editor.
2. When using Windows as the build platform, Oculus Link works, but the eyes are extremely cross-eyed, and passthrough is not enabled (as it's not available for Windows).

Is there any way to run/test the app from the editor with passthrough, preferably using Android as the build platform? Thank you very much for any support you can provide.
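Short of getting passthrough itself working over Link, one common workaround for editor testing is to degrade gracefully: detect the editor and swap the transparent camera clear (which passthrough compositing expects) for a solid color so the MR content stays testable. This is a sketch using only core Unity camera APIs; the assumption that passthrough relies on a transparent camera background matches the Unity OpenXR Meta samples but should be verified for your setup:

```csharp
using UnityEngine;

// Sketch: when passthrough isn't available (e.g. running over Link in the
// editor), fall back to a solid background so the scene remains testable.
public class EditorPassthroughFallback : MonoBehaviour
{
    [SerializeField] private Camera xrCamera;

    private void Start()
    {
        if (Application.isEditor)
        {
            // No passthrough over Link on Windows: replace the transparent
            // clear that passthrough compositing expects with a plain color.
            xrCamera.clearFlags = CameraClearFlags.SolidColor;
            xrCamera.backgroundColor = Color.gray;
        }
    }
}
```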
Passthrough over Link not working after switch to OpenXR Plug-In

I am using Unity version 2021.3.14, Oculus Integration v51, and OpenXR 1.7.0. I was on the Oculus XR Plug-In before and switched to the OpenXR Plug-In. Before, Passthrough over Link worked: it was rendered in the glasses when I activated Play Mode in the editor. Now, it gives me this error and does not render Passthrough when I start Play Mode: "Failed to initialize Insight Passthrough. Passthrough will be unavailable. Error Failure_OperationFailed."

Also, albeit probably unrelated, I am getting this error when stopping Play Mode since I switched to the OpenXR Plug-In: "InputSubsystem not found." Please help; it is very difficult to debug Passthrough-related things if I can only see the Passthrough in builds.
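For the "InputSubsystem not found" part, it can help to see which XR input subsystems are actually alive after the plug-in switch. A small debug sketch using Unity's SubsystemManager (core Unity API, available in 2021.3):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Debug sketch: list the XR input subsystems that are currently loaded,
// useful when chasing "InputSubsystem not found" after switching XR plug-ins.
public class XRSubsystemDebug : MonoBehaviour
{
    private void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);

        if (subsystems.Count == 0)
        {
            Debug.LogWarning("No XRInputSubsystem is loaded.");
            return;
        }

        foreach (var s in subsystems)
            Debug.Log($"XRInputSubsystem found, running: {s.running}");
    }
}
```

An empty list in the editor usually points at the XR Plug-in Management settings for the Standalone (editor) platform rather than at your scene code.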