Need help using preset avatars from the Meta Avatars SDK
Device: Quest 3

I have to run an experiment for my project that requires assigning each participant an avatar from a list. Once the participant selects an avatar, the full-body avatar should appear in front of them in third-person perspective (3PP), and they should also see the avatar's hands superimposed on their own (1PP; this is important). Both the 3PP and 1PP avatars should mimic the user's hand and head movements.

So far, the existing MirrorScene works for me with Passthrough and a Grab cube interaction: https://developers.meta.com/horizon/documentation/unity/meta-avatars-samples#mirror-scene . I can see the avatar in 1PP and 3PP. I need to know how I can swap this avatar for another; I believe the current avatar is a fallback avatar.

NOTE: I want to use the existing presets available for Quest. There are 33 preset avatars labelled 0_quest, 1_quest, ..., 32_quest, and I will only be using a handful of them. I am NOT using avatars from users' Meta profiles.

P.S.: I gave Mixamo a shot, but it is time-consuming to make each joint behave as I want, and getting the Meta XR Interaction SDK to work with Mixamo models is a pain.

TL;DR: I want to swap the avatar in the MirrorScene for other presets available locally. How do I do it? The first-person perspective (1PP) avatar is crucial. I can't find many resources to guide me with this: https://developers.meta.com/horizon/documentation/unity/meta-avatars-overview

Just FYI, I am familiar with coding but have limited knowledge of Unity and Quest.
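For orientation, the Avatars SDK sample scenes ship the preset avatars as zipped asset files (loaded by a path-like name such as "0_quest"), so swapping presets amounts to tearing down the currently loaded (fallback) avatar entity and loading a different preset path. The shape of that flow, in pseudocode with all identifiers illustrative rather than verified SDK API:

```
// Pseudocode – names are illustrative, not verified API.
// Check SampleAvatarEntity / OvrAvatarEntity in the Avatars SDK
// samples for the real loading entry points.
onAvatarSelected(index):                     // one of the handful of chosen presets
    path = index + "_quest"                  // e.g. "4_quest", matching the preset labels
    avatarEntity.teardownCurrentAvatar()     // unload the fallback avatar
    avatarEntity.loadPresetFromLocalZip(path) // load the selected preset's assets
```

The same entity drives both views, so a 1PP (hands-only) and a 3PP (full-body) instance would each be reloaded with the selected preset.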
Cross Post: Issues with v71 and vertex shader compilation in OpenGL

https://communityforums.atmeta.com/t5/Get-Help/Issues-with-OpenGL-Vertex-Shaders-on-v71/m-p/1259298#M340899

I may have posted the original question in the wrong place. I have an engine with OpenXR implemented natively.
Hand Tracking Issue with OpenXR on Vive Focus 3 and Meta Quest 3

Thank you for taking the time to read my question. I am developing content that uses hand tracking with OpenXR, but hand tracking has suddenly stopped working. While hand tracking was functional, the Input Debugger showed "XRHandDevice" for hand values; it now shows "OculusTouch", as seen in the attached image. Even in a newly created project it does not display "XRHandDevice", and hand tracking is unavailable. If anyone has encountered this issue or knows a solution, I would greatly appreciate your help. Thank you in advance!
Accelerometer Sampling Frequency

Hi, for an app that I created, I was wondering: what is the maximum sampling rate I can achieve from the IMU sensors (accelerometer and gyroscope)? I am currently running hello_xr, a native app Meta has published, from Android Studio and logging the data in logcat; this gives me a 70 Hz sampling rate. What is the highest possible sampling rate on Quest 2 or 3, and how can I achieve it in my app?
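Independent of whatever maximum the headset allows, the effective rate can be measured directly from the logged event timestamps. A minimal sketch; the helper below is hypothetical (not part of any Meta SDK), and assumes nanosecond timestamps, which is what Android sensor events typically carry:

```typescript
// Estimate the effective sampling rate of a sensor stream from
// consecutive event timestamps (in nanoseconds), e.g. values pulled
// out of the logcat output described above. Hypothetical helper.
function sampleRateHz(timestampsNs: number[]): number {
  if (timestampsNs.length < 2) {
    throw new Error("need at least two samples");
  }
  const spanSeconds =
    (timestampsNs[timestampsNs.length - 1] - timestampsNs[0]) / 1e9;
  // n samples span n - 1 inter-sample intervals.
  return (timestampsNs.length - 1) / spanSeconds;
}
```

Averaging over the whole span (rather than a single delta) smooths out per-event jitter in the log timestamps.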
WebXR: Meta Quest Browser crashes when exiting VR or navigating link

With the latest version of the Meta Quest Browser (32.1), I currently face the following WebXR issues (prerequisite: the user has opened a site in the browser in VR mode):

- When VR mode is exited programmatically (e.g. by xrSession.end()), the browser crashes.
- When trying to navigate to another site, the browser crashes as well.

I tested this with several WebXR frameworks and had the same issue, so I assume it is a bug in the Meta Quest Browser. This definitely worked with previous versions of the browser. Example: https://playground.babylonjs.com/#C7HSTA#5

P.S.: If there is a better place to post such bug reports, please let me know 🙂
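A browser crash is ultimately a browser bug, but one workaround worth trying is to defer the navigation until the session's "end" event fires, rather than navigating immediately after calling end(). A minimal sketch, with XRSession reduced to a stand-in interface covering only the two members used here (in a real page this would be the session returned by navigator.xr.requestSession):

```typescript
// Stand-in for the real XRSession: end() returns a promise, and the
// session fires an "end" event once the browser finishes VR teardown.
interface SessionLike {
  end(): Promise<void>;
  addEventListener(type: "end", listener: () => void): void;
}

// Run the navigation only after the session reports it has ended,
// instead of racing the browser's VR teardown.
function exitVrThenNavigate(session: SessionLike, navigate: () => void): void {
  session.addEventListener("end", navigate);
  void session.end();
}
```

Sequencing off the "end" event will not fix a genuine crash inside the browser, but it rules out a teardown race on the page's side when reporting the bug.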
Build and Run hello_xr Sample App Error

PS D:\OpenXR-SDK-Source-main\build\win64> cmake -G "Visual Studio 17 2022" -A x64 ..\..
-- Selecting Windows SDK version 10.0.19041.0 to target Windows 10.0.22631.
-- Enabling OpenGL support
-- Could NOT find Vulkan (missing: Vulkan_LIBRARY Vulkan_INCLUDE_DIR)
-- Could NOT find JsonCpp (missing: JsonCpp_INCLUDE_DIR JsonCpp_LIBRARY)
D:/ovr_openxr_mobile_sdk_62.0/OpenXR/Libs/Android//
-- Enabling OpenGL support in hello_xr, loader_test, and conformance, if configured
-- Could NOT find glslc, using precompiled .spv files
-- OpenXR 1.0.34
-- Configuring done
CMake Error at src/tests/c_compile_test/CMakeLists.txt:30 (add_executable):
  Target "openxr_c_compile_test" links to target "OpenXR::openxr_loader" but
  the target was not found. Perhaps a find_package() call is missing for an
  IMPORTED target, or an ALIAS target is missing?

I tried to log ANDROID_ABI, but it has no value. How can I fix it?
OpenXR crash (with OpenComposite) in PTC v59

With the stable branch of the Oculus app on Windows, everything works fine and I don't have this problem. (Although I do get a huge number of colorful pixel artifacts there, which is why I use the Beta branch, which shows fewer artifacts; that is not related to this topic.) With the PTC Beta, however, there is a problem: I get an error when running games with OpenComposite. Here's the log: