Meta Interaction SDK Version Mismatch in UE 5.7.4 (Requires MetaXR 1.233.0 but found 1.201.0)
I am working on a VR project using Unreal Engine 5.7.4 and am running into a version mismatch between the Meta Interaction SDK and the Meta XR plugin.

The issue: Upon launching the editor, I receive the following warning:

“Meta Interaction SDK version '1.201.0' (v201) requires MetaXR Version '1.233.0' (v201) but found '1.201.0' (v169). Mismatched versions may result in unexpected behavior.”

Current setup:
- Engine version: Unreal Engine 5.7.4 (Epic Games Launcher)
- Meta Interaction SDK: version 1.201.0
- Meta XR plugin: version 1.201.0 (reported as v169 by the engine/plugin)

Despite both plugins showing as version 1.201.0 in the Unreal Engine Plugins window, the Interaction SDK specifically requests version 1.233.0 of the Meta XR plugin to maintain compatibility.

Troubleshooting steps taken:
- Verified that both plugins are enabled in the Installed > Virtual Reality section.
- Updated DefaultBuildSettings to BuildSettingsVersion.V6 and IncludeOrderVersion to EngineIncludeOrderVersion.Latest in my .Target.cs files to align with UE 5.7 requirements (see the sketch below).
- Attempted to find a standalone download for Meta XR 1.233.0, but the current Meta developer downloads page appears to package the plugins differently.

Any help or guidance on resolving this version discrepancy so I can safely use the Interaction SDK features would be greatly appreciated. Thank you!
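For reference, the .Target.cs changes described above look like this; a minimal sketch, with "MyProject" standing in for the actual project name:

```csharp
// MyProject.Target.cs -- "MyProject" is a placeholder name
using UnrealBuildTool;

public class MyProjectTarget : TargetRules
{
    public MyProjectTarget(TargetInfo Target) : base(Target)
    {
        Type = TargetType.Game;

        // The two settings updated for UE 5.7, as described in the post:
        DefaultBuildSettings = BuildSettingsVersion.V6;
        IncludeOrderVersion = EngineIncludeOrderVersion.Latest;

        ExtraModuleNames.Add("MyProject");
    }
}
```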
Accessing controller input in Meta Quest system UI overlays from custom Android-based engine?

We are building a custom engine directly on top of Android (using the Android SDK and Gradle, not Unity or Unreal), porting our products to Meta Quest devices (Quest 3 specifically) as 2D system panels / overlays (the floating windows in the Quest home environment).

What we want: We need to capture *any form of controller input* (buttons, joystick, scroll) while the app's system UI panels are visible or focused.

Key constraints:
- We are not trying to modify the system UI, only observe or intercept input.
- Even partial input (e.g. scroll events, pointer movement, or indirect signals) would be sufficient.

What we've tried:
- Standard Android input APIs (onKeyDown, onGenericMotionEvent)
- Checking for MotionEvent sources from controllers
- Polling input devices directly
- Combing through the SDK, without luck

Is there any supported way (SDK, API, or lower-level approach) to access or intercept controller input when system UI overlays are active on Meta Quest? Any guidance on how input routing works between apps and the system UI on Quest would be helpful.
Can Meta XR SDK build for Windows PC VR, or is it Quest-only?

I've been developing a VR game for Quest Pro/Quest 3 for the past year using Unity 6 (6000.0.40f1) with the Meta XR SDK (Core 74.0.1, All-in-One 74.0.2, Interaction SDK 74.0.2, Essentials 74.0.1).

Current situation:
- Standalone Quest APK builds work functionally.
- Performance is poor (~40-50 FPS, pixelated visuals).
- The Unity Editor with Quest Link runs beautifully (80+ FPS, crisp visuals).

What I need to know: Can I build a Windows platform executable with the Meta XR SDK and run it as a PC VR app via Quest Link? Or is the Meta XR SDK strictly for standalone Quest Android builds?

What I've done: Optimized the standalone build with my own code logic, texture compression, and logging disabled (my game has a server logging feature), with limited improvement due to an asset-heavy project.

Why I'm asking: Since Quest Link (Editor to Quest) performs so well, I'm wondering if I can build a Windows .exe that runs the same way: using my PC's GPU while the Quest acts as a display/input device (sketched below).

Constraints:
- I need the Meta XR SDK specifically for eye tracking (Quest Pro).
- I'm not using OpenXR, due to reported conflicts with the Meta XR SDK.

Has anyone successfully built Windows PC VR apps using the Meta XR SDK? Or is the SDK Android-only, requiring a switch to OpenXR/SteamVR for PC builds? Any guidance or documentation links appreciated!

Other questions:
- Does eye tracking work over Quest Link with Windows builds?
- Are there specific build settings or plugins needed?
- Any performance differences vs standalone?
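To make the two paths concrete, here is a minimal Unity sketch (class name hypothetical) of how a single project can tell them apart at runtime using the standard Application.platform check:

```csharp
using UnityEngine;

public class BuildPathCheck : MonoBehaviour
{
    void Start()
    {
        // A standalone Quest APK reports Android; a Windows .exe
        // running over Quest Link would report WindowsPlayer.
        if (Application.platform == RuntimePlatform.Android)
            Debug.Log("Path 1: standalone Quest build (on-device GPU)");
        else if (Application.platform == RuntimePlatform.WindowsPlayer)
            Debug.Log("Path 2: PC VR build via Quest Link (PC GPU)");
    }
}
```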
Record and replay real hand pose at runtime (Meta Interaction SDK – Unreal)

Hi everyone, I'm currently working with the Meta Interaction SDK in Unreal Engine (UE 5.6), using hand tracking only (no controllers) with the ISDK Hand Rig Component for both hands.

What I'm trying to achieve: I want to capture the user's real hand pose at runtime, save that pose, and then reapply (replay) that exact pose later on command (sketched below).

So my questions are:
- Is there a built-in way in the Meta Interaction SDK to record and replay hand poses?
- What's the best way to temporarily override hand tracking and apply a custom pose?

Any guidance, suggestions, or pointers would be really helpful. Thanks.
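To make the requirement concrete, here is an engine-agnostic C# sketch of the record/replay pattern I have in mind; the Quaternion type comes from System.Numerics, and how the rotations are actually read from or written to the ISDK hand rig is exactly the open question:

```csharp
using System;
using System.Numerics;

// Engine-agnostic sketch: a pose is the set of per-joint rotations at capture time.
public class HandPoseSnapshot
{
    private Quaternion[] savedRotations;

    // Call with the current per-joint rotations from the tracked hand.
    public void Capture(Quaternion[] liveJointRotations)
    {
        savedRotations = (Quaternion[])liveJointRotations.Clone();
    }

    // While replaying, drive the hand rig from this instead of live tracking data.
    public Quaternion[] SavedRotations =>
        savedRotations ?? throw new InvalidOperationException("No pose captured yet.");
}
```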
Is there a way to make the player not see their own avatar

Hello, I am building a Unity project with the Meta All-In-One SDK, using the Networked Avatar building block on top of Matchmaking, Hand Tracking, and Passthrough to create an experience where users can see other avatars, with their hand movements, overlaid on the real world. This creates an effect where you can see people walking around, talking, and interacting with things in a room they are not physically in.

My issue: the host can see other players' avatars, and other players can see the host's avatar, but when the host looks down they also see their own avatar's arms connected to their hands and body. I do not want this.

Is there a way for each player to see only their normal hand prefabs/meshes from their own perspective, while other connected players see the full avatar, and vice versa?
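One common approach (a sketch of the general Unity technique, not a specific Meta Avatars API) is to disable the avatar's renderers only on the client that owns it; renderer state is not networked, so remote players still see the full avatar:

```csharp
using UnityEngine;

public class LocalAvatarHider : MonoBehaviour
{
    // Call once the networked avatar has spawned.
    // isLocal should be true only on the owning client.
    public void ApplyVisibility(GameObject avatarRoot, bool isLocal)
    {
        foreach (var r in avatarRoot.GetComponentsInChildren<Renderer>(true))
            r.enabled = !isLocal; // hidden locally, still visible to everyone else
    }
}
```

A layer-based variant (putting the local avatar on a layer excluded from the main camera's culling mask) achieves the same result without touching individual renderers.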
Practical Hand Tracking in Unity | Start Mentor Workshop

In this Meta Horizon Start workshop, Start Mentor Valem (Quentin) walks through practical hand tracking in Unity by building and comparing three core interaction models side by side:
- Finger Pinch (including pinch strength values and thresholds)
- Microgestures (tap + swipes, like an “invisible joystick”)
- Hand Pose Detection (custom shapes + hand orientation to avoid false matches)

You'll see how to set up a hand-tracking-ready Unity scene quickly using Building Blocks, then implement each input style with simple script patterns and live examples (one such pattern is sketched below), so you can choose the best interaction method for your own XR app on Meta Horizon OS. This session was recorded in March 2026 as part of the Meta Horizon Start program.

🎬 CHAPTERS

👋 INTRODUCTION
🕒 00:00 - Welcome & workshop goals (comparing 3 hand tracking input models)

📋 PINCH (FINGER PINCH STRENGTH)
🕒 01:10 - What finger pinch is (and why index pinch is most reliable)
🕒 03:05 - Reading pinch strength (0–1) + using thresholds
🕒 05:10 - Interaction SDK examples (grab + ray select)
🕒 07:15 - Build demo: scene setup with Building Blocks + pinch-to-change-color spheres
🕒 10:50 - Smooth vs threshold pinch comparison + reliability notes (index vs other fingers)

📋 MICROGESTURES (INVISIBLE JOYSTICK)
🕒 14:10 - What microgestures are (tap + swipe directions) + use cases
🕒 16:00 - Reading microgesture type in code (switch/cases)
🕒 18:10 - Demo: show detected gesture in TextMeshPro
🕒 20:10 - Note: Interaction SDK locomotion uses microgestures (turn/teleport)
🕒 22:20 - Examples: character movement + UI navigation (and fixing missing EventSystem)

📋 HAND POSE DETECTION (SHAPE + ORIENTATION)
🕒 27:10 - Hand pose detection concepts: shape + orientation
🕒 28:40 - Create a custom Shape (thumbs up) using curl settings
🕒 31:10 - Trigger events when pose is recognized (active state → Unity events)
🕒 33:10 - Fixing thumbs up vs thumbs down: add wrist orientation (Transform Recognizer)
🕒 36:00 - Build thumbs down variant + wrap-up (choosing the right input model)

📚 RESOURCES
➡️ Meta Horizon Developer Forum: https://communityforums.atmeta.com/category/horizon-developer-forum
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with hands-on support and expert guidance to accelerate app development. Join a thriving community to access the tools and go-to-market resources you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
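As a taste of the pinch chapter, here is a minimal Unity sketch of the threshold pattern using the Meta XR SDK's OVRHand component (the serialized field and the 0.7 cutoff are illustrative choices, not values from the video):

```csharp
using UnityEngine;

public class PinchThresholdExample : MonoBehaviour
{
    [SerializeField] private OVRHand hand;           // OVRHand on the hand anchor
    [SerializeField] private float threshold = 0.7f; // illustrative cutoff

    void Update()
    {
        if (!hand.IsTracked) return;

        // Pinch strength is reported as a value in [0, 1] per finger;
        // index pinch is the most reliably tracked.
        float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
        if (strength >= threshold)
            Debug.Log($"Index pinch engaged ({strength:F2})");
    }
}
```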
(Unreal Engine) Pinch doesn't work properly in multiplayer.
I'm developing on Unreal Engine 5.5.4 with Meta XR SDK version 78.0. My project is a three-player multiplayer setup where each player can be assigned a different Pawn class: for example, one player gets DefaultPawn while the others get IsdkSamplePawn, IsdkSamplePawn2, and so on. Two of the Pawns use hand tracking.

To spawn different Pawns per player I followed this reference: https://unreal.gg-labs.com/wiki-archives/networking/spawn-different-pawns-for-players-in-multiplayer; it targets Unreal 4, so I adapted the code to version 5.

The problem: if the Player Controller class calls the GameMode's RestartPlayer from within DeterminePawnClass, the pinch grab action stops working in hand tracking. I suspect a bug in the Interaction SDK's RigComponent, but I wonder if anyone has solved this problem.
How to Set Up Hand Tracking with the Meta All-in-One SDK | Mentor Workshop

If you're developing for Meta Quest and want reliable hand interactions, this is your starting point. In this session, Start Mentor Quentin Valembois (Valem) walks through the All-in-One SDK from foundational setup to advanced features, covering Building Blocks and Quick Actions before moving into gesture inputs and custom throwing physics (a minimal version of the throw pattern is sketched below). Plus, stick around until the end to catch the live Q&A featuring a member of Meta's Input Framework team.

🎬 CHAPTERS
🕒 00:00: Introduction
🕒 01:13: Setting Up Hand Tracking
🕒 08:41: The Interaction SDK
🕒 14:48: Automating Setup with Quick Actions
🕒 22:08: Triggering Inputs with Hands
🕒 26:29: Advanced Locomotion & Physics
🕒 33:22: Q&A with Meta's Input Framework Team

📚 RESOURCES
🔗 Featured GitHub Project: https://github.com/Meta-Horizon-Start-Program/MasterHandTracking
📺 Meta SDK V83: New Hand Tracking Locomotion Features: https://www.youtube.com/watch?v=V5BudZA9b9Q
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
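For the throwing physics segment, the usual pattern is to estimate hand velocity while an object is held, then hand that velocity to the Rigidbody on release. A minimal Unity sketch (the multiplier and class name are illustrative, not taken from the workshop):

```csharp
using UnityEngine;

public class SimpleThrowOnRelease : MonoBehaviour
{
    [SerializeField] private Rigidbody body;
    [SerializeField] private float throwMultiplier = 1.2f; // illustrative tuning value

    private Vector3 lastPosition;
    private Vector3 estimatedVelocity;

    void Update()
    {
        // Estimate velocity from frame-to-frame movement while the object is held.
        estimatedVelocity = (transform.position - lastPosition) / Time.deltaTime;
        lastPosition = transform.position;
    }

    // Hook this up to the grab-release event of your interaction setup.
    public void Release()
    {
        body.isKinematic = false;
        // Unity 6 API; older versions use body.velocity instead.
        body.linearVelocity = estimatedVelocity * throwMultiplier;
    }
}
```

Averaging the velocity estimate over a few frames usually gives more natural-feeling throws than a single-frame delta.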