Can Meta XR SDK build for Windows PC VR, or is it Quest-only?
I've been developing a VR game for Quest Pro/Quest 3 for the past year using Unity 6 (6000.0.40f1) with Meta XR SDK (Core 74.0.1, All-in-One 74.0.2, Interaction SDK 74.0.2, Essentials 74.0.1).

Current situation:
- Standalone Quest APK builds work functionally
- Performance is poor (~40-50 FPS, pixelated visuals)
- Unity Editor with Quest Link runs beautifully (80+ FPS, crisp visuals)

What I need to know: Can I build a Windows platform executable with Meta XR SDK and run it as a PC VR app via Quest Link? Or is Meta XR SDK strictly for standalone Quest Android builds?

What I've done: I've been optimizing the standalone build with my own code logic and texture compression, and with logging disabled (my game has a server logging feature). Improvement has been limited because the project is asset-heavy.

Why I'm asking: Since Quest Link (Editor to Quest) performs so well, I'm wondering if I can build a Windows .exe that runs the same way, using my PC's GPU while the Quest acts as a display/input device.

Constraints:
- I need Meta XR SDK specifically for eye tracking (Quest Pro)
- Not using OpenXR due to reported conflicts with Meta XR SDK

Has anyone successfully built Windows PC VR apps using Meta XR SDK? Or is the SDK Android-only, requiring a switch to OpenXR/SteamVR for PC builds? Any guidance or documentation links appreciated!

Other questions:
- Does eye tracking work over Quest Link with Windows builds?
- Are there specific build settings or plugins needed?
- Any performance differences vs standalone?

Record and replay real hand pose at runtime (Meta Interaction SDK – Unreal)
Hi everyone,

I'm currently working with the Meta Interaction SDK in Unreal Engine (UE 5.6) and using hand tracking only (no controllers). I'm using the ISDK Hand Rig Component for both hands.

What I'm trying to achieve: I want to capture the user's real hand pose at runtime, save that pose, and then reapply (replay) that exact pose later on command.

So my questions are:
- Is there a built-in way in the Meta Interaction SDK to record and replay hand poses?
- What's the best way to temporarily override hand tracking and apply a custom pose?

Any guidance, suggestions, or pointers would be really helpful. Thanks.

Accessing controller input in Meta Quest system UI overlays from custom Android-based engine?
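On the hand-pose record/replay question above: independent of whether the SDK ships a built-in recorder, the bookkeeping itself is small. Below is a minimal engine-agnostic sketch under stated assumptions — the joint count, `Quat`, `HandPose`, and `HandPoseRecorder` names are all hypothetical illustrations, not Interaction SDK types (the real ISDK Hand Rig exposes its own joint/bone structures). The idea is to record timestamped poses each frame and sample them back on replay:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Hypothetical pose types, for illustration only.
struct Quat { float x, y, z, w; };

constexpr std::size_t kNumJoints = 24;  // assumed joint count

struct HandPose {
    std::array<Quat, kNumJoints> joints;  // one rotation per tracked joint
};

struct TimedPose {
    double timestamp;  // seconds since recording started
    HandPose pose;
};

class HandPoseRecorder {
public:
    // Call once per frame while recording, with the currently tracked pose.
    void Record(double timestamp, const HandPose& pose) {
        frames_.push_back({timestamp, pose});
    }

    bool HasFrames() const { return !frames_.empty(); }

    // Return the recorded frame at or just before time t,
    // clamped to the first/last frame at the ends.
    // Assumes frames were recorded in increasing timestamp order.
    const HandPose& Sample(double t) const {
        const TimedPose* best = &frames_.front();
        for (const TimedPose& f : frames_) {
            if (f.timestamp <= t) best = &f;
            else break;
        }
        return best->pose;
    }

private:
    std::vector<TimedPose> frames_;
};
```

On the Unreal side, `Record` would be fed from the tracked hand each tick while recording, and the result of `Sample` would be applied to the hand rig (temporarily overriding live tracking) during replay.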
We are building a custom engine on top of Android (using the SDK and Gradle, not Unity or Unreal), porting our products and targeting Meta Quest devices (Quest 3 specifically) for the 2D system panels/overlays (the floating windows in the Quest home environment).

What we want: We need to capture any form of controller input (buttons, joystick, scroll) while the app's system UI panels are visible or focused.

Key constraints:
- We are not trying to modify system UI, only observe or intercept input
- Even partial input (e.g. scroll events, pointer movement, or indirect signals) would be sufficient

What we've tried:
- Standard Android input APIs (onKeyDown, onGenericMotionEvent)
- Checking for MotionEvent sources from controllers
- Polling input devices directly
- Combing through the SDK, without luck

Is there any supported way (SDK, API, or lower-level approach) to access or intercept controller input when system UI overlays are active on Meta Quest? Any guidance on how input routing works between apps and system UI on Quest would be helpful.

Is there a way to make the player not see their own avatar
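On the controller-input question above: one concrete form of the "checking MotionEvent sources" step is a bitmask test against Android's InputDevice source constants. A minimal sketch follows; the constant values are copied here only for illustration (real on-device code would use android.view.InputDevice's own SOURCE_* constants rather than redefining them):

```cpp
#include <cstdint>

// Values mirroring android.view.InputDevice source constants,
// reproduced here for illustration only.
constexpr uint32_t SOURCE_KEYBOARD = 0x00000101;
constexpr uint32_t SOURCE_GAMEPAD  = 0x00000401;
constexpr uint32_t SOURCE_JOYSTICK = 0x01000010;

// True if a device/event source bitfield reports gamepad or joystick
// input, using the same (source & s) == s test as MotionEvent.isFromSource.
bool IsControllerSource(uint32_t sources) {
    return (sources & SOURCE_GAMEPAD)  == SOURCE_GAMEPAD ||
           (sources & SOURCE_JOYSTICK) == SOURCE_JOYSTICK;
}
```

Note that this only identifies controller-class devices or events; it does not by itself solve the routing problem of events being consumed by the system UI layer before they ever reach the app.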
Hello,

I am building a Unity project with the Meta All-In-One SDK, using the Networked Avatar building block on top of Matchmaking, Hand Tracking, and Passthrough. The goal is an experience where users can see other players' avatars, with their hand movements, overlaid on the real world via Passthrough; this creates an effect where you can see people walking around, talking, and interacting with things in a room that they are not physically in.

My issue: the player (host) can see other players' avatars, and other players can see the host's avatar, but when the host looks down they also see their own avatar's arms connected to their hands and body, which I do not want.

Is there a way for the player to see only their normal hand prefabs/meshes from their own perspective, while other connected players see the full avatar, and vice versa?

(Unreal Engine) Pinch doesn't work properly in multiplayer.
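On the avatar-visibility question above: a common Unity-side pattern (not specific to the Meta building blocks, so treat it as an assumption about your setup) is to move the local player's avatar renderers onto a dedicated layer and clear that layer's bit in the local camera's culling mask; remote players' cameras keep the bit set and still render the full avatar. The underlying mask arithmetic is just bit clearing/setting, sketched here with a hypothetical layer index:

```cpp
#include <cstdint>

// Hypothetical user layer index reserved for the local avatar.
constexpr int kLocalAvatarLayer = 8;

// Clear a layer's bit so a camera with this mask stops rendering it.
uint32_t HideLayer(uint32_t cullingMask, int layer) {
    return cullingMask & ~(1u << layer);
}

// Set a layer's bit so a camera with this mask renders it again.
uint32_t ShowLayer(uint32_t cullingMask, int layer) {
    return cullingMask | (1u << layer);
}
```

In Unity this corresponds to assigning `GameObject.layer` on the local avatar's renderers and masking `Camera.cullingMask` on the local rig's camera; since both are local-only state, nothing needs to be synchronized over the network.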
It's Unreal version 5.5.4, and the SDK version being developed against is 78.0. I'm developing a case where three people multiplay, and a different Pawn can be set up for each player. For example, one player is DefaultPawn, another is IsdkSamplePawn, another IsdkSamplePawn2, and so on. The Pawns use hand tracking.

To spawn different Pawns per player, I referenced https://unreal.gg-labs.com/wiki-archives/networking/spawn-different-pawns-for-players-in-multiplayer (written for Unreal 4) and modified the code for version 5.

However, if the Player Controller class calls GameMode's RestartPlayer within DeterminePawnClass, the pinch grab action does not work in hand tracking. I think it's a bug in the Interaction SDK's RigComponent, but I wonder if anyone has solved this problem.

MetaXRInteraction missing Ray visuals for ISDKControllers, but ISDKHands working as intended.
I recently upgraded to version 1.81.0 of MetaXRInteraction on the Oculus Unreal Engine fork, version 5.6.1, and the controller components no longer have ray visuals when interacting with menus. The functionality works (I can click, and elements change when hovered), but it's hard to aim. What's weird is that the ISDKHand components do have proper ray visuals (except for the right reticle not turning blue when pinching).

I am not sure what is broken or how to get this working. All parameters are defaults, and I only turn the Ray and Grab components on/off (for both ISDK Hands and ISDK Controllers) when the user selects a hand preference in settings, using the Set Active function.

Video demonstrating the issue: https://drive.google.com/file/d/1GZB-dCXpaxryZP92I9jXAnZO34Lg9WHP/view?usp=sharing

Thanks!

Meta SDK 83+ Teleport mess up
Hi, I just wasted half a day on this. In the latest v83 SDK, when you add the Teleport building block, the controller only slides and does not emit a teleport ray. In previous SDK versions this worked out of the box. Apparently someone decided to change this, and it is not mentioned in the documentation; I had to dissect the Interaction example to find it.

To get back the old functionality, go into OVRInteractionComprehensive->...->LocomotionControllerInteractorGroup and enable TeleportControllerInteractor, and/or disable ControllerSliderInteractor. This can be done for both the Left and Right interactions. Look at the pic below. I really don't get why this has changed.

DistanceGrabUseInteractable?
Hi,

This is definitely a pretty basic question about the Meta Interaction SDK for Unity. I've managed to get a HandGrabUseInteractable linked to a HandGrabInteractable via a SecondaryInteractionFilter. However, I also have a DistanceHandGrabInteractable on that object, linked to its own HandGrabUseInteractable wired to the same delegate as the first.

When I grab the object without distance grab, my script calls BeginUse, EndUse, and ComputeUseStrength properly. When I grab with distance, it does not, as far as I can tell. I am working on a Mac and the simulator was not working with this scenario at all, so I have to deploy the APK to my Quest each time I want to test, which takes away some of my debugging capabilities.

I thought perhaps this was an issue with having multiple HandGrabUseInteractables, but when I removed the duplicate and gave the object only a DistanceHandGrabInteractable and one HandGrabUseInteractable, it still did not work. I also wondered whether HandGrabUseInteractable only supports HandGrabInteractable and not the other grab interactable types, but peeking at the package code and reading the SecondaryInteractionFilter docs seemed to suggest either HandGrabInteractable or DistanceHandGrabInteractable should work, so long as all references are piped correctly.

What am I doing wrong? How can I link my DistanceHandGrabInteractable to a HandGrabUseInteractable? Will I need to make my own DistanceGrabUseInteractable script, perhaps using the existing HandGrabUseInteractable as a base?

Thanks for the help.

Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses
Hi everyone! I'm a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I'd love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time.

If this type of accessibility feature is ever developed, I would absolutely love to have it added to my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I've also submitted this through support channels, but wanted to share here in case the team is gathering feedback.