Finding lost controllers
Wouldn’t it be great to add a feature to the Meta Horizon app where you can press a button and your controllers play a sound, so missing controllers can be found? Like me, I’m sure many other people would be happy to pay extra for a controller that can be located by sound. (I don’t know where else to suggest this, so I’m posting it here.)

Controllers suddenly flipped on the Z-axis for no apparent reason, and I have no idea why
I’m making an action game, and a few days ago Unity suddenly started throwing errors and wouldn’t open for no apparent reason. After I deleted Meta’s SDK, Unity launched normally again, and reinstalling the SDK didn’t cause any further issues. However, yesterday I noticed that both controllers in my game had suddenly flipped 180 degrees on the Z-axis (see Image 1). This only happens during actual device testing — it does not happen inside Unity (see Image 2). I tried flipping the controllers’ Z-axis by 180 degrees in Unity to cancel out the issue, but it had no effect. I haven’t touched any packages in the past month, and everything was still normal about a week ago. Currently, my Meta SDK version is 81.0.0, my OpenXR Plugin version is 1.15.1, and my Unity OpenXR Meta version is 2.2.0.

Meta XR Simulator Standalone Help
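Regarding the Z-axis flip thread above: a minimal runtime workaround sketch. `ControllerRollFix` and `visualRoot` are hypothetical names, not part of the Meta SDK. The idea is that if tracking overwrites the anchor's rotation every frame, the editor-side flip the poster tried would be discarded; a corrective roll applied at runtime to a *child* of the tracked anchor may survive, and compiling it out of editor builds matches the symptom only appearing on device.

```csharp
using UnityEngine;

// Hedged workaround sketch, not an official fix: roll the controller's
// visual 180 degrees at runtime, on device builds only. Attach this to
// each controller anchor and assign the child transform that holds the
// controller model.
public class ControllerRollFix : MonoBehaviour
{
    [SerializeField] private Transform visualRoot; // child holding the controller model

    private void OnEnable()
    {
        // The issue only reproduces in device testing, so only compensate there.
#if !UNITY_EDITOR
        visualRoot.localRotation *= Quaternion.Euler(0f, 0f, 180f);
#endif
    }
}
```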
I'm an educator teaching Unity & XR development using Quest 3 and Meta Building Blocks, but I have been really struggling because of how much the learning materials online differ (from Unity's end, and from content creators, even material from just 3 months ago, let alone 1 year). The most pressing issue in my class is the lack of updated documentation and examples for the new standalone version of the Meta XR Simulator. Half the documents in the official Meta XR Simulator Overview documentation are from 2024 and use the old interface (which had far more features and customization options). I have a bunch of students relying on mouse and keyboard controls trying to test behaviors like the locomotion building block, but they don't work. Current issues I would love suggestions or hints on (from just importing Building Blocks into a Unity Core 3D scene, nothing customized yet):

- I have duplicate controller models and ghosting (only in the simulator, visible when moving).
- I sometimes have graphical glitches that look like snow or fuzz (only in the Unity Game view and the simulator while the simulator is running).
- I cannot get rays or aiming reticles to come from the controllers no matter where they or my mouse are pointing (they work in the headset), even with point-and-click on.
- Do the movement inputs (default WASD and arrow keys) simulate the left and right joysticks, or do they override/bypass those inputs? Some teleport controls involve aiming and pressing up on the joystick, and I'm not sure how to test that in the simulator.
- Is there some way to add simulation input options that actually trigger the controller's inputs, like the Unity package version used to?

I would also appreciate any general advice or resources on new/recent best practices, customization options, and debugging tips for the building blocks and Interaction SDK.

How to show hands while using controller input to grab objects (Meta All-in-One SDK)
Hi, I'm currently developing a VR experience using Unity (2022.3.62f1) and the Meta All-in-One SDK (v78.0), where users physically hold controllers but hands are shown instead — so interactions appear to be done with hands, even though the input comes from the controllers. To support this, I've enabled controllerDrivenHandPosesType so that the virtual hands animate based on controller input. Additionally, I configured the controllerButtonUsage property of the GripButtonSelector inside the ControllerGrabInteractor (under [BuildingBlock] OVRInteractionComprehensive > Left/Right Interactions) to use the Primary Button, allowing users to grab objects via a button press. However, in this setup the interaction still seems to be treated as hand-tracking-based. As a result, the HandGrabInteractable component (set via [BuildingBlock] HandGrabInstallationRoutine) follows the Pinch Grab Rules — so the user must still pinch specific fingers to grab an object, even though they're using controller input.

What I'm trying to achieve:

- Show hands instead of controllers
- Allow grabbing with a controller button (e.g., the Primary Button)
- Disable or bypass pinch gesture requirements

My question: is there a supported way to make interactions look hand-based visually, but use controller buttons to grab, without requiring pinch gestures? If there's a recommended approach, workaround, or best practice to achieve this using the Meta All-in-One SDK, I would greatly appreciate any guidance. Thanks in advance!

Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses
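For the controller-button grab thread above: one hedged workaround, if the Interaction SDK's pinch rules can't be bypassed through configuration, is to poll the physical button directly with OVRInput (a real Meta SDK input API) and drive your own attach/release logic while the controller-driven hand visuals keep animating. The class name below and the commented-out grab handling are placeholders, not SDK calls; this sidesteps HandGrabInteractable rather than configuring it.

```csharp
using UnityEngine;

// Sketch under assumptions: bypass the pinch-based grab rules by reading
// the physical controller button ourselves. OVRInput.GetDown/GetUp are
// real Meta SDK calls; the grab/release bodies are left as placeholders
// for your own attach logic (e.g., parenting or a FixedJoint).
public class ControllerButtonGrab : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;

    private void Update()
    {
        // Button.One is the A/X button on the Touch controllers.
        if (OVRInput.GetDown(OVRInput.Button.One, controller))
        {
            // Begin grab: attach the nearest grabbable to this hand (your logic here).
        }

        if (OVRInput.GetUp(OVRInput.Button.One, controller))
        {
            // End grab: release the held object (your logic here).
        }
    }
}
```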
Hi everyone! I’m a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I’d love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love to have it added to my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I’ve also submitted this through support channels, but wanted to share it here in case the team is gathering feedback.

Unable to use the Meta XR Simulator for more than 3 minutes
Hi, I’m trying to test the Meta XR Simulator in order to develop a VR experience in Unity. The Unity package requires the headset to remain connected to my PC so that I can test VR without wearing the headset and so the controllers can be linked to the simulator (I’m not sure why they can’t be paired via Bluetooth, like Switch Joy-Cons when developing a Switch game in Unity). The workflow is essentially the same as testing in Play Mode with the headset on — but why not, I guess. On the headset, XrSimDataForwardingServer.apk is running, and I’ve covered the proximity sensor with black tape to prevent it from entering sleep mode. With this setup, I’m able to use the controllers for about 2–3 minutes, but then the headset still goes to sleep. Because we are a school, we use a third-party MDM to manage all our headsets, so this device is in Shared Mode. As a result, I get the “Continue the session?” popup, and even when I press the button to continue, it simply loops back to the same popup indefinitely. In our MDM, the headset is configured to sleep after one hour, but since the device detects no movement (even though I’m actively using the controllers in Play Mode with the Meta XR Simulator), it still enters sleep mode. How can I resolve this so I can develop my VR experience more efficiently? Thank you in advance for your help.

OTA Zip for Meta Quest 2
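For the headset-sleep thread above: a hedged sketch of standard Android adb commands that can keep a development device awake while plugged in. These are stock Android settings, not a documented Meta workflow; the Quest's shell accepts them since it runs Android, but an MDM policy or Shared Mode may reapply its own timeout over them.

```shell
# Keep the device awake while connected over USB (cleared on reboot).
adb shell svc power stayon usb

# Alternatively, raise the system screen-off timeout.
# Value is in milliseconds; 1800000 = 30 minutes.
adb shell settings put system screen_off_timeout 1800000
```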
Hi, I'm repairing a Quest 2. The headset boots to the Meta logo, then shows a flashing grey screen. ADB sideload mode works, but the OS is corrupted. I need the official Quest 2 OTA recovery package to reinstall the firmware through adb sideload. Serial number: 1WMHHB6APM2292. Model: KW49CM (128GB). Where can I find the correct OTA zip? Please send the correct signed update zip.

Oculus fork of Unreal Engine (version 5.5.4 V78) + Meta Haptics SDK
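For the OTA recovery thread above: once a correctly signed package is in hand, flashing it is a standard adb sideload, sketched below. The zip filename is a placeholder; the package itself must be an official signed build matching this exact model, or recovery will reject it.

```shell
# Reboot the headset into sideload mode, then stream the signed OTA
# package to the device over USB.
adb reboot sideload
adb sideload quest2-ota-package.zip
```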
Hi everyone, I'm currently using the Oculus fork of Unreal Engine (version 5.5.4 V78), and I need to add support for the Meta Haptics SDK. My Unreal build was compiled as an 'Installed Build' to facilitate internal distribution to our production team. The issue is that I cannot find the plugin version for V78 on the official download page (https://developers.meta.com/horizon/downloads/package/meta-xr-haptics-sdk-for-unreal/); only V74 and V81 seem to be available. When I try to use either of those, I get the standard error: "The following modules are missing or built with a different engine version". What is the best path forward here? Is it possible to provide a V78 version of the plugin, or will I be forced to recompile the entire engine just to include this plugin? Thanks in advance.

Hand tracking for Quest 3 on PC
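For the engine-version-mismatch thread above: a hedged sketch of the usual alternative to recompiling the engine — rebuilding the downloaded plugin's sources against the installed build with Unreal's standard BuildPlugin automation command. All paths below are placeholders for your setup, and this only works if the plugin ships source and that source actually compiles against 5.5.4.

```shell
# Rebuild the downloaded (V74 or V81) plugin sources against the 5.5.4 V78
# installed build so the module versions match. Paths are placeholders.
"C:\UE_5.5.4_V78\Engine\Build\BatchFiles\RunUAT.bat" BuildPlugin ^
  -Plugin="C:\Downloads\MetaXRHaptics\MetaXRHaptics.uplugin" ^
  -Package="C:\Plugins\MetaXRHaptics_V78" ^
  -TargetPlatforms=Win64+Android
```

The rebuilt copy in the `-Package` folder can then be dropped into the project's or engine's Plugins directory.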
I recently bought the Quest 3 and was excited about hand tracking. I also got a nice new PC for a better experience, but the hand tracking feature is not available when using the headset with a PC, which makes me sad. We need an update to make this possible as soon as possible. Thank you all so much.