QR code recognition is not working properly on the new version of Meta Quest.
I recently updated my Quest 3 to v83, and since then the app that handles QR code recognition has stopped working correctly. I tried reinstalling the app and a few other basic troubleshooting steps, but it still fails to recognize any QR codes. Is anyone else experiencing the same issue on v83? If someone has found a solution or workaround, I would really appreciate hearing how you resolved it.

Multiple OVRInteractionComprehensive building blocks in project
I'm using the OVRInteractionComprehensive building block as the basis of my project with a distance grab block. I then added the HandPose features to use two preset hand poses (Rock and Stop), and this automatically added a second OVRInteractionComprehensive block to manage these poses. I've now added a basic menu but can't get either ray interaction or poke interaction to function on the menu buttons. It seems the two blocks already in use in the project are blocking the event system from working; when I add a canvas, it won't add an event system. Can anyone suggest the best way forward and point me at a tutorial that might help? Thanks!

Issue with Meta Interaction SDK v74 on Oculus-5.5-v74 Unreal Engine Fork
Issue Description
I downloaded the Meta Interaction SDK (version 74) and integrated it into the Unreal Engine fork oculus-5.5-v74 from GitHub. The integration works correctly inside the Editor, and VR Preview runs without any issues. However, when I package the project for Quest and install the APK on the headset, the application launches, shows a black screen for less than a second, then immediately crashes/auto-closes. When I disable the Meta Interaction SDK plugin and rebuild the project, the packaged application works normally on the device. This confirms that the crash is directly caused by the Interaction SDK at runtime on the Quest.

What I Have Already Verified
- The plugin is properly added under /Plugins/runtime/OculusInteraction.
- The engine fork version matches the Meta SDK version (both v74).
- No conflicts in Editor logs.
- No errors during packaging.
- The crash happens only on device startup, not in VR Preview.

Expected Behavior
The packaged app should run the same way it does inside VR Preview. The Interaction SDK should not crash the application on Quest when the correct version is used.

Request
Could you please advise on:
- Whether Meta Interaction SDK v74 is fully compatible with the fork oculus-5.5-v74-openxr-asw-ffr.
- Any additional setup required to ensure correct initialization on Quest (permissions, XR loader, manifest changes, MetaXR modules, etc.).
- Whether this is a known issue for v74 of the SDK.
- Whether this fork requires a specific Interaction SDK variant different from the public one.

Quest 3 OpenXR: Hand Tracking, Vive Ultimate Tracker — Need Both Together
In my project, I am trying to achieve object tracking. I want to place the Vive Ultimate Tracker on a real object while also tracking the user's real hands on the Quest 3 using OpenXR. The issue I'm facing is that when I use SteamVR Link, the tracker works correctly using the HTCViveTrackerProfile (as shown above) and the Tracked Pose Driver, but real hand tracking does not work. However, when I use Meta Quest Link to run the same scene, real hand tracking works, but the Vive Ultimate Tracker does not track. I want to combine both systems so that hand tracking and the Vive Ultimate Tracker work at the same time, and I need a clear conclusion on how to achieve this. Alternatively, please suggest another way to achieve real-time object tracking alongside hand tracking.

Resetting the rotation of a OneGrabRotateTransformer
In case someone is struggling to reset the rotation of a Transform that is controlled by a OneGrabRotateTransformer, the only way I've found to achieve that is what's done in this post: Reset a "One grab rotate transformer" | Meta Community Forums - 1269247. TL;DR: create your own OneGrabRotateTransformer and implement the logic in its EndTransform method, where you'll be able to fully reset the internal state of the transformer.

```csharp
public void EndTransform()
{
    // Snap the grabbed transform back to its initial rotation.
    var targetTransform = _grabbable.Transform;
    targetTransform.localEulerAngles = Vector3.zero;

    // Clear the transformer's cached angles so the next grab starts from zero.
    _relativeAngle = 0.0f;
    _constrainedRelativeAngle = 0.0f;
}
```

I'm hoping this gets built-in support in the SDK...

Getting Analog Values from Joysticks in Spatial SDK Native Android App
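Following up on the OneGrabRotateTransformer reset above: a fuller version of such a custom transformer might look like the sketch below. It assumes the Interaction SDK's ITransformer/IGrabbable contract (Initialize, BeginTransform, UpdateTransform, EndTransform); the class name, serialized fields, and the simplified rotation math are illustrative assumptions, not a copy of the SDK's own OneGrabRotateTransformer.

```csharp
using UnityEngine;
using Oculus.Interaction;

// Hedged sketch: a custom one-grab rotate transformer that fully resets its
// internal state when the grab ends. The rotation math is simplified and
// illustrative; only the reset pattern in EndTransform mirrors the post above.
public class ResettableOneGrabRotateTransformer : MonoBehaviour, ITransformer
{
    [SerializeField] private Vector3 _rotationAxis = Vector3.up;

    private IGrabbable _grabbable;
    private float _relativeAngle;
    private float _constrainedRelativeAngle;
    private Quaternion _initialLocalRotation;

    public void Initialize(IGrabbable grabbable)
    {
        _grabbable = grabbable;
        // Remember the rest pose so EndTransform can restore it exactly.
        _initialLocalRotation = _grabbable.Transform.localRotation;
    }

    public void BeginTransform()
    {
        // Nothing extra to cache in this sketch.
    }

    public void UpdateTransform()
    {
        // Illustrative only: a real implementation would derive
        // _constrainedRelativeAngle from the grab point's movement
        // around _rotationAxis, then clamp it to any constraints.
        _grabbable.Transform.localRotation =
            Quaternion.AngleAxis(_constrainedRelativeAngle, _rotationAxis)
            * _initialLocalRotation;
    }

    public void EndTransform()
    {
        // Restore the starting rotation and clear cached angles so the
        // next grab starts from a clean state.
        _grabbable.Transform.localRotation = _initialLocalRotation;
        _relativeAngle = 0.0f;
        _constrainedRelativeAngle = 0.0f;
    }
}
```

Assign the component to the Grabbable's One Grab Transformer slot in place of the stock transformer; because the reset lives in EndTransform, the object snaps back every time it is released.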
Hi, I am trying to create an app that can track joystick push values as continuous values rather than discrete pushed/not pushed. I am using the Spatial SDK, and it seems to only expose getters such as ButtonThumbLL, which return an Int representing whether the button was pressed or not. I saw a similar post about 4 months ago that was left unanswered: Interface of getting Controller joystick travel range | Meta Community Forums - 1336257. I am curious if anyone knows an idiomatic way of getting the continuous values of how far the joysticks are pushed. Thank you.

(Unreal Engine) Pinch doesn't work properly in multiplayer.
I'm on Unreal Engine 5.5.4, developing against SDK v78.0. I'm building a case where three people play together, and a different Pawn can be set up for each player. For example, one player is DefaultPawn, the others are IsdkSamplePawn, IsdkSamplePawn2, and so on. The Pawns use hand tracking. To spawn different Pawns I referenced https://unreal.gg-labs.com/wiki-archives/networking/spawn-different-pawns-for-players-in-multiplayer, which targets Unreal 4, so I adapted the code to version 5. However, if the Player Controller class calls the GameMode's RestartPlayer within DeterminePawnClass, the pinch grab action does not work with hand tracking. I suspect a bug in the Interaction SDK's RigComponent, but I wonder if anyone has solved this problem.

Meta XR Simulator does not appear
Pulling my hair out here. I have the v81 simulator installed on Windows 11, Unity 6000.2.10f1. I enable the simulator, but the window does not appear. Any suggestions on what to check? My goal is to use the headset in the editor and the simulator in Multiplayer Play Mode. When using XRIT and the Unity XR simulator in another project, this type of setup works quite well: I can have three virtual players using the simulator while I use the actual headset over PC Link in the editor. I want to replicate that setup using the Meta XR Simulator instead.

OVROverlayCanvas with Passthrough Broken
I was playing around with an app I made and decided to turn it into an XR app with passthrough capabilities, and ran into issues. Passthrough: this works as it should; I can see the video feed. UI: I had OVROverlayCanvas enabled on the main menu, and this is where things get wonky. With the OVROverlayCanvas component set to Underlay as the default and no passthrough enabled, everything works on the device (the editor is another story: the canvas only shows in the left eye and will not show in the right eye no matter what I try, even with a simple canvas). When I turn on passthrough, the UI shows only in my peripheral vision; if I look directly at it, the UI disappears. Switching to Overlay instead of Underlay, it is always visible, but I cannot figure out how to get the hand-ray cursor to appear on top of the imposter UI. Does anyone know how to get either of these to work?

How to Improve Hand Tracking Pinch Accuracy
I am creating a system that uses pinch movements of the thumb against each finger, but the pinch accuracy is not good. In particular, there are issues with pinch accuracy between the ring finger and pinky finger, and with pinches of the middle finger and ring finger responding simultaneously. Please let me know if you have any suggestions for improvement.
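For per-finger pinch tuning in Unity with the Meta XR Core SDK, one option is to read the continuous per-finger pinch strength and confidence from OVRHand and apply your own thresholds with hysteresis, so neighboring fingers (e.g. middle vs. ring) are less likely to trigger together. A minimal sketch, assuming an OVRHand component is assigned; the threshold values are illustrative assumptions, not SDK defaults:

```csharp
using UnityEngine;

// Hedged sketch: debounced per-finger pinch detection using OVRHand's
// continuous pinch strength, with hysteresis to suppress simultaneous
// triggers from adjacent fingers hovering near a single threshold.
public class PinchFilter : MonoBehaviour
{
    [SerializeField] private OVRHand _hand;
    [SerializeField] private float _pressThreshold = 0.85f;   // enter pinch
    [SerializeField] private float _releaseThreshold = 0.55f; // exit pinch

    private readonly bool[] _pinching = new bool[4];

    private static readonly OVRHand.HandFinger[] Fingers =
    {
        OVRHand.HandFinger.Index,
        OVRHand.HandFinger.Middle,
        OVRHand.HandFinger.Ring,
        OVRHand.HandFinger.Pinky,
    };

    private void Update()
    {
        if (_hand == null || !_hand.IsTracked) return;

        for (int i = 0; i < Fingers.Length; i++)
        {
            // 0..1 analog strength of this finger's pinch against the thumb.
            float strength = _hand.GetFingerPinchStrength(Fingers[i]);

            // Ignore fingers the runtime itself is unsure about.
            bool confident =
                _hand.GetFingerConfidence(Fingers[i]) == OVRHand.TrackingConfidence.High;

            // Hysteresis: a high value is needed to start a pinch and a low
            // value to end it, which filters flicker around the threshold.
            if (!_pinching[i] && confident && strength >= _pressThreshold)
                _pinching[i] = true;
            else if (_pinching[i] && strength <= _releaseThreshold)
                _pinching[i] = false;
        }
    }

    public bool IsPinching(OVRHand.HandFinger finger) =>
        _pinching[System.Array.IndexOf(Fingers, finger)];
}
```

Widening the gap between the press and release thresholds, and requiring high tracking confidence before accepting a pinch, tends to help most with the ring/pinky confusion described above; the exact values need tuning per application.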