Hand gesture for initiating the Menu with the left hand (Start button) no longer triggers after v66
Using OVRButtonActiveState to listen for the START button, and even polling OVRInput.GetDown(OVRInput.Button.Start), no longer triggers on the pinch gesture with the left hand. The controller works fine, so this appears to be another issue with the updated Interaction SDK in v67 and v68. Could someone confirm this is being worked on?

Accessing the GameObject associated with an Interactor from an Interactable (Unity)
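For the Start-button regression above, a possible stopgap is to poll a left-hand index pinch alongside OVRInput until the gesture fires the button again. This is an untested sketch: `OpenMenu` is a placeholder for your own menu logic, and note that it reacts to any left index pinch, not specifically the system menu gesture, so you may want to gate it further (for example on OVRHand.IsSystemGestureInProgress).

```csharp
using UnityEngine;

// Hypothetical fallback: poll both the Start button and a left-hand index
// pinch, since the menu gesture stopped raising Button.Start in v67/v68.
public class StartMenuFallback : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand; // assign the left OVRHand in the inspector
    private bool wasPinching;

    void Update()
    {
        bool pinching = leftHand != null && leftHand.IsTracked &&
                        leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Open the menu on the Start button OR on the rising edge of the pinch.
        if (OVRInput.GetDown(OVRInput.Button.Start) || (pinching && !wasPinching))
        {
            OpenMenu(); // placeholder for your own menu-opening method
        }
        wasPinching = pinching;
    }

    void OpenMenu() { /* ... */ }
}
```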
I am using the Meta XR Interaction SDK in a Unity mixed reality app. I have a cube set up with a SnapInteractor and a snap point with a SnapInteractable. I have code running on the "When Select" event via an InteractableUnityEventWrapper, but I need a reference to the Interactor that has snapped to the Interactable. I don't see a way to do this in the documentation. What am I missing, and is there a reliable workaround if it is not officially supported? Thanks

Oculus.Interaction.HandGrab.HandGrabInteractable.Colliders needs a refresh feature?
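On the snap question above (reaching the Interactor from the Interactable): most ISDK interactables expose the interactors currently selecting them, which an event handler can read. A sketch, assuming SnapInteractable exposes a SelectingInteractors collection in your SDK version (verify the property name and namespace against your copy of the SDK):

```csharp
using System.Linq;
using UnityEngine;
using Oculus.Interaction;

// Wired to the InteractableUnityEventWrapper's "When Select" UnityEvent.
// Assumes SnapInteractable (like other ISDK interactables) exposes the
// interactors currently selecting it -- check the name in your SDK version.
public class SnapSelectionListener : MonoBehaviour
{
    [SerializeField] private SnapInteractable snapInteractable;

    // Hook this method up to the wrapper's When Select event in the inspector.
    public void OnSnapSelected()
    {
        var interactor = snapInteractable.SelectingInteractors.FirstOrDefault();
        if (interactor != null)
        {
            // The interactor is a component, so its GameObject is reachable directly.
            Debug.Log($"Snapped by: {interactor.gameObject.name}", interactor.gameObject);
        }
    }
}
```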
Hello, I have a case where a parent object has the HandGrabInteractable component and I add some child colliders at runtime; the children are also HandGrabInteractable. If I later destroy those children at runtime (which I need in the game), the destroyed colliders are not removed from the parent's HandGrabInteractable.Colliders[] array, and as soon as you put your hand inside the parent's colliders you get a missing-collider error. The only fix I found was to add the following to HandGrabInteractable.cs:

    public void RefreshColliders()
    {
        Colliders = Rigidbody.GetComponentsInChildren<Collider>();
    }

and call it from my FixedUpdate whenever a child collider has been deleted. But I will need to re-apply that change every time I update the SDK, so it's not ideal. Is there a cleaner way to achieve the same result that I missed? Right now the colliders are only collected in HandGrabInteractable's Start() method; in my opinion it should track its children and refresh when they are destroyed. Thanks for your consideration.

Meta XR Interaction SDK Samples: once you touch the right controller, the right hand has an issue
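As an alternative to polling in FixedUpdate for the collider-refresh problem above, a small notifier component can trigger the refresh exactly when a child collider is destroyed. A sketch that still assumes the RefreshColliders patch described above, or some equivalent way to rebuild the collider list:

```csharp
using System;
using UnityEngine;

// Attach to each collider child you spawn at runtime. Invokes a callback when
// the child is destroyed, so the parent can rebuild its collider list once
// instead of checking every FixedUpdate.
public class ColliderDestroyNotifier : MonoBehaviour
{
    public event Action Destroyed;

    private void OnDestroy()
    {
        // Note: OnDestroy also fires on scene unload and application quit.
        Destroyed?.Invoke();
    }
}

// Usage when spawning a child collider (handGrabInteractable.RefreshColliders
// is the patched method from the post above):
//   var notifier = child.AddComponent<ColliderDestroyNotifier>();
//   notifier.Destroyed += handGrabInteractable.RefreshColliders;
```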
Hello,

Meta XR Interaction SDK OVR Samples 65.0.0, Unity 2022.3.31f1, Quest 3.

In the sample scenes HandGrabExamples and HandGrabUseExamples you can see the same wrong behavior: once your right controller becomes active (by picking it up and grabbing something) and you then put it down, you can no longer use your right hand to grab. Poke interaction still works, though. The left hand/left controller combination doesn't show this issue; it behaves normally. I'm sure it's something simple, but I tried to find the difference between the right and left sides without success. In these examples the problem only appears when play-testing in Unity (over Link cable); once built, it works. I haven't been able to grab anything with my right hand in my game since yesterday... Another issue: when using concurrent hands and controllers, pinching with the right hand shows the A button being pressed on the synthetic right controller. How do you turn that off? Thanks a lot for your help.

Multimodal without both controllers (using only one controller)
When testing the `ConcurrentHandsControllersExamples` example scene, I'm experiencing an issue where, if a controller isn't on and connected, the controller model renders in my hand where I would expect to see my tracked hand. Once I turn that controller on, the detached controller and my tracked hand appear as expected. Is it required that both controllers be on and connected to "properly" use multimodal mode? I have a use case where I'm using a controller as a tracker on a larger object but still want to use hand tracking to interact with UI elements, and this disables hand tracking on the affected hand. I don't see any mention of this requirement or behavior in https://developer.oculus.com/documentation/unity/unity-multimodal/. I do see a section describing "single controller gameplay" where I can use only one controller, but it doesn't mention that the unused controller must be on and connected.

How to set cursor size for a pointable canvas? Meta XR Interaction SDK
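For the multimodal question above, one hedged workaround while waiting for an answer is to hide a controller's visual whenever that controller reports as disconnected. This only hides the phantom model; it does not re-enable hand tracking if the runtime has disabled it for that hand. `controllerVisual` is a hypothetical reference to your controller model's root object.

```csharp
using UnityEngine;

// Workaround sketch: hide a controller's visual while that controller is not
// actually connected, so the tracked hand isn't obscured by a phantom model.
public class HideDisconnectedController : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.LTouch;
    [SerializeField] private GameObject controllerVisual; // your controller model root

    void Update()
    {
        bool connected = OVRInput.IsControllerConnected(controller);
        if (controllerVisual != null && controllerVisual.activeSelf != connected)
        {
            controllerVisual.SetActive(connected);
        }
    }
}
```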
I am experimenting with VR UI using the Meta Interaction SDK samples. After creating a Pointable Canvas and placing it 0.5 m in front of the camera, as shown in the attached image, the pointer becomes too large and obscures the UI. I have looked through the components, but I cannot figure out how to adjust the pointer size appropriately. If anyone knows, I would appreciate your guidance.

Issues with APK Build Rendering and Body Tracking on Quest 3 in Unity
Hello everyone, I'm currently working on a project in Unity for the Meta Quest 3, and I'm facing several challenges that I hope some of you can help me address.

Rendering issues in the APK build: When I run my scene directly from Unity on my PC connected to the Quest 3 via Link, everything renders correctly. However, when I create an APK and run it on the Quest 3, certain elements like water and body models do not render properly. The body model is not visible, and the water in the scene does not display at all.

Body tracking: Alongside the rendering issues, body tracking does not function as expected in the APK build. When connected to the PC it works fine, but it breaks down in the standalone APK.

Configuration details: Unity 2021.3.26f1; XR Plugin Management configured for Oculus; I've tried various settings for Android texture compression and multiview rendering; the scene includes body tracking and uses the OVR plugins. I've attached my Android build and XR Plugin Management settings for reference.

Has anyone else encountered similar issues, or can anyone offer insight into what might be going wrong? Any suggestions on how to make the APK build match the editor preview would be incredibly helpful. https://youtube.com/shorts/BzQq_aj-mfo Thank you in advance for your help!

Hand Tracking not working in release
Hi, I've set up my project with controllers as hands per https://developer.oculus.com/documentation/unity/unity-isdk-interaction-sdk-overview/ and, when run in development mode, all of my interactables work great. If I build in release mode, however, none of the interactables work. Does anyone have any ideas as to why that can occur? Thanks

Unable to upload to AppLab: error "This APK contains libInteractionSdk.so and targets Android SDK 31"
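Two common causes for the release-only failure above are aggressive managed code stripping (try a lower stripping level in Player Settings) and hand-tracking entries missing from the final AndroidManifest.xml. The manifest entries documented for Quest hand tracking are worth checking; verify them against the current Meta docs for your SDK version:

```xml
<!-- Inside the <manifest> element of AndroidManifest.xml: -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<uses-feature
    android:name="oculus.software.handtracking"
    android:required="false" />
```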
I am trying to upload a build to AppLab and am getting the error "This APK contains libInteractionSdk.so and targets Android SDK 31. This is currently unsupported; please set the target Android SDK to 29 or 30, or remove the Interaction SDK and try again." I am using Unity 2022.2.6f1 and Oculus Integration v50 (I also tried v49, with the same error). Is anyone else having this issue with Unity 2022.2? I will try an earlier version of Unity, but I can't find out which Unity versions use Android SDK 29 or 30.

How to force the Avatar hand to blend like the Interaction rig's hand when grabbing an object
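For the target-SDK error above, the fix is a Unity Player setting rather than anything inside the Interaction SDK: lower the Android target API level to 30. A small editor sketch (the menu path and class name are made up; the same setting lives in the UI under Player Settings > Android > Other Settings > Target API Level):

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Sketch: pin the Android target SDK to API level 30 to satisfy the
// AppLab check quoted above.
public static class TargetSdkFix
{
    [MenuItem("Tools/Set Android Target SDK 30")] // hypothetical menu path
    public static void SetTargetSdk30()
    {
        PlayerSettings.Android.targetSdkVersion = AndroidSdkVersions.AndroidApiLevel30;
    }
}
#endif
```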
I'm trying to have the Avatar hand (from the Avatar SDK) shape itself like the interaction rig's hands when grabbing an object. For example, if I put the avatar in the TouchGrabExamples sample scene, when I grab an object the interaction hand stops at the collider surface while the avatar's hand goes all the way through it. Does anyone have an idea how to solve this?
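For the avatar-hand question above, one rough, assumption-laden approach is to copy the synthetic hand's joint rotations onto the avatar hand each frame: the joint arrays below are hypothetical and must be mapped by hand, and the Avatar SDK may re-pose its skeleton after this runs, in which case script execution order matters.

```csharp
using UnityEngine;

// Very rough sketch: each frame, copy the ISDK synthetic hand's joint
// rotations onto matching Avatar hand joints, so the avatar hand stops at
// the collider like the interaction hand does. The joint mapping is
// app-specific; fill both arrays with corresponding bones in the inspector.
public class AvatarHandConformer : MonoBehaviour
{
    [SerializeField] private Transform[] syntheticHandJoints; // interaction rig's visual hand
    [SerializeField] private Transform[] avatarHandJoints;    // corresponding Avatar SDK bones

    void LateUpdate() // run after both hand systems have posed their skeletons
    {
        int count = Mathf.Min(syntheticHandJoints.Length, avatarHandJoints.Length);
        for (int i = 0; i < count; i++)
        {
            avatarHandJoints[i].rotation = syntheticHandJoints[i].rotation;
        }
    }
}
```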