TransformRecognizerActiveState is never activating
I haven't been able to get a Transform Recognizer Active State to register as active. I have no issues with Shape Recognizer Active State. Are there any additional steps required to get a TransformRecognizerActiveState to work? I have also examined it with an Active State Debug Tree UI, which confirms that the shape recognizer is activating but the transform recognizer is not. Here is my component setup: Can anyone see or guess what I might be doing wrong?
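One way to narrow this down is to log the state directly rather than relying on the Debug Tree UI. Below is a minimal polling sketch (not part of the SDK, and the class name is illustrative); it only assumes the IActiveState interface from Oculus.Interaction, which exposes a bool Active property.

```csharp
using UnityEngine;
using Oculus.Interaction;

// Minimal debugging sketch: logs every transition of any IActiveState so you
// can see whether the transform recognizer ever flips to Active. Drag the
// TransformRecognizerActiveState (or any other MonoBehaviour implementing
// IActiveState) into _activeStateBehaviour in the inspector.
public class ActiveStateLogger : MonoBehaviour
{
    [SerializeField]
    private MonoBehaviour _activeStateBehaviour;

    private IActiveState _activeState;
    private bool _wasActive;

    private void Awake()
    {
        _activeState = _activeStateBehaviour as IActiveState;
        if (_activeState == null)
        {
            Debug.LogError("Assigned component does not implement IActiveState");
        }
    }

    private void Update()
    {
        if (_activeState == null) return;

        bool isActive = _activeState.Active;
        if (isActive != _wasActive)
        {
            Debug.Log($"{_activeStateBehaviour.name} Active -> {isActive}");
            _wasActive = isActive;
        }
    }
}
```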
DllNotFoundException: InteractionSdk in samples scene
Hi, on a new project (tried on Unity 2021.3.43 and 2022.3.41) I installed Meta XR All-in-One SDK v68, completed the project setup, and downloaded the sample scenes. But when I try to launch one of them (TouchGrabExamples, for example), I always get this error:
DllNotFoundException: InteractionSdk assembly:<unknown assembly> type:<unknown type> member:(null) Oculus.Interaction.Input.Filter.HandFilter.Awake () (at Library/PackageCache/com.meta.xr.sdk.interaction@68.0.0/Runtime/Scripts/Input/Hands/DataModifiers/HandFilter.cs:144)
I have looked for hours for a solution. I have found many forum posts mentioning this error but no fix. What can I do to resolve it? Thank you in advance!
Hand gesture for initiating the Menu with the left hand (Start button) no longer triggers after v66
An OVRButtonActiveState listening for the START button, and even polling OVRInput.GetDown(OVRInput.Button.Start), no longer triggers from the left-hand pinch gesture. Controllers work fine, so this appears to be another issue with the updated Interaction SDK v67 and v68. Could someone confirm this is being worked on?
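For reference, a minimal repro sketch of the polling check described above (the class name is illustrative; OVRInput.GetDown and OVRInput.Button.Start are the standard OVRInput API and are updated automatically when an OVRManager is in the scene):

```csharp
using UnityEngine;

// Minimal repro probe for the menu-button check described in the post above.
// OVRInput.Button.Start is the left controller's menu button; per the post,
// the left-hand system pinch raised it as well prior to SDK v66.
public class StartButtonProbe : MonoBehaviour
{
    private void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.Start))
        {
            Debug.Log("Start (menu) button pressed this frame");
        }
    }
}
```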
App crashes on grab attempt using Interaction SDK
I'm using Meta's fork of Unreal Engine 4.5.3 with the Interaction SDK. I tried to set up an actor to be grabbable using the Interaction SDK, but whenever I pinch my object, the app crashes. The documentation is very vague on how to set this up, and I tried to copy the hierarchy from the sample project, but I don't know what is causing the crash. I'm testing in a packaged app on my Quest Pro; the sample project works as intended. I just added a BoxCollision to the actor IsdkInteractableWidget and, as a child of the BoxCollision, an IsdkGrabbable component, as well as an IsdkGrabbableAudio. The widget does work with hand poke and raytrace interactions, so I am confident in the IsdkHandRigComponentRight and Left components.
Accessing the GameObject associated with an Interactor from an Interactable (Unity)
I am using the Meta XR Interaction SDK in a Unity mixed reality app. I have a cube set up with a SnapInteractor and a snap point with a SnapInteractable. I have code running on the "When Select" event via an InteractableUnityEventWrapper, but I need a reference to the Interactor that has snapped to the Interactable. I don't see a way to do this in the documentation. What am I missing, or is there a reliable workaround if it is not officially supported? Thanks
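A sketch of one possible workaround follows. It assumes the IInteractableView interface exposes WhenSelectingInteractorViewAdded / WhenSelectingInteractorViewRemoved events, as in recent Interaction SDK versions, and that SnapInteractable implements that interface; verify both against the SDK version you have installed. The class name is illustrative.

```csharp
using UnityEngine;
using Oculus.Interaction;

// Sketch only: resolves which interactor is currently selecting (snapped into)
// the interactable, without going through the UnityEvent wrapper.
public class SnappedInteractorResolver : MonoBehaviour
{
    [SerializeField]
    private SnapInteractable _snapInteractable; // the snap point's interactable

    private void OnEnable()
    {
        IInteractableView view = _snapInteractable;
        view.WhenSelectingInteractorViewAdded += HandleSnapped;
        view.WhenSelectingInteractorViewRemoved += HandleUnsnapped;
    }

    private void OnDisable()
    {
        IInteractableView view = _snapInteractable;
        view.WhenSelectingInteractorViewAdded -= HandleSnapped;
        view.WhenSelectingInteractorViewRemoved -= HandleUnsnapped;
    }

    private void HandleSnapped(IInteractorView interactorView)
    {
        // Concrete interactors (SnapInteractor, etc.) are MonoBehaviours,
        // so the view can be cast back to reach the GameObject that snapped.
        if (interactorView is MonoBehaviour interactorBehaviour)
        {
            Debug.Log($"Snapped by {interactorBehaviour.gameObject.name}");
        }
    }

    private void HandleUnsnapped(IInteractorView interactorView)
    {
        Debug.Log("Interactor unsnapped");
    }
}
```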
Check if object is snapped
Hi everybody, I am trying to implement an interaction where the user (1) grabs an interactable object and snaps it to a predefined position, and then (2) has the option to translate-transform that same object to a specific position. I ran into a few problems/questions that I couldn't solve yet.
1. How can I check whether an object is snapped to a predefined snap location? Based on that, I want to deactivate and activate the relevant Grab Transformers and maybe snap locations.
2. How can I check whether an object has been transformed (through a one-grab translate transformer) to a specific position, and then lock it there while it is still grabbed? I want the object to be translate-transformable again, but only after a release and re-grab, not while it is kept grabbed (active).
Any help is greatly appreciated. I suppose I have to write a little code, which I don't mind, but a starting point would be of great help to me. Thank you 🙂
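For question 1, one possible starting point is to track the snapped state from the snap zone side and toggle whichever components matter for your setup. The sketch below is illustrative, not an SDK API: the idea is to wire SetSnapped(true) to the snap zone's "When Select" event and SetSnapped(false) to its "When Unselect" event on an InteractableUnityEventWrapper; which components you actually enable or disable while snapped depends on your hierarchy.

```csharp
using UnityEngine;

// Illustrative sketch: tracks whether something is snapped to this snap zone
// and toggles caller-chosen components accordingly. Wire SetSnapped(true/false)
// to the snap zone's select/unselect UnityEvents in the inspector.
public class SnapStateTracker : MonoBehaviour
{
    [SerializeField]
    private Behaviour[] _enableWhileSnapped;   // components that should only run while snapped

    [SerializeField]
    private Behaviour[] _disableWhileSnapped;  // components that should stop while snapped

    public bool IsSnapped { get; private set; }

    public void SetSnapped(bool snapped)
    {
        IsSnapped = snapped;
        foreach (var behaviour in _enableWhileSnapped) behaviour.enabled = snapped;
        foreach (var behaviour in _disableWhileSnapped) behaviour.enabled = !snapped;
    }
}
```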
Oculus.Interaction.HandGrab.HandGrabInteractable.Colliders needs a refresh feature?
Hello, I have a case where the parent has the HandGrabInteractable component and I add some child colliders at runtime; the children are also HandGrabInteractable. If I delete those children at runtime (which I need in the game), their colliders are not removed from the parent's HandGrabInteractable.Colliders[], and as soon as you put your hand inside the parent's colliders you get a missing-collider error. So Colliders[] is not updated when a child is destroyed. The only fix I found was to add
public void RefreshColliders() { Colliders = Rigidbody.GetComponentsInChildren<Collider>(); }
in HandGrabInteractable.cs and call it from my FixedUpdate whenever a child collider has been deleted. But I will need to reapply that change every time I update the SDK, so it's not ideal. Is there a cleaner way that I missed to achieve the same result? Right now the Colliders are only set in the Start() function of HandGrabInteractable; in my opinion it should subscribe to those children and refresh when they get destroyed. Thanks for your consideration
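A note on timing if you keep the RefreshColliders() patch above: Object.Destroy only takes effect at the end of the frame, so calling GetComponentsInChildren in the same frame still returns the collider being removed. A minimal sketch (the class and method names are illustrative, and RefreshColliders is the custom method from the post, not an SDK call):

```csharp
using System.Collections;
using UnityEngine;
using Oculus.Interaction.HandGrab;

// Sketch built on the RefreshColliders() patch described above: destroy the
// child, wait one frame so the destroy completes, then refresh the cached
// collider list on the parent HandGrabInteractable.
public class ChildColliderRemover : MonoBehaviour
{
    public IEnumerator RemoveChild(GameObject child, HandGrabInteractable interactable)
    {
        Destroy(child);
        yield return null;               // Destroy is deferred to end of frame
        interactable.RefreshColliders(); // the custom method added to the SDK above
    }
}

// Usage: StartCoroutine(GetComponent<ChildColliderRemover>().RemoveChild(childGo, interactable));
```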
Inconsistency in Interaction SDK versions
Hey, I recently started developing a Quest app in Unity, and when I was installing packages separately I noticed that the newest version of the Interaction SDK is for some reason not posted on the Asset Store; the latest there is 66.0.0. This caused me a bunch of weird errors, and I ended up going to Meta's website to download 67.0.0 directly.
Meta XR Interaction SDK Samples: once you touch the right controller, the right hand has issues
Hello, Meta XR Interaction SDK OVR Samples 65.0.0, Unity 2022.3.31f1, Quest 3. In the sample scenes HandGrabExamples and HandGrabUseExamples you can see the same wrong behavior: once your right controller becomes active by picking it up and grabbing something, and you then put it down, you cannot use your right hand to grab anymore. The poke interaction still works, though! The left hand / left controller combination doesn't show that issue; it behaves normally. I'm quite sure it's something simple; I tried to find the difference between the right and the left without any success. In all of those examples it only appears in Play mode in Unity (over Link cable); once I build, it works. I haven't been able to grab anything with my right hand in my game since yesterday... Another issue I have: when using concurrent hands and controllers, pinching with the right hand shows the A button being pressed on the synthetic right controller. How do you turn that off? Thanks a lot for your help.
Multimodal without both controllers (using only one controller)
When testing the `ConcurrentHandsControllersExamples` example scene, I'm experiencing an issue where, if a controller isn't on and connected, the controller model renders in my hand where I would expect to see my tracked hand. Once I turn that controller on, the detached controller and my tracked hand appear as expected. Is it required that both controllers be on and connected to "properly" use multimodal mode? I have a use case where I'm using a controller as a tracker on a larger object but still want to use hand tracking to interact with UI elements, and this disables hand tracking on the affected hand. I don't see any mention of this requirement or behavior in https://developer.oculus.com/documentation/unity/unity-multimodal/. I do see a section describing "Single controller game play" where I can use only one controller, but it doesn't mention that the unused controller must be on and connected.