Update on micro gesture support for hands in ISDK
It was announced at Connect 2024 last month, but I have yet to see any sign of it. I am talking about the microgestures performed with the thumb against the index finger (similar to d-pad gestures), described in Meta's research paper: https://dl.acm.org/doi/10.1145/3613904.3642702. Here is the Connect video: "Deepen immersion with state-of-the-art interactions and sensory capabilities | Meta Connect 2024 | Meta for Developers". Any update on it, or did I miss it somehow?
Meta Interaction SDK: UseInteractable - Requirements to use or any documentation available?
Hi, I'm working with the Meta Interaction SDK (v62.0.0) and trying to get the UseInteractable component working on something other than the example spray bottle. I've been referencing the spray bottle closely; for my custom object, I've written a script that implements IHandGrabUseDelegate and assigned it to the UseInteractable's "Hand Use Delegate" field. The object has HandGrabInteractables, etc., similar to the spray bottle. However, the interface methods (BeginUse, EndUse, and ComputeUseStrength) are never called. I'm wondering if I'm missing something (I probably am) and whether there is any documentation on the steps and requirements for the Use interaction to work. Thanks for any help!
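For reference, a minimal sketch of the delegate, assuming the IHandGrabUseDelegate interface in Oculus.Interaction.HandGrab has the same shape as in the spray-bottle sample (the class name and the remapping curve are my own additions):

```csharp
using Oculus.Interaction.HandGrab;
using UnityEngine;

// Hypothetical delegate implementation; assigned to the UseInteractable's
// "Hand Use Delegate" field alongside the HandGrabInteractable setup.
public class UseStrengthHandler : MonoBehaviour, IHandGrabUseDelegate
{
    [SerializeField]
    private AnimationCurve _strengthCurve = AnimationCurve.Linear(0f, 0f, 1f, 1f);

    // Called once when the use gesture starts on the grabbed object.
    public void BeginUse()
    {
        Debug.Log("BeginUse");
    }

    // Called once when the use gesture ends.
    public void EndUse()
    {
        Debug.Log("EndUse");
    }

    // Called every frame while the object is in use; remaps the raw
    // finger strength (0..1) before it drives the use pose blend.
    public float ComputeUseStrength(float strength)
    {
        return _strengthCurve.Evaluate(strength);
    }
}
```

One educated guess, not something the documentation confirms: the use callbacks may only fire while the object is held by a hand-tracking HandGrabInteractor rather than a controller grab, since the use strength is derived from finger curl.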
Meta XR Interaction SDK Essentials hand tracking on PICO 4 Ultra
Hello everyone. My setup:
Headset: Pico 4 Ultra
Unity: 2022.3.49f1
Meta XR Interaction SDK Essentials 69.0.1
Pico OpenXR plugin 1.3.3
Unity OpenXR Plugin 1.12.1
XR Interaction Toolkit 2.6.3
XR Hands 1.4.3
I am testing the new version of the Interaction SDK on a non-Meta headset, and it works very well with controllers. When I switch to hand tracking, the hands appear and their positions are tracked correctly, but when I close my hand nothing happens: no animations. In Project Settings > XR Plug-in Management > OpenXR, I enabled the Hand Interaction Profile, and under All Features I have the Hand Tracking Subsystem, the Pico OpenXR features, and Pico support activated. I really don't see what I forgot to do. Thank you in advance for your answers. Edit: if I deactivate the UnityXRControllerHands prefab and leave UnityXRHands active inside UnityXRCameraRigInteraction, hand tracking works. Apparently the rig fails to hand over properly from controllers to hand tracking in the application?
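A possible workaround for the handover described in the edit: toggle the two prefabs yourself from the XR Hands tracking state. This is only a sketch; the prefab names are the ones mentioned above, and it assumes the XR Hands package (com.unity.xr.hands) is supplying the subsystem:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Enables the tracked-hands prefab when XR Hands reports a tracked hand,
// otherwise falls back to the controller-driven hands.
public class HandControllerSwitcher : MonoBehaviour
{
    [SerializeField] private GameObject _controllerHands; // UnityXRControllerHands
    [SerializeField] private GameObject _trackedHands;    // UnityXRHands

    private XRHandSubsystem _handSubsystem;
    private static readonly List<XRHandSubsystem> s_subsystems = new();

    private void Update()
    {
        if (_handSubsystem == null || !_handSubsystem.running)
        {
            SubsystemManager.GetSubsystems(s_subsystems);
            _handSubsystem = s_subsystems.Count > 0 ? s_subsystems[0] : null;
        }

        bool handsTracked = _handSubsystem != null && _handSubsystem.running &&
                            (_handSubsystem.leftHand.isTracked ||
                             _handSubsystem.rightHand.isTracked);

        if (_controllerHands.activeSelf == handsTracked)
        {
            _controllerHands.SetActive(!handsTracked);
            _trackedHands.SetActive(handsTracked);
        }
    }
}
```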
App crashes when grabbing attempt using Interaction SDK
I'm using Meta's fork of Unreal Engine 5.4.3 with the Interaction SDK. I tried to set up an actor to be grabbable using the Interaction SDK, but whenever I pinch my object, the app crashes. The documentation is very vague on how to set this up; I tried to copy the hierarchy from the sample project, but I don't know what is causing the crash. I'm testing a packaged app on my Quest Pro, and the sample project works as intended. I added a BoxCollision to the IsdkInteractableWidget actor, and as children of the BoxCollision I added an IsdkGrabbable component and an IsdkGrabbableAudio. The widget does work with hand poke and raytrace interactions, so I am confident in the IsdkHandRigComponentRight and IsdkHandRigComponentLeft components.
[Unity] Can't use controller to select button inside UI scroll
Hello, I'm trying to use a Unity UI scroll view with UI buttons inside it, but I can't get them to work with the controllers, even though they work fine with hand gestures. The problem is the sensitivity with which the button deselects: if I hold the controller perfectly still, it's possible to select a button, but in normal use the simple act of pressing the controller button moves it a little, and the Quest only registers the scroll movement, not the click.
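One common Unity UI fix for this symptom (hedged: I haven't verified it against the Interaction SDK's pointer pipeline specifically) is to raise the EventSystem's pixel drag threshold, so a few pixels of jitter during the trigger press still count as a click rather than the start of a scroll drag:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Raises the drag threshold; the default (~10 px) is tuned for mice,
// not for a ray that jitters when the trigger is squeezed.
public class DragThresholdTuner : MonoBehaviour
{
    [SerializeField] private int _dragThresholdPixels = 20; // tune per canvas scale

    private void Start()
    {
        if (EventSystem.current != null)
        {
            EventSystem.current.pixelDragThreshold = _dragThresholdPixels;
        }
    }
}
```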
Implement Multiple Hand Grab Pose Options for the Same Object in Meta XR SDK
Hi folks, I'm new to XR development and currently working on a grabbable object that should respond with different hand grab poses depending on the rotation at which the player holds it. I'm struggling to find any documentation or resources on how to achieve this, and I'm a bit stuck at the moment. If anyone has experience with this or can point me in the right direction, I would really appreciate your help! Thanks in advance! Ofir
Error: "Saving Prefab to immutable folder is not allowed" in Unity with Meta All-In-One SDK
I am setting up a fresh Unity project for a VR project that uses the Meta All-In-One SDK. I created a new 3D built-in render pipeline project and imported the Meta All-In-One SDK (v68, but this also happens with v66) from the Package Manager. After the package is imported, I apply all the recommended fixes, set up the build for Android, and select Oculus as the XR plug-in provider. Right after these operations, as soon as I save the project, an error pops up:
Saving Prefab to immutable folder is not allowed: Packages/com.meta.xr.sdk.interaction.ovr/Editor/Blocks/Interactors/Prefabs/[BB] Hand Ray.prefab
UnityEditor.EditorApplication:Internal_CallGlobalEventHandler ()
This happens mainly with two objects: [BB] Hand Ray and OVRCameraRigInteraction. Importantly, no game objects or prefabs were added to the scene, which contains only the Main Camera and a Directional Light. This is a problem because it breaks Build and Run, producing errors when building the Player. I also tried another, more developed project: the same error occurs there, but Play mode over Quest Link is not affected at all and the app works perfectly when not built. Has anybody had this same problem? If so, did anyone come up with a solution? I would greatly appreciate any help. Thanks a lot.
Blocky, terrible realtime shadows in Quest app built in Unity
Hello, I am building an architectural visualization app for Meta Quest devices using Unity 2022 and Meta Interaction SDK v68. I am using realtime lighting, and the user manipulates the time of day with a slider to see how their apartment would look under different light conditions. No matter what light or shadow settings I use, the shadows in the headset build look blocky and pretty hideous. I made sure to:
- remove the window glass with its transparent material, which could have affected the quality of the light coming through;
- adjust the URP settings to the highest shadow resolution, with the cascades tuned for the highest-quality shadows within 25 meters;
- adjust the light/skylight settings for the highest shadow quality and soft shadows.
When I manipulate the direction of the directional light in the editor, I get flawless shadows, yet the build suffers from blocky, glitchy-looking light. I would be grateful for any insight into why this is happening and whether there is a fix. Best,
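For context, the time-of-day control plus a runtime clamp on the shadow distance look roughly like this. A sketch only: it assumes a single directional light driven by a UGUI slider (names are illustrative), and it leans on the fact that a shorter URP shadow distance concentrates shadowmap texels over less area, which generally reduces blockiness:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using UnityEngine.UI;

// Drives the sun from a 0..1 slider and keeps the URP shadow distance short.
public class TimeOfDayController : MonoBehaviour
{
    [SerializeField] private Light _sun;          // the scene's directional light
    [SerializeField] private Slider _timeSlider;  // 0 = sunrise, 1 = sunset
    [SerializeField] private float _shadowDistance = 25f;

    private void Start()
    {
        // Long shadow distances spread the shadowmap thin; clamp at runtime
        // in case the quality tier used on device overrides the asset.
        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
        {
            urp.shadowDistance = _shadowDistance;
        }
        _timeSlider.onValueChanged.AddListener(OnTimeChanged);
    }

    private void OnTimeChanged(float t)
    {
        // Sweep the sun from horizon to horizon.
        _sun.transform.rotation = Quaternion.Euler(Mathf.Lerp(0f, 180f, t), -30f, 0f);
    }
}
```

If the editor looks fine but the device does not, it is also worth confirming which quality level the Android build actually uses, since a lower tier can silently swap in a lower-resolution shadow atlas.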
RayInteractor -> Data (Optional field)
Hello, I'm working on pointer events, and the RayInteractor's Data field seems to be exactly what I need. Reading the helper text gives me the impression that I can assign an object to the RayInteractor to identify and match an event with that RayInteractor, which is what I want. But I don't understand how to use this Data field. It only takes an Object, meaning it could be pretty much anything from a sprite to a GameObject, and I don't see how that works. The documentation doesn't tell me much; perhaps someone understands it better than I do: https://developer.oculus.com/reference/unity/v67/class_oculus_interaction_ray_interactor/ Does anyone have an idea?
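My best reading of the field, unverified: any Object works precisely because the SDK never inspects it. It only carries the reference through, so you assign something unique per interactor (a ScriptableObject asset, the interactor's own GameObject) and compare by reference when an interactable raises a pointer event. A sketch under that assumption, i.e. that the inspector-assigned Data comes back as PointerEvent.Data:

```csharp
using Oculus.Interaction;
using UnityEngine;

// Matches pointer events back to a specific RayInteractor via its Data tag.
public class RaySourceIdentifier : MonoBehaviour
{
    [Tooltip("The same Object assigned to the right RayInteractor's Data field.")]
    [SerializeField] private Object _rightRayTag;

    [SerializeField] private RayInteractable _target;

    private void OnEnable() => _target.WhenPointerEventRaised += OnPointerEvent;
    private void OnDisable() => _target.WhenPointerEventRaised -= OnPointerEvent;

    private void OnPointerEvent(PointerEvent evt)
    {
        if (evt.Type == PointerEventType.Select && ReferenceEquals(evt.Data, _rightRayTag))
        {
            Debug.Log("Selected by the right-hand ray interactor");
        }
    }
}
```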
Everything breaks after updating SDK v64 to v67
I have been having problems implementing passthrough, so I thought that updating the SDK version would help. The Core and Interaction SDKs update just fine, but the All-In-One update completely breaks the player controller and the hands. The scripts turn into missing "(Script)" references, and when I try to reattach them I get an error popup, so I can't fix it myself. It seems other people have had this same problem, but I haven't found any solutions yet.