Unity 6.1 + Meta Quest + Building Blocks + Simple Interactable
Probably so simple... but I've been stuck here for days. Steps so far:

- Started a new project in Unity 6.1 with the "Mixed Reality (MR) core" template
- Switched to the Android platform in Build Profiles
- Imported the Meta XR All-in-One package
- Imported the Meta MR Utility Kit package
- Removed all items from my sample scene
- Ran Meta XR Tools -> Project Setup Tool
- Selected the OpenXR Plugin in XR Plug-in Management
- Added the Building Blocks "Camera Rig" + "Passthrough" + "Controller Tracking" + "Ray Interaction"
- Placed a game object (mesh + material) in the scene and added a collider and a "Ray Interactable" script to it

Up to this point everything runs fine: my object appears in the scene, passthrough works well, and I can use my hands or the controllers to cast a ray that hits the object and shows the expected cursor.

What I cannot achieve is detecting ray events on the object: I would like to change the color of the object's material when the ray enters the object, or make the object disappear when I press the controller trigger while the cursor is on it (I guess the 'select' or 'activate' event; I tried both). So simple... but so hard to achieve...

I tried adding an "XR Simple Interactable" script at the same level as the "Ray Interactable", but it does not work. Is there anything wrong in my project config? "Project Validation" in XR Plug-in Management shows no errors or warnings. Can anyone shed some light on this? Maybe a free or paid tutorial on how to configure the environment for this case? Many thanks in advance!
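A likely cause, offered here as an assumption rather than a confirmed diagnosis: "XR Simple Interactable" belongs to Unity's XR Interaction Toolkit, which is a separate system from Meta's Interaction SDK, so its events never fire for a Meta "Ray Interactable". With the Meta Interaction SDK you can instead subscribe to the interactable's pointer events in a small script. The component below is a minimal sketch assuming the `Oculus.Interaction` namespace and its `WhenPointerEventRaised` event; the SDK's InteractableUnityEventWrapper component offers a similar no-code route via UnityEvents.

```csharp
using Oculus.Interaction;
using UnityEngine;

// Hypothetical example: reacts to Meta Interaction SDK pointer events on a
// RayInteractable. Attach next to the RayInteractable; component and field
// names here are assumptions, not the only possible setup.
public class RayEventColorChanger : MonoBehaviour
{
    [SerializeField] private RayInteractable _interactable;
    [SerializeField] private Renderer _renderer;
    private Color _originalColor;

    private void Awake()
    {
        _originalColor = _renderer.material.color;
    }

    private void OnEnable()
    {
        _interactable.WhenPointerEventRaised += HandlePointerEvent;
    }

    private void OnDisable()
    {
        _interactable.WhenPointerEventRaised -= HandlePointerEvent;
    }

    private void HandlePointerEvent(PointerEvent evt)
    {
        switch (evt.Type)
        {
            case PointerEventType.Hover:   // ray entered the object
                _renderer.material.color = Color.red;
                break;
            case PointerEventType.Unhover: // ray left the object
                _renderer.material.color = _originalColor;
                break;
            case PointerEventType.Select:  // trigger pressed while hovered
                gameObject.SetActive(false);
                break;
        }
    }
}
```

Assign the RayInteractable and the object's Renderer in the Inspector; Hover/Unhover drive the color change, and Select fires when the trigger is pressed while the cursor is on the object.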
TransformRecognizerActiveState is never activating

I haven't been able to get a Transform Recognizer Active State to register as active. I have no issues with Shape Recognizer Active State. Are there any additional steps required to get a TransformRecognizerActiveState to work? I have also examined it with an Active State Debug Tree UI, which confirms that the shape recognizer is activating but the transform recognizer is not. Here is my component setup (screenshot omitted). Can anyone see or guess what I might be doing wrong?
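One thing worth checking (an assumption based on how the shape recognizer is wired, not a confirmed fix): shape recognition reads finger features from a FingerFeatureStateProvider, and the transform recognizer analogously needs a TransformFeatureStateProvider on the hand, referenced from the TransformRecognizerActiveState. A small polling helper can at least confirm at runtime whether the state ever goes active. This is a sketch; `IActiveState` and the `Interface` attribute come from the `Oculus.Interaction` namespace.

```csharp
using Oculus.Interaction;
using UnityEngine;

// Hypothetical debugging helper: polls any IActiveState (e.g. a
// TransformRecognizerActiveState) and logs transitions, to narrow down
// whether the state ever activates at runtime.
public class ActiveStateLogger : MonoBehaviour
{
    // Assign the TransformRecognizerActiveState component in the Inspector.
    [SerializeField, Interface(typeof(IActiveState))]
    private Object _activeStateObject;

    private IActiveState _activeState;
    private bool _wasActive;

    private void Awake()
    {
        _activeState = _activeStateObject as IActiveState;
    }

    private void Update()
    {
        if (_activeState.Active != _wasActive)
        {
            _wasActive = _activeState.Active;
            Debug.Log($"{_activeStateObject.name} active: {_wasActive}");
        }
    }
}
```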
DllNotFoundException: InteractionSdk in samples scene

Hi, on a new project (tried on Unity 2021.3.43 and 2022.3.41), I install Meta XR All-in-One SDK v68, do all the project setup, and download the sample scenes. But when I want to launch one (TouchGrabExamples, for example), I always get this error:

DllNotFoundException: InteractionSdk assembly:<unknown assembly> type:<unknown type> member:(null)
Oculus.Interaction.Input.Filter.HandFilter.Awake () (at Library/PackageCache/com.meta.xr.sdk.interaction@68.0.0/Runtime/Scripts/Input/Hands/DataModifiers/HandFilter.cs:144)

I have looked for hours for a solution to this problem. I have found a lot of forum posts mentioning this error but no fix. What can I do to fix it? Thank you in advance!
Check if object is snapped

Hi everybody, I am trying to implement an interaction where the user (1) grabs an interactable object and snaps it to a predefined position and then (2) has the option to translate-transform that same object to a specific position. I encountered a few problems/questions that I couldn't solve yet.

1. How can I check if an object is snapped to a predefined snap location? I want to be able to then deactivate and activate the relevant Grab Transformers and maybe snap locations.
2. How can I check if an object is transformed (through a One Grab Translate Transformer) to a specific position, and then lock it there while still active? I want the object to be translate-transformable again, but only on a release and re-grab, not while it is kept grabbed (active).

Any help is greatly appreciated. I suppose I have to code a little, which I don't have a big problem with, but a starting point would be of great help to me. Thank you 🙂
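For question 1, a possible starting point (a sketch under the assumption that the snap zone is a Meta Interaction SDK SnapInteractable; the event and type names below come from the SDK's IInteractableView API, so verify them against your installed version):

```csharp
using Oculus.Interaction;
using UnityEngine;

// Hypothetical sketch: a SnapInteractable (the snap zone) exposes its
// interaction state, so "is something snapped here?" can be read off the
// interactable's state changes.
public class SnapZoneMonitor : MonoBehaviour
{
    [SerializeField] private SnapInteractable _snapZone;

    public bool IsOccupied { get; private set; }

    private void OnEnable()
    {
        _snapZone.WhenStateChanged += HandleStateChanged;
    }

    private void OnDisable()
    {
        _snapZone.WhenStateChanged -= HandleStateChanged;
    }

    private void HandleStateChanged(InteractableStateChangeArgs args)
    {
        // Select means a SnapInteractor has snapped its object into this zone.
        IsOccupied = args.NewState == InteractableState.Select;
    }
}
```

The same WhenStateChanged hook is a natural place to enable or disable the Grab Transformers mentioned in the question.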
Inconsistency in Interaction SDK Versions

Hey, I recently started developing a Quest app in Unity, and while installing packages separately I noticed that the newest version of the Interaction SDK is for some reason not posted on the Asset Store: the latest there is 66.0.0. This caused me a bunch of weird errors, and I ended up going to Meta's website to download 67.0.0 directly.
SOLVED: Hand Tracking not working in Unity Editor or Windows PC Build

EDIT: This was solved in the v62 update! https://communityforums.atmeta.com/t5/Announcements/Meta-Quest-build-62-0-release-notes/ba-p/1145169

I have been attempting to use Hand Tracking over either Air Link or Quest Link for use in a Windows PC build. After setting up a project and playing a sample scene, the tracked hands are not visible. Hand Tracking works on the device in the Quest OS and during Quest Link. When the Unity app is running and my palms are facing the headset, the Oculus menu buttons are visible but not the hand mesh.

Steps to reproduce:
- Create a new Unity project.
- Install Oculus Integration and XR Plugin Management (select Oculus as the provider).
- Open any Hand Tracking supported scene (I am mainly interested in the Interaction SDK).
- Hands will not be visible. Depending on the scene, the hand mesh can be seen not moving from its initial position.

Tested on multiple computers (Windows 10 & 11) and multiple devices (Quest 2 & Pro). Both Quest devices are on v47. I have tested this with Oculus Integration v46 and v47.
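For anyone triaging a similar setup, a small diagnostic can show whether hand data is reaching the app at all (a sketch assuming the scene uses OVRHand from the Oculus Integration; logging every frame is noisy, so keep it for debugging only):

```csharp
using UnityEngine;

// Hypothetical diagnostic for hand tracking over Link: logs whether each
// OVRHand is delivering tracked data to the app. If IsTracked stays false
// while hands work in the Quest OS, the data is not reaching the runtime.
public class HandTrackingDiagnostic : MonoBehaviour
{
    [SerializeField] private OVRHand _leftHand;
    [SerializeField] private OVRHand _rightHand;

    private void Update()
    {
        Debug.Log($"L tracked: {_leftHand.IsTracked} (valid: {_leftHand.IsDataValid}) | " +
                  $"R tracked: {_rightHand.IsTracked} (valid: {_rightHand.IsDataValid})");
    }
}
```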
Interaction SDK Samples - Catch object

In the TouchGrab sample scene I can pick up the chess piece with one hand and then drop it into my other hand. I copied this chess piece into the HandGrab sample scene to build a combined test scene, but there it falls straight through the hand. I tried comparing the two OVRHands objects and their children in the OVRCameraRig, but I can't see any differences. Can anyone tell me what makes the catching possible? Thanks
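A guess at the difference (an assumption, not a verified answer): catching requires physics colliders on the hand, and the TouchGrab scene enables hand physics capsules while a plain HandGrab rig may not. A quick runtime check for the capsules generated by OVRSkeleton (names assumed from the Oculus Integration API):

```csharp
using UnityEngine;

// Hypothetical check: logs how many physics capsules OVRSkeleton created
// for a hand. Zero capsules would explain objects falling through the hand.
public class HandColliderCheck : MonoBehaviour
{
    [SerializeField] private OVRSkeleton _handSkeleton;
    private bool _logged;

    private void Update()
    {
        // Capsules are created asynchronously, so wait for initialization.
        if (_logged || !_handSkeleton.IsInitialized) return;
        int count = _handSkeleton.Capsules?.Count ?? 0;
        Debug.Log($"{name}: {count} physics capsules on hand skeleton");
        _logged = true;
    }
}
```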
Interaction SDK Teleport change trigger mapping

I am trying to change the controller mapping for Teleport using the Meta Interaction SDK Locomotion. I can see it is set in the Joystick Axis 2D Active State on the Selector and TeleportActiveState. But I would like to use buttons 2 and 4 to trigger Teleport. I used OVRButtonActiveState instead of the Axis2DActiveState scripts, but then the arc doesn't render. Has anyone been able to change the teleport button to something else? Having it on the joystick creates a lot of undesired teleports, and users complain a lot about it.
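One way to approach this (a sketch, not a verified fix): the Selector consumes any component implementing the Interaction SDK's IActiveState interface, so a tiny custom active state can read the buttons through OVRInput. If the arc still does not render, the arc/visuals may hold their own reference to the Joystick Axis 2D Active State that also needs swapping (an assumption worth checking).

```csharp
using Oculus.Interaction;
using UnityEngine;

// Hypothetical replacement active state: drives teleport from controller
// buttons instead of the joystick. IActiveState is the interface the
// locomotion rig's selectors consume; OVRInput.Button is a flags enum, so
// combined values mean "any of these buttons".
public class ButtonTeleportActiveState : MonoBehaviour, IActiveState
{
    [SerializeField]
    private OVRInput.Button _buttons = OVRInput.Button.Two | OVRInput.Button.Four;

    public bool Active => OVRInput.Get(_buttons);
}
```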
Is the Interaction SDK enough for a VR App? would it be better to start from the All-In-One?

I still need to find my way through all the SDKs proposed by Meta. I'm a little bit confused about which one to pick... from what I can see, the Interaction SDK is enough to create a VR experience without installing the All-in-One pack... but I have a couple of questions:

- Core XR SDK: the name sounds like something that MUST be included. I haven't included it in my project, but I can compile and run my app without any issue. Is this just a misleading name?
- Audio XR SDK: could I use the default Unity audio system, or should I use this SDK? What's your take?
- Interaction SDK: am I correct in saying that this is the only SDK I need for a VR app (capable of managing hand tracking and controllers)?

In general I feel that for my app I don't need all the stuff included in the All-in-One. Big_Flex, could you help with this question 🙂 ?
How to use ReticleLine on objects that aren't DistantHandGrabInteractables

I'm following this tutorial on creating ghost line reticles: https://developer.oculus.com/documentation/unity/unity-isdk-create-ghost-reticles/#ghost-line

The ReticleLine works great with objects set up as Distant Hand Grab Interactables. Is there a way to use it for objects that aren't grabbable? E.g. I'd like to use the ReticleLine behaviour to detect hover on game objects without including any grabbable behaviour. Thanks!
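A possible route (a sketch; it assumes the target object still carries an Interaction SDK interactable such as a RayInteractable, just without any Grabbable attached): hover can be read from the generic IInteractableView state changes, with no grab components involved.

```csharp
using Oculus.Interaction;
using UnityEngine;

// Hypothetical sketch: hover detection without any grab behaviour. Any
// Interaction SDK interactable (e.g. a RayInteractable with no Grabbable)
// reports Hover/Unhover through IInteractableView.WhenStateChanged.
public class HoverOnlyDetector : MonoBehaviour
{
    [SerializeField, Interface(typeof(IInteractableView))]
    private Object _interactableObject;

    private IInteractableView _interactable;

    private void Awake()
    {
        _interactable = _interactableObject as IInteractableView;
    }

    private void OnEnable()
    {
        _interactable.WhenStateChanged += HandleStateChanged;
    }

    private void OnDisable()
    {
        _interactable.WhenStateChanged -= HandleStateChanged;
    }

    private void HandleStateChanged(InteractableStateChangeArgs args)
    {
        if (args.NewState == InteractableState.Hover)
            Debug.Log($"{name} hovered");
        else if (args.PreviousState == InteractableState.Hover)
            Debug.Log($"{name} unhovered");
    }
}
```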