Quest 3 lost all tracking
Brand new Quest 3 just arrived; I configured it and everything was going pretty well. I was just logging into a game when all tracking stopped working. A window saying "Tracking lost" showed up, but I couldn't click anything since the controllers had stopped working too, and hand tracking wasn't working either. I tried cleaning the cameras with a very soft towel, rebooting, resetting to factory settings, everything I could. The screen is either completely black, black with a dot, or on the starting screen asking to turn on the controllers, but the view just moves randomly and doesn't follow the headset movements at all. There is a white dot blinking on the left controller, and the controllers keep vibrating every few seconds.
Quest 3 keeps turning off and on by itself repeatedly, but when back on, no controllers/hands work
Quest 3 manufactured in December 2023 keeps turning off and on during use, usually after using a data cable, either from my PC or from my piano's USB output (for MIDI, which incidentally I can't get to transmit). Each time it comes back on, I'm presented with the PIN screen, but I'm only able to move the pointer with my headset, not with my controllers or hand tracking. If I move my head more than 30 degrees I can no longer move the pointer, and if I press a button on a controller I can no longer move that controller; however, repeatedly pressing the button after the pointer stops moving still causes the on-screen pointer dot to flash in response. A few seconds later it will turn off and then repeat the above process. Powering the device off and restarting it seems to let it work for a little while before it happens again. I'm not doing anything intensive in it, just Meta Workrooms, and the first time it happened was in PianoVision. The battery is fully charged. I've only had the unit two days. It's not overheating, because the first time it happened I'd only just turned it on. The strange thing is, since installing Meta Remote Desktop on the PC, I experienced a similar issue with my PC when trying to take it out of sleep mode this morning. I think this might be coincidental, though, because I wasn't using Air Link when it first happened in PianoVision. Could the repeated going in and out of standby and the failure to receive MIDI be caused by a faulty USB-C socket on the Quest 3?

SOLVED: Hand Tracking not working in Unity Editor or Windows PC Build
EDIT: This was solved in the v62 update! https://communityforums.atmeta.com/t5/Announcements/Meta-Quest-build-62-0-release-notes/ba-p/1145169

I have been attempting to use Hand Tracking over either Air Link or Quest Link for use in a Windows PC build. After setting up a project and playing a sample scene, the tracked hands are not visible. Hand Tracking works on the device in the Quest OS and during Quest Link. When the Unity app is running and my palms are facing the headset, the Oculus menu buttons are visible, but not the hand mesh.

Steps to reproduce:
1. Create a new Unity project.
2. Install Oculus Integration and XR Plugin Management (select Oculus as the provider).
3. Open any Hand Tracking supported scene (I'm mainly interested in the Interaction SDK).
4. Hands will not be visible. Depending on the scene, the hand mesh can be seen not moving from its initial position.

Tested on multiple computers (Windows 10 & 11) and multiple devices (Quest 2 & Pro). Both Quest devices are on v47. I have tested this with Oculus Integration v46 and v47.

[Hand Tracking] New Oculus Start Button
Hello, with the last update (v17), I've noticed that there's now a Start button showing on the left hand, like the Oculus button on the right hand. I've also noticed that Waltz of the Wizards uses this button to trigger a settings menu. My question is: which script manages this new button? I searched OVRHand.cs but didn't find anything. I also tried OVRInput, like I would if I wanted to handle the Start button with a Touch controller, but it didn't seem to work. I want to use this feature to create a menu for my game, but haven't found any documentation about it so far. Has anyone found where it is generated in Unity and how we can customize it? Have a good day, everyone.

Enable hand and controller tracking at the same time.
Hi, I have an Oculus Quest Pro and am working on a Unity project that needs hand tracking and controller tracking for a physical object, but I can't enable hand and controller tracking at the same time. So I wonder: is this possible? Or is there any other way to track a physical object using Oculus?

Setting up Multimodal hands & controllers
Hi, I am trying to run the Multimodal feature outlined in Meta's recent blog posts (https://www.uploadvr.com/meta-quest-pro-simultaneous-hands-controllers-mode/). My device is a Quest Pro with OS version 57.0.0.261.669 and Quest Pro controllers. I am working in Unity 2022.3.10. So far, I am failing to make this feature work. The settings of my OVRManager are: [screenshot]

When I try to run it on the device, I get the following message in the logs: "Failed to set multimodal hands and controllers mode!" Do you have any idea what I am doing wrong?

Hand Tracking over Link still not working in Unity Editor
Hi, I have spent the last two years developing for the Quest 2, and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss.

How is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused as to how Meta envisions developers taking full advantage of their (really good) hand tracking tech when there is constant inconsistency and fumbling around, trying seven different versions of all the little SDKs, components, etc. Can someone please advise me on how to achieve a simple Unity scene using the standard Oculus Integration, where I can just click "Play" in the editor and get hand tracking working over my Link cable?

So far I have gone through five different Unity versions from 2021-2023, even more Oculus Integration SDK versions (v50-57), and three different headsets (Quest 1, 2 and 3). Nothing worked. The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity.

Three things. First, this option is no longer available in the later Oculus Integration versions. Second, selecting this option explicitly disables the possibility of building for the Quest 2 and 3, so you'd have to switch back and forth between the old LibOVR+VRAPI and the newer OpenXR integration just to get hand tracking working in the editor. Really? Third, we as developers cannot reasonably be expected to stick to this legacy API, as all of the newer mixed reality features, like scene understanding, spatial anchors, etc., are not supported in the old version.

Hence I want to ask my question one more time: how does Meta expect people to develop for their platform?
Please let me know if you have an answer to this dilemma; I am grateful for any pointers! Note: I am explicitly talking about hand tracking through the Unity Editor using Link; in standalone Android builds it works fine and it's amazing to use! Best, Max

Use Take Recorder to record animations from tracked hands
Hi everyone, I have hand tracking activated and am trying to record the hand components using the Take Recorder plugin. My goal is to capture hand animations which I can later apply to an NPC without having to manually animate hand poses. I was able to record animations from the official HandSample when using the controllers to manipulate the hand mesh. But when switching to hand tracking, the hand poses are not recorded; instead, only the rotation and position of the hand component are stored in the sequence. I hope I managed to express what I am trying to achieve in a comprehensible manner. Has anyone tried to record hand animations from tracked hands using Take Recorder before and got it running? Or has an idea what I am missing? Thanks for your help!

How to get Hand Joint Location from Hand tracking
Please point me in the right direction. Looking into the OVRPlugin code, I cannot find a way to get hand joint locations; it seems like Oculus doesn't provide us with that data. Is that correct? I can see that there is a root pose in the OVRPlugin.HandState struct, but that's it. What I'm trying to do: I want to transfer (and convert) Oculus hand tracking data into the Leap Motion data format. It will give me a better way to work with hand tracking data as well as let me re-use an old Leap codebase. This will also give me access to the Leap Motion Interaction Engine on the Oculus Quest.

Getting head and finger orientation
Hey gang - glad to see a great support forum for the Oculus Quest 2 with Unity. I'm migrating an app from OpenXR to Oculus to take advantage of the Oculus hand tracking. I'm experiencing issues with the data (or lack of it) and wondering if anyone can provide advice. I'm using our existing client/server framework to send data to the server. The data is being sent and received correctly, but the data is strange: the finger data never changes from the skeleton. During testing, I'm only sending the tip of the index finger on the right hand. The desired outcome is to get the amount of bend of each finger. The camera transform sends the correct data, and it's working great. We just can't figure out how to get the finger data to know the position of each finger.

void Update()
{
    // send the index finger data
    OVRPlugin.Skeleton skeleton;
    if (OVRPlugin.GetSkeleton(OVRPlugin.SkeletonType.HandRight, out skeleton))
        foreach (var bone in skeleton.Bones)
            if (bone.Id == OVRPlugin.BoneId.Hand_Index3)
                _streamClient.SendPacket(
                    StreamClient.CommandEnums.FingerRight0,
                    MapToByte(bone.Pose.Orientation.x),
                    MapToByte(bone.Pose.Orientation.y),
                    MapToByte(bone.Pose.Orientation.z));

    // send head data (pitch, yaw, roll)
    _streamClient.SendPacket(
        StreamClient.CommandEnums.pitchYawRoll,
        MapToByte(Camera.current.transform.localRotation.x),
        MapToByte(Camera.current.transform.localRotation.y),
        MapToByte(Camera.current.transform.localRotation.z));
}

Any assistance is greatly appreciated - thanks in advance!
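A reply-style note on the code in the last post (a sketch, not an official Meta answer): OVRPlugin.GetSkeleton returns the static bind-pose skeleton, which would explain why the finger data never changes. In the Oculus Integration, per-frame joint poses are exposed through the OVRHand/OVRSkeleton components, whose bone transforms follow the tracked hand every frame. The class name IndexTipReader and the serialized field below are hypothetical names for illustration; assumed from the Oculus Integration package are OVRSkeleton's IsInitialized property, its Bones list, and the OVRSkeleton.BoneId.Hand_IndexTip value.

```csharp
using UnityEngine;

// Hedged sketch: read the live index-fingertip pose from an OVRSkeleton
// component instead of OVRPlugin.GetSkeleton (which is only a bind pose).
public class IndexTipReader : MonoBehaviour
{
    // Assign the OVRSkeleton that sits on the right-hand OVRHand object
    // (e.g. under the OVRCameraRig hand anchor).
    [SerializeField] private OVRSkeleton rightHandSkeleton;

    void Update()
    {
        if (rightHandSkeleton == null || !rightHandSkeleton.IsInitialized)
            return;

        foreach (var bone in rightHandSkeleton.Bones)
        {
            if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
            {
                // Unlike the static skeleton, these transforms are updated
                // every frame from live hand tracking.
                Vector3 tipPosition = bone.Transform.position;
                Quaternion tipRotation = bone.Transform.rotation;
                Debug.Log($"Index tip: {tipPosition} / {tipRotation.eulerAngles}");
            }
        }
    }
}
```

One more caution on the head-data half of the original snippet: Camera.current is only valid inside certain rendering callbacks and is commonly null in Update; Camera.main (or the OVRCameraRig's CenterEyeAnchor transform) is the usual source for the head pose.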