Hand Tracking over Link still not working in Unity Editor
Hi, I have spent the last two years developing for the Quest 2 and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss: how is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused about how Meta expects developers to take full advantage of their (really good) hand tracking tech when there are constant inconsistencies and you end up fumbling around, trying seven different versions of all of the little SDKs, components, etc.

Can someone please advise me on how to set up a simple Unity scene using the standard Oculus Integration, where I can just click "Play" in the editor and get hand tracking working over my Link cable? So far I have gone through five different Unity versions from 2021 to 2023, even more Oculus Integration SDK versions (v50-v57), and three different headsets (Quest 1, 2 and 3). Nothing worked.

The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity. Three things. First, this option is no longer available in the later Oculus Integration versions. Second, selecting it explicitly disables building for the Quest 2 and 3, so you'd have to switch back and forth between the old LibOVR+VRAPI and the newer OpenXR integration just to get hand tracking working in the editor. Really? Third, we as developers cannot reasonably be expected to stick to this legacy API, as none of the newer mixed reality features, like scene understanding, spatial anchors, etc., are supported in the old version.

Hence I want to ask my question one more time: how does Meta expect people to develop for their platform?
Please let me know if you have an answer to this dilemma; I am grateful for any pointers! Note: I am explicitly talking about hand tracking through the Unity Editor using Link. In standalone Android builds it works fine and it's amazing to use! Best, Max
Full body tracking through Oculus runtime?

I am building a VR installation that demands full body tracking. Currently it's built with Unreal Engine and uses Vive Trackers and an IK system for the mocap. However, I'm hoping to incorporate face and hand tracking from the Quest Pro, in combination with the full body tracking, for even more immersion. This raises a lot of technical issues, and I was wondering if someone here could shed some light...

1. Is it possible for the Oculus runtime to access Vive Trackers, or is this strictly SteamVR? And if it is SteamVR, is it possible to have the two runtimes communicating with each other, or is this impossible?
2. If the Vive Trackers are impossible, what full body solutions exist for the Oculus runtime? Is it possible to use something like SlimeVR (I read that this was also SteamVR)? Would a mocap suit, like Perception Neuron, work? Is there another solution I'm missing?

In short: how the heck do you get full body capture, including hands and face, on the Quest?
Meta Avatars 2 SDK with Oculus Interaction SDK hands

Has anyone fully figured out the process for using the Oculus Interaction SDK synthetic hands as a data source for the Meta Avatar? There have been multiple posts centered on this, but no real solution yet. Here are a couple of screenshots showing what I'm working on. In the first screenshot, the avatar is mostly synced with the synthetic hand, because both use the hand tracking data and Hand positions, and the synthetic hand isn't interacting with anything. But as soon as the synthetic hand interacts with something, such as by touching a PokeInteractable, or uses a hand pose after grabbing an object, the avatar (because it doesn't respect the synthetic hand) clips through any buttons the user might be trying to interact with, or doesn't follow the hand pose, leading to an unnatural-looking grab. Has anyone actually figured out a complete solution for this?
[Feature Request] Integrate Tracked Keyboard with Meta Remote Desktop

I've been experimenting with the Tracked Keyboard (Logitech MX Keys Mini) and Workrooms recently and have noticed a few UX/usability issues. The tracked keyboard communicates directly with the Quest, which makes perfect sense if you plan to use it as a direct input to the Quest. However, in Workrooms, when you bring your PC (or Mac in my case) into the workroom, the keyboard does not control the PC. This is a major source of friction from a usability/UX perspective.

A possible solution, and hence the feature request, would be to integrate the tracked keyboard with Meta's Remote Desktop app, as the intention in this scenario is to control and directly input into the PC. This way the user sees the keyboard (passthrough), the keystrokes go directly to the PC, and Remote Desktop can pass keyboard events, positioning, etc. to the Quest. It does raise the issue of two modes of Tracked Keyboard (one direct to the Quest, the other to Remote Desktop); however, cognitively this is easier for a user to manage than having two separate keyboards on the same desk (one for the Quest, another for the PC), or the disconnect of a Tracked Keyboard that doesn't directly control the applications on the PC brought into Workrooms. I hope this helps.

rgds
Dave
OVRCameraRig not working

I wanted to use hand tracking in VR with an Oculus Quest 2. When I connect the headset using the XR Rig camera, it works fine, but not when I use OVRCameraRig: when I hit the Play button, nothing plays in my headset, yet there are no errors. Does anyone have any idea why the OVRCameraRig is not working? I updated to Unity 2021.1.23f and installed the XR Interaction Toolkit 1.0.0, XR Plugin Management 4.1.0, Oculus XR Plugin 1.10.0 and OpenXR Plugin 1.2.8 packages.
Pose Detection Joint Distance

Hi, is there any documentation on how to use the Joint Distance Active State in Oculus/Interaction/Runtime/Scripts/PoseDetection? It looks like I can use it to create poses involving both hands, but I'm looking for more details on how this works. I would also like to know if there are any docs/guides on how all of the other scripts under PoseDetection work. I have opened all the samples, but not all of the scripts are covered there. Thank you.
Hand Tracking Bone Position

I need to be able to access the hand tracking bone ID data and save it for offline analysis. I have seen that it can be retrieved from the hand information as an OVRSkeleton list, but I would like to extract that information to use later on. So, I need to be able to read and store all of the hand position data in a Unity scene... Has anyone been able to do it? I hope someone can help me out here... Thanks 🙂
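One way to approach this, sketched below under the assumption that an OVRSkeleton component is available on the tracked hand: iterate the skeleton's `Bones` list each frame and append each bone's ID and world position to a CSV file for offline analysis. The class and file names here are illustrative, not part of any official sample.

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Hedged sketch: logs every OVRSkeleton bone position once per frame to a CSV
// file under Application.persistentDataPath. Attach to any object in the scene
// and assign the hand's OVRSkeleton in the Inspector.
public class HandBoneLogger : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton; // assign in the Inspector
    private StreamWriter writer;

    private void Start()
    {
        // persistentDataPath is writable both in the editor and on the headset
        string path = Path.Combine(Application.persistentDataPath, "hand_bones.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("time,boneId,x,y,z");
    }

    private void Update()
    {
        // Skip frames where tracking data is not valid yet
        if (skeleton == null || !skeleton.IsDataValid) return;

        var sb = new StringBuilder();
        foreach (var bone in skeleton.Bones)
        {
            Vector3 p = bone.Transform.position;
            sb.AppendLine($"{Time.time},{bone.Id},{p.x},{p.y},{p.z}");
        }
        writer.Write(sb.ToString());
    }

    private void OnDestroy() => writer?.Dispose();
}
```

The resulting CSV can then be loaded in any analysis tool; logging per frame at 72+ Hz produces large files quickly, so consider sampling at a fixed interval if that matters.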
Sync issue with Oculus Integration hand pose and Meta Avatar?

Hi all, has anyone faced the issue where the Meta Avatar hands don't properly align with the hand pose from the Oculus Integration while grabbing? If anyone has a solution for this, please let me know.
Hand tracking update root scale not working

I'm trying to use hand tracking in my app, and no matter what, the hand scale stays at one, even with a friend's hands that are much smaller. After some investigation, and a lot of debugging, I found that the hand scale is calculated for the first frame of the application and sits at around 1.1 before it gets switched back to 1 forever. A "solution" I found is to switch off the Update Root Scale parameter on my hands, and I could then scale them based on this initial value, but according to the documentation, the root scale is supposed to be updated during runtime. (The documentation is pretty sparse on everything, though, and it never details how the scale is supposed to be measured.) Has anyone managed to get the root scale to update for their hand tracking? If yes, could you share some insight with me?
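The workaround described above (Update Root Scale unchecked, scale applied manually) can be sketched roughly as below. Reading the scale through `OVRPlugin.GetHandState` and its `HandScale` field is an assumption based on the plugin's public API; verify the member names against your Oculus Integration version.

```csharp
using UnityEngine;

// Hedged sketch of a manual root-scale workaround: leave "Update Root Scale"
// unchecked on OVRSkeleton and apply the runtime's hand-scale estimate to the
// hand root yourself, every frame. Attach to the hand root transform.
public class ManualHandScale : MonoBehaviour
{
    [SerializeField] private OVRPlugin.Hand hand = OVRPlugin.Hand.HandLeft;
    private OVRPlugin.HandState state = new OVRPlugin.HandState();

    private void Update()
    {
        // GetHandState returns false when no valid tracking data is available
        if (OVRPlugin.GetHandState(OVRPlugin.Step.Render, hand, ref state))
        {
            // HandScale is the runtime's estimate of the user's hand size
            // relative to the reference hand model (1.0 = reference size)
            transform.localScale = Vector3.one * state.HandScale;
        }
    }
}
```

If the value you read here is also stuck at 1, the problem is upstream in the runtime rather than in OVRSkeleton, which would at least narrow down where the bug lives.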
Hands Model controlled by controllers (no hand tracking)

I'm looking for instructions for the new Oculus Quest Meta XR SDK (All-in-One) on how to show a hand model instead of a controller model, for the classic game mode of controlling hands with a controller (press the grip and your hand clenches into a fist, etc.). YouTube is full of videos on how to do this with the XR Interaction Toolkit, but there is no similar instruction for the Meta XR SDK.
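The usual pattern behind those XR Interaction Toolkit videos translates to the Meta SDK roughly as follows: read the grip and trigger axes with `OVRInput` and feed them into Animator parameters on a rigged hand model whose animations blend between open and closed poses. The parameter names ("Grip", "Trigger") and the animator setup are assumptions for this sketch; the SDK does not ship this exact script.

```csharp
using UnityEngine;

// Hedged sketch: drives a rigged hand model from Touch controller input.
// The Animator is expected to have float parameters "Grip" and "Trigger"
// that blend open-hand and closed-hand animations (a common community setup).
public class ControllerHandAnimator : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.LTouch;
    [SerializeField] private Animator handAnimator; // animator on the hand model

    private void Update()
    {
        // Axis1D values run 0..1: squeezing the grip closes the fist,
        // squeezing the trigger curls the index finger
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller);
        handAnimator.SetFloat("Grip", grip);
        handAnimator.SetFloat("Trigger", trigger);
    }
}
```

Parent the hand model under the controller anchor of the camera rig so it follows the controller's pose, and disable the default controller model so only the hand is visible.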