Hand tracking system gestures, possible solution?
Hi! I am currently developing for Unreal, and hand tracking would be perfect for my use case. The issue is that, of all the gestures available, the thumb-and-index pinch is by far the most accessible and the easiest for the headset to recognize accurately, yet that gesture is reserved for the system. Even if I close my fist first and then pinch, it is still sometimes recognized as a system gesture. Requesting access to system gesture behavior would be unreasonable for obvious reasons; apps and programs shouldn't have that kind of access to begin with. I'm not sure what the best solution would be, but I'm probably not alone in feeling a bit hamstrung: hand tracking is an awesome feature, yet I'm constantly being thrown to the menu while using it.

Some proposed solutions:

- Make it possible to change the behavior in the headset menu so you would have to hold the palm-up pinch for X amount of time before triggering the system menu (1, 2, 5 or 10 seconds, for instance), maybe with a ring closing around the Meta icon to indicate how far you are from opening the menu.
- Make it possible to change the gesture from a thumb-and-index pinch to something less commonly used, like a thumb-and-pinky pinch, or require both hands to perform the system pinch.
- Personally, I would love to be able to relegate the system menu to the power button, so that within an application a press of the power button brings up the system menu (and a long press switches off the headset).

Any feedback or possible workarounds would be much appreciated.

Best regards
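One partial workaround on the app side (shown here in Unity with the Oculus Integration; the same idea should carry over to Unreal's hand tracking API) is to ignore pinch input whenever the runtime reports that a system gesture is in progress, so the app at least doesn't fire its own pinch action at the same moment the menu opens. A minimal sketch, assuming an OVRHand component is available on the tracked hand; property names are from the Unity Oculus Integration and may differ between SDK versions:

```csharp
using UnityEngine;

// Gates app-side pinch input while the runtime is performing a system gesture.
// Assumes the Unity Oculus Integration's OVRHand component; adapt for Unreal as needed.
public class PinchInputGate : MonoBehaviour
{
    [SerializeField] private OVRHand hand;   // assign the left or right OVRHand

    // True only when the pinch should be treated as application input.
    public bool IsAppPinch()
    {
        if (hand == null || !hand.IsTracked || !hand.IsDataValid)
            return false;

        // While the palm-up system gesture is active, hand input belongs to the OS.
        if (hand.IsSystemGestureInProgress)
            return false;

        return hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
    }

    private void Update()
    {
        if (IsAppPinch())
        {
            // Trigger the app's own pinch action here (select, grab, etc.).
            Debug.Log("App pinch detected");
        }
    }
}
```

This doesn't stop the system menu from opening, but it avoids the double-trigger where the app reacts to the same pinch the OS just consumed.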
Hand Tracking over Link still not working in Unity Editor
Hi, I have spent the last two years developing for the Quest 2 and recently got the Quest 3. It's a great device and I'm super happy with it. There is just one big problem standing between me and total developer bliss: how is it possible that we still don't have a stable, robust hand tracking implementation across the whole development pipeline in 2023? I'm confused as to how Meta envisions developers taking full advantage of their (really good) hand tracking tech when there is constant inconsistency and fumbling around, trying seven different versions of all of the little SDKs, components, etc.

Can someone please advise me on how to set up a simple Unity scene using the standard Oculus Integration where I can just click "Play" in the editor and get hand tracking working over my Link cable? So far I have gone through five different Unity versions from 2021-2023, even more Oculus Integration SDK versions (v50-57), and three different headsets (Quest 1, 2 and 3). Nothing worked.

The only way I have managed to get hand tracking working in the editor via Link is to use a long-deprecated version of the Oculus Integration, where I had to manually select "Oculus -> Tools -> OVR Utilities Plugin -> Set OVRPlugin to Legacy LibOVR+VRAPI" from within Unity. Three things. First, this option is no longer available in the later Oculus Integration versions. Second, selecting it explicitly disables building for the Quest 2 and 3, so you'd have to switch back and forth between the old LibOVR+VRAPI and the newer OpenXR integration just to get hand tracking working in the editor. Really? Third, we as developers cannot reasonably be expected to stick to this legacy API, as all of the newer mixed reality features, like scene understanding, spatial anchors, etc., are not supported in the old version.

Hence I want to ask my question one more time: how does Meta expect people to develop for their platform? Please let me know if you have an answer to this dilemma, I am grateful for any pointers!

Note: I am explicitly talking about hand tracking through the Unity Editor using Link; in standalone Android builds it works fine and it's amazing to use!

Best, Max
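For anyone debugging this, a small runtime check can at least tell you whether any hand data is reaching the editor over Link before you start swapping SDK versions. A minimal diagnostic sketch (not a fix), assuming OVRHand components on the left and right hand anchors, hand tracking enabled on the headset, and "Hand Tracking Support" set to include hands on the OVRManager:

```csharp
using UnityEngine;

// Logs once per second whether hand tracking data is arriving in Play mode over Link.
public class HandTrackingLinkCheck : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;   // OVRHand on the LeftHandAnchor
    [SerializeField] private OVRHand rightHand;  // OVRHand on the RightHandAnchor

    private float nextLogTime;

    private void Update()
    {
        if (Time.time < nextLogTime) return;
        nextLogTime = Time.time + 1f;

        Log("Left", leftHand);
        Log("Right", rightHand);
    }

    private static void Log(string label, OVRHand hand)
    {
        if (hand == null)
        {
            Debug.Log($"{label}: no OVRHand assigned");
            return;
        }
        Debug.Log($"{label}: tracked={hand.IsTracked} valid={hand.IsDataValid} " +
                  $"confidence={hand.HandConfidence}");
    }
}
```

If both hands report tracked=false in the editor but true in a standalone build, the problem is in the Link/runtime path rather than the scene setup.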
Starting Oculus Link directly in PC Desktop View
I want to use Oculus/Quest Link directly for development and keep my headset on for some time. Since it keeps disconnecting for various reasons, I have to restart Quest Link quite often. Is there any way to start the Desktop View of my PC directly when Quest Link starts? It's very annoying to have to activate it manually every time. Also, why is there no hand tracking in Quest Link? You changed the name, now activate the features!
[Feature Request] Integrate Tracked Keyboard with Meta Remote Desktop
I've been experimenting with Tracked Keyboard (Logitech MX Keys Mini) and Workrooms recently and have noticed a few UX/usability issues. The tracked keyboard communicates directly with the Quest, which makes perfect sense if you plan to use it as a direct input while using the Quest. However, in Workrooms, when you bring your PC (or Mac in my case) into the workroom, the keyboard does not control the PC. This is a major source of friction from a usability/UX perspective.

A possible solution, and hence the feature request, would be to integrate the tracked keyboard with Meta's Remote Desktop app, as the intention in this scenario is to control and input directly into the PC. This way the user sees the keyboard (passthrough), the keystrokes are inputted directly to the PC, and Remote Desktop can pass keyboard events, positioning, etc. to the Quest. It does raise the issue of two modes of Tracked Keyboard (one direct to the Quest, the other to Remote Desktop), but cognitively this is easier for a user to manage than having two separate keyboards on the same desk (one for the Quest, another for the PC), or the disconnect of a Tracked Keyboard that doesn't directly control the applications on the PC brought into Workrooms.

I hope this helps.
rgds Dave
Hand Tracking Bone Position
I need to be able to access the hand tracking bone ID data and save it for offline analysis. I have seen that it can be retrieved from the hand information as an OVRSkeleton list, but I would like to extract that information to use later on. So I need to be able to read and store all of the hand position data in a Unity scene. Has anyone been able to do this? I hope someone can help me out here. Thanks 🙂
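One way to approach this is to read OVRSkeleton.Bones each frame and write the bone IDs and positions out to a CSV for offline analysis. A minimal sketch, assuming an OVRSkeleton component sits alongside the OVRHand on the hand anchor; the file name and CSV format are just placeholders:

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Samples hand bone positions every frame and writes them to a CSV on quit.
public class HandBoneRecorder : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton;               // OVRSkeleton on the hand anchor
    [SerializeField] private string fileName = "hand_bones.csv"; // hypothetical output file name

    private readonly StringBuilder csv = new StringBuilder("time,boneId,x,y,z\n");

    private void Update()
    {
        if (skeleton == null || !skeleton.IsInitialized) return;

        foreach (var bone in skeleton.Bones)
        {
            Vector3 p = bone.Transform.position;
            csv.AppendLine($"{Time.time:F4},{bone.Id},{p.x:F5},{p.y:F5},{p.z:F5}");
        }
    }

    private void OnApplicationQuit()
    {
        // persistentDataPath works both in the editor and on the headset.
        string path = Path.Combine(Application.persistentDataPath, fileName);
        File.WriteAllText(path, csv.ToString());
        Debug.Log($"Saved hand bone data to {path}");
    }
}
```

For long sessions you would want to flush to disk periodically instead of buffering everything in memory, but the sampling loop stays the same.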
Gesture Recognition AI
I've been working on a gesture recognition AI which can train any gesture (whether from your hand or a controller) in about 30 seconds. It's been tested extensively for VR, and it also works with Android (as well as any other motion data). There's a Unity plugin already built, but the main thing I'd really like is to see if anyone wants to use it with the new hand tracking functionality on Quest. This AI makes it really easy to create a controller-free "spellcasting" style game where different gestures cast different spells, or anything else you can think of! Of course it works with non-Quest headsets as well for controller movement.

Looking for 3-5 early adopters to get on board and save yourself a lot of time programming gestures! You can see more about it by googling "MiVRy" (can't post links here for some reason). If you'd like to try it out and/or become an early adopter, hit us up at support@marui-plugin.com.

Cheers all, have a good day.
Hands Model controlled by controllers (no hand tracking)
Looking for instructions for the new Oculus Quest Meta XR SDK (All-in-One) on how to add a hand model instead of a controller model, for the classic game mode where the hands are controlled by the controllers (press the grip and your hand clenches into a fist, etc.). YouTube is full of videos on how to do this with the XR Interaction Toolkit, but there is no similar instruction for the Meta XR SDK.
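I can't point to an official All-in-One SDK sample for this, but the usual pattern is to parent an animated hand model under each controller anchor and drive its Animator from the grip and trigger axes via OVRInput. A minimal sketch; the animator parameter names ("Flex", "Pinch") are assumptions and need to match whatever parameters your hand model's Animator Controller actually exposes:

```csharp
using UnityEngine;

// Drives a hand model's Animator from controller grip/trigger values
// so the hand closes into a fist when the grip is pressed.
[RequireComponent(typeof(Animator))]
public class ControllerHandAnimator : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;

    private Animator animator;

    private void Awake()
    {
        animator = GetComponent<Animator>();
    }

    private void Update()
    {
        // 0..1 values from the grip and index trigger of the chosen controller.
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        float pinch = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller);

        // Parameter names are placeholders; match them to your hand rig's animator.
        animator.SetFloat("Flex", grip);
        animator.SetFloat("Pinch", pinch);
    }
}
```

Attach this to the hand model under the LeftHandAnchor/RightHandAnchor of the camera rig and set the controller field accordingly; the rest is just authoring fist/pinch blend animations on the hand rig.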
Meta Avatars 2 SDK with Oculus Interaction SDK hands
Has anyone fully figured out the process for using the Oculus Interaction SDK synthetic hands as a data source for the Meta Avatar? There have been multiple posts centered around this, but no real solution yet.

Here are a couple of screenshots showing what I'm working on. In the first, the avatar is mostly synced with the synthetic hand, because both use hand tracking and the hand positions, and the synthetic hand isn't interacting with anything. But as soon as the synthetic hand is interacting with something, such as by touching a PokeInteractable or using a hand pose after grabbing an object, the avatar (because it doesn't respect the synthetic hand) clips through any buttons the user might be trying to interact with, or doesn't follow the hand pose, leading to an unnatural-looking grab.

Has anyone actually figured out a complete solution for this?
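I haven't seen an official hook documented for this either, so treat the following as a workaround sketch rather than the Avatar SDK's intended path: instead of letting the avatar read raw hand tracking, retarget the synthetic hand's joint transforms onto whatever hand rig you end up feeding the avatar from, in LateUpdate, after the Interaction SDK has applied its grab/poke pose overrides. Everything here uses plain Unity transforms; the joint arrays are assumptions you would populate from your own hierarchies:

```csharp
using UnityEngine;

// Copies the pose of a synthetic hand's joints onto a target hand rig each LateUpdate,
// after the Interaction SDK has applied grab/poke pose overrides.
// Generic retargeting sketch, not an official Meta Avatars SDK integration.
public class SyntheticHandRetargeter : MonoBehaviour
{
    [SerializeField] private Transform sourceWrist;      // synthetic hand wrist
    [SerializeField] private Transform targetWrist;      // wrist of the rig driving the avatar
    [SerializeField] private Transform[] sourceJoints;   // synthetic hand finger joints, in order
    [SerializeField] private Transform[] targetJoints;   // matching joints on the target rig

    private void LateUpdate()
    {
        if (sourceWrist == null || targetWrist == null) return;

        // Follow the synthetic hand's (possibly constrained) wrist pose,
        // not the raw tracked wrist, so the avatar respects interaction overrides.
        targetWrist.SetPositionAndRotation(sourceWrist.position, sourceWrist.rotation);

        // Copy finger joint rotations so grab/poke poses carry over.
        int count = Mathf.Min(sourceJoints.Length, targetJoints.Length);
        for (int i = 0; i < count; i++)
        {
            targetJoints[i].localRotation = sourceJoints[i].localRotation;
        }
    }
}
```

Whether this closes the gap fully depends on how your avatar consumes hand data; it only guarantees that the pose you forward is the constrained synthetic pose instead of the raw tracking pose.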
Sync issue with Oculus Integration hand pose and Meta Avatar?
Hi all, has anyone faced the issue where the Oculus Avatar hands do not properly align with the hand pose from the Oculus Integration while grabbing? If anyone has a solution for this, please let me know.