Handtracking in PC SDK
Hi. Is it possible to do hand tracking with the PC SDK? Reading through the documentation didn't help. We build simulators, and our product currently uses Leap Motion to do the tracking and interact with virtual buttons. From what I understand, this is only doable in Unity. Am I wrong? Can someone confirm whether this is doable outside of Unity/Unreal Engine? Thanks.
Hand Tracking + Windows Build?

I was wondering if anyone has been successful in getting hand tracking to work with a Unity build targeted for Windows? Considering you can use hand tracking with Oculus Link inside the Unity editor, surely this can be done for a build too...? The application I'm creating needs two users: one seated at the PC, controlling vehicles, and one using the headset, guiding the vehicles. It's important that hand tracking is on, as it allows the use of all fingers and gestures (unlike controllers). I know I could network this and sideload onto the Quest 2 to get the PC and the headset communicating over a server (multi-platform compatibility). However, this increases the scope of the project... Any suggestions?
Switching between hand tracking and controller and using both at the same time

Hi, I have two questions; if anyone can help, I would really appreciate it. 1- Is there a way to switch between hand tracking and controllers using a C# script in Unity at runtime? Right now, it seems the only way to switch tracking modes is to go to OVRCameraRig -> Hand Tracking Support and set the tracking mode there; I want to know how to do this from a script. 2- Is there a way to use hand tracking and controller tracking at the same time (e.g. the left hand uses a controller while the right hand uses hand tracking)? Right now, even if I set the tracking mode to both hand tracking and controller tracking, they cannot work simultaneously; only one tracking option works at a time. Thank you.
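A possible answer to question 1, not confirmed in the thread: with Hand Tracking Support set to "Controllers and Hands", the runtime itself decides which input is active (it switches when the user puts the controllers down or picks them up), and a script can observe that switch rather than force it. A minimal sketch, assuming an OVRCameraRig and OVRHand components assigned from the hand anchors; field names are placeholders, not SDK names:

```csharp
using UnityEngine;

// Sketch under assumptions: an OVRCameraRig with Hand Tracking Support
// set to "Controllers and Hands", and OVRHand references assigned in
// the inspector. Field names are placeholders.
public class InputModeWatcher : MonoBehaviour
{
    public OVRHand leftHand;   // e.g. LeftHandAnchor's OVRHand
    public OVRHand rightHand;  // e.g. RightHandAnchor's OVRHand

    void Update()
    {
        // The runtime switches the active input itself when the user
        // puts the controllers down or picks them up again.
        OVRInput.Controller active = OVRInput.GetActiveController();

        bool usingHands =
            active == OVRInput.Controller.Hands ||
            (leftHand != null && leftHand.IsTracked) ||
            (rightHand != null && rightHand.IsTracked);

        // React to the mode here: enable/disable your own controller
        // rigs versus hand interactors as appropriate.
        Debug.Log(usingHands ? "Hands active" : "Controllers active");
    }
}
```

As for question 2, the behaviour described in the post matches the runtime's design: hands and controllers are mutually exclusive input modes, so a polling approach like this can detect the current mode but cannot enable per-hand mixing.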
Hand Tracking on Oculus Quest 2 using Windows Unity Build (with Oculus Link)

Hi, I have found that Oculus allows using their hand tracking system over Oculus Link only in the Unity editor, but I would like to ask whether it is possible to use hand tracking in a Windows Unity build. Is this currently possible? Is it possible with a previous version of Oculus Integration? Is there any chance for me to use hand tracking in a Windows Unity build? Thanks! Best regards.
Meta XR Plugin - Hand Tracking Testing in Editor

Hi, I have downloaded the Meta XR plugin from this link https://developer.oculus.com/downloads/package/unreal-engine-5-integration/46.0/ and added it to a fresh Unreal Engine 5.0.3 project. I then created a very basic pawn with a camera, two motion controllers, and two Oculus XR hand components. I set up everything exactly the same way I would in 4.26/4.27 when using the OculusVR plugin and the Oculus hand component. When I test in the editor with the VR preview, I cannot see my hands. The OculusVR plugin and the Oculus hand component made it easy to test in the editor and did not require launching on the device. Is this no longer possible with the Meta XR plugin? Am I missing something obvious, or are there extra steps required? Thanks in advance, Simone
Unable to use the hand tracking menu function

I found a few posts asking about this, but none of them went anywhere or solved the issue. Your left tracked hand has a menu function that loads up on a small circle. Is there any way to actually use that in-app? Is there a callback or a button press to detect for that input? I would like to open a menu using that interface, but I cannot find an example anywhere of someone using it properly. I have gesture detection working for pinching and other things, but I would like to hook into the Meta interface to get a delayed input for opening the menu. Any help would be much appreciated. I am in Unity 2020 using the OVR v1.70 SDK.
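One thing worth checking (an assumption, not confirmed in the thread): OVRHand in the Oculus Integration exposes an IsSystemGestureInProgress property, which is true while the system (menu) pinch gesture is being performed on the hand that owns it. There is no public callback for the menu itself opening, so a script can only detect the gesture. A minimal polling sketch, with placeholder field names:

```csharp
using UnityEngine;

// Sketch under assumptions: OVRHand.IsSystemGestureInProgress reports
// the system (menu) pinch gesture. Field names are placeholders.
public class SystemGestureWatcher : MonoBehaviour
{
    public OVRHand leftHand; // assign the left hand's OVRHand

    private bool wasInProgress;

    void Update()
    {
        bool inProgress = leftHand != null && leftHand.IsSystemGestureInProgress;
        if (inProgress && !wasInProgress)
            Debug.Log("System gesture started");
        wasInProgress = inProgress;
    }
}
```

Detecting the rising edge this way would let you mirror the system's delayed-confirm behaviour with your own timer before opening an in-app menu.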
Oculus Integration hand grab not working

I added a Rigidbody, Box Collider, Hand Grab Interactable, Grabbable, Interactable Group View, and Interactable Debug Visual to a cube. When my hand reaches the cube and grabs it, the color of the cube changes, meaning the cube has detected the grab correctly. However, the cube cannot move. The newest Integration version has no Grabbable variable in HandGrabInteractable, so the Grabbable component is not referenced and seems useless. Probably something is wrong there. How can I fix this? (In the old version, you could drag the Grabbable component onto the corresponding variable in HandGrabInteractable, and the cube moved when grabbed.)
Handtracking app - true recenter API for Unity

Hi all! I'm making a hand-tracking game as a seated or standing experience. While standing, the user sometimes moves around a bit; over time the accumulated offset causes the game to act weird, for obvious reasons. While recentering with a controller is very easy and intuitive, doing so with hand tracking is pretty complex, so I perform an automatic recenter when an offset is detected. It works with Link but not on the Quest 2. Is there a way to do a recenter in Unity without moving all the game objects in the scene? In Unity, this line doesn't work on the Quest 2 (it does work with Link): OVRManager.display.RecenterPose(); With controllers it's OK, I guess, but for hand tracking I believe we need a better solution than disabling this option. Alternatively, is there a way to mark the experience as seated/standing and make this line work on the Quest 2? Many thanks!
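A common workaround when RecenterPose() is a no-op is to do the inverse of moving the scene: move the rig's root so the tracked head lands on a reference pose. A hedged sketch, not an SDK API; all field names are placeholders for your own OVRCameraRig hierarchy:

```csharp
using UnityEngine;

// Sketch under assumptions: rather than calling
// OVRManager.display.RecenterPose(), reposition the rig's root so the
// tracked head lands on a chosen reference transform.
public class ManualRecenter : MonoBehaviour
{
    public Transform rigRoot;    // parent of the OVRCameraRig
    public Transform centerEye;  // the CenterEyeAnchor
    public Transform target;     // where the head should end up

    public void Recenter()
    {
        // Remove the yaw difference first so the user faces the
        // target's forward direction.
        float yawDelta = target.eulerAngles.y - centerEye.eulerAngles.y;
        rigRoot.Rotate(0f, yawDelta, 0f);

        // Then translate the root so the head lands on the target,
        // keeping the tracked floor height (y) untouched.
        Vector3 offset = target.position - centerEye.position;
        offset.y = 0f;
        rigRoot.position += offset;
    }
}
```

Because only the rig root moves, the rest of the scene stays put, and the approach behaves identically over Link and on a standalone Quest 2 build.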
Activating/Deactivating Hand during runtime (Handtracking enabled)

Hey, is there a proper way to deactivate a hand and reactivate it later on? I tried deactivating the skinned mesh renderer of the hand prefab, but that seems to get re-enabled automatically again and again. Deactivating the whole GameObject works, but when I activate it later, the hand doesn't appear anymore.
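A workaround worth trying, based on the behaviour described above rather than documented SDK behaviour: since the renderer keeps getting re-enabled each frame, force the desired state in LateUpdate, which runs after the SDK's own Update. That hides the hand without deactivating the GameObject. Field names are placeholders:

```csharp
using UnityEngine;

// Sketch under an assumption drawn from the post: the hand's
// SkinnedMeshRenderer is re-enabled automatically each frame while
// the hand is tracked, so we overwrite it after every Update.
public class HandVisibilityToggle : MonoBehaviour
{
    public SkinnedMeshRenderer handRenderer; // the hand's mesh renderer
    public bool handVisible = true;

    void LateUpdate()
    {
        if (handRenderer != null && handRenderer.enabled != handVisible)
            handRenderer.enabled = handVisible;
    }
}
```

Toggling handVisible from other scripts then reliably shows or hides the hand while tracking stays active underneath.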
Oculus Link + Windows Build + Handtracking + Unity

Hello, I know that you can have hand tracking inside the Unity editor, but is it possible to: 1) build the app (just for the Windows platform) with hand tracking support in the project, 2) then connect the Quest via Oculus Link, and 3) run the app and have hand tracking? I have been testing and experimenting, and it doesn't seem to work. What is interesting is that if I move my hand outside the Guardian area, it detects it. I guess it is not supported yet, or never will be. Does anyone know something about this?