How to move Player in Hand tracking
How do you move the player while using hand tracking on the Oculus Quest? Has anyone implemented player movement? I have tried moving the OVRCameraRig with the joystick in four directions (forward/backward/left/right). The OVRCameraRig moves perfectly, but both hand models do not move with it. I also tried parenting the "Hands" game object to the OVRCameraRig, but no luck. Does anyone have an idea?
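For what it's worth, one common approach is to translate the OVRCameraRig root itself: LeftHandAnchor and RightHandAnchor are children of the rig's TrackingSpace, so the tracked hand models follow automatically. A minimal sketch (the `RigMover` class name and `speed` value are my own, not from the SDK):

```csharp
using UnityEngine;

// Attach to the root of the OVRCameraRig. LeftHandAnchor and
// RightHandAnchor live under TrackingSpace, so moving this root
// moves the tracked hand models along with the camera.
public class RigMover : MonoBehaviour
{
    [SerializeField] float speed = 2f;

    void Update()
    {
        // Read the primary controller's thumbstick.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

        // Translate in the rig's local space; a fuller version would
        // rotate the movement vector by the head's yaw instead.
        Vector3 move = new Vector3(stick.x, 0f, stick.y);
        transform.Translate(move * speed * Time.deltaTime, Space.Self);
    }
}
```

If the hand models still stay behind, check that they are instantiated under the hand anchors (or at least under TrackingSpace) rather than at the scene root.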
[Hand Tracking] Way to get a hand position & rotation before build?

I use the new hand tracking feature and want to access the hands' transforms before building. I tried LeftHandAnchor/RightHandAnchor to get the position and rotation, but the problem is that with hand tracking on, the anchor gives me the wrist position of the hand, while with hand tracking off it gives the center of the controller. Also, the anchor's +Z axis is not "forward" (the direction the index finger points); instead, +X is. This makes it difficult to use the hand transform as a reference. There are many cases where you want the transform, such as adding a sphere GameObject with a collider as a hand's child for OVRGrabber, or setting up a VRIK target. It should be easy. If I could access a hand prefab with the hand mesh, I could use it as a reference while building the project, but I didn't find one in the latest package. Please let me know if anyone finds the hand prefab, or a way to get the transform before build. Thanks.
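Until a hand prefab turns up, one workaround is to read the wrist bone from OVRSkeleton at runtime. A sketch, assuming a configured OVRSkeleton component; the -90° yaw remap is my own guess at converting the +X-forward frame described above into a +Z-forward one, and may need a sign flip depending on the hand:

```csharp
using UnityEngine;

// Hypothetical helper that reads the wrist pose from OVRSkeleton
// and re-maps it so +Z points roughly along the index finger.
public class HandPoseReader : MonoBehaviour
{
    [SerializeField] OVRSkeleton skeleton; // the hand's skeleton component

    public bool TryGetWristPose(out Vector3 position, out Quaternion rotation)
    {
        position = default;
        rotation = default;
        if (skeleton == null || !skeleton.IsDataValid)
            return false;

        foreach (var bone in skeleton.Bones)
        {
            if (bone.Id == OVRSkeleton.BoneId.Hand_WristRoot)
            {
                position = bone.Transform.position;
                // Assumed remap: rotate the reported +X-forward frame
                // by -90 degrees about Y to get a +Z-forward rotation.
                rotation = bone.Transform.rotation * Quaternion.Euler(0f, -90f, 0f);
                return true;
            }
        }
        return false;
    }
}
```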
Switching between hand tracking and controller and using both at the same time

Hi, I have two questions; if anyone can help I would really appreciate it. 1) Is there a way to switch between hand tracking and controllers from a C# script in Unity at runtime? Right now the only way I can find to change the tracking mode is to go to OVRCameraRig -> Hand Tracking Support and set it there; I want to know how to do this from a script. 2) Is there a way to use hand tracking and controller tracking at the same time (e.g. the left hand uses a controller while the right hand uses hand tracking)? Even with the tracking mode set to both hand tracking and controller tracking, they cannot work simultaneously; only one tracking option works at a time. Thank you.
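As far as I know, the actual hands-vs-controllers switch is decided by the runtime (put the controllers down and raise your hands), so a script can detect the active input rather than force it. A sketch of that detection, assuming two OVRHand components wired up in the inspector:

```csharp
using UnityEngine;

// Polls which input method the Oculus runtime currently reports as active.
public class TrackingModeWatcher : MonoBehaviour
{
    [SerializeField] OVRHand leftHand;
    [SerializeField] OVRHand rightHand;

    void Update()
    {
        // True while the runtime routes input through tracked hands.
        bool usingHands =
            OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        // Per-hand state, useful if only one hand is being tracked.
        bool leftTracked = leftHand != null && leftHand.IsTracked;
        bool rightTracked = rightHand != null && rightHand.IsTracked;

        Debug.Log($"hands active: {usingHands}, L: {leftTracked}, R: {rightTracked}");
    }
}
```

This only observes the state; whether a mixed one-controller-one-hand mode is possible is up to the runtime, and I haven't seen it supported.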
Meta XR Plugin - Hand Tracking Testing in Editor

Hi, I downloaded the Meta XR plugin from https://developer.oculus.com/downloads/package/unreal-engine-5-integration/46.0/ and added it to a fresh Unreal Engine 5.0.3 project. I then created a very basic pawn with a camera, two motion controllers, and two Oculus XR hand components. I set everything up exactly as I would in 4.26/4.27 with the OculusVR plugin and the Oculus hand component. When I test in the editor with the VR preview, I cannot see my hands. The OculusVR plugin and the Oculus hand component made it easy to test in the editor without launching on the device. Is this no longer possible with the Meta XR plugin? Am I missing something obvious, or are there required extra steps? Thanks in advance, Simone
Oculus Link + Windows Build + Handtracking + Unity

Hello, I know that you can have hand tracking inside the Unity editor, but is it possible to: 1) build the app (for the Windows platform) with hand tracking support in the project, 2) connect the Quest via Oculus Link, and 3) run the app and have hand tracking? I have been testing and experimenting and it doesn't seem to work. What is interesting is that if I move my hand outside the Guardian area, it is detected. I guess it is not supported yet, or never will be. Does anyone know anything about this?
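A quick way to see whether any hand data reaches a desktop build over Link is to log OVRHand's tracking state every frame. A minimal sketch, assuming two OVRHand components assigned in the inspector:

```csharp
using UnityEngine;

// Logs hand-tracking state so you can tell whether data is arriving
// in a Windows build over Oculus Link.
public class HandTrackingProbe : MonoBehaviour
{
    [SerializeField] OVRHand leftHand;
    [SerializeField] OVRHand rightHand;

    void Update()
    {
        Debug.Log($"L tracked: {leftHand.IsTracked} ({leftHand.HandConfidence}), " +
                  $"R tracked: {rightHand.IsTracked} ({rightHand.HandConfidence})");
    }
}
```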
[Hand Tracking] FingerTipPokeTool Index does not follow my index

Hello everyone. I created a simple scene with hand tracking and added an "Interactable Tool Creator". It creates finger tip poke tools so I can interact with buttons, but the tip does not follow my index fingertip. Here is a video recorded on the Oculus Quest and a screenshot of my project scene. Video: https://imgur.com/a/snM3ms6
Cannot open HandsTrainSample because Engine modules are out of date.

I wanted to try out hand tracking with UE4, so I cloned the engine source from the Oculus GitHub and built the engine following the instructions on the Oculus website. I cloned 1e09d2f (the latest commit) from branch 4.25, which is release "oculus-4.25.3-release-1.51.0-v19.0", and built the engine using Visual Studio 2019. It worked fine: I could make a new project, and I could open the HandSample project, which works great with Oculus Quest & Link. But I couldn't open HandsTrainSample. I built the engine, clicked Debug/Start a new instance in Visual Studio, and the editor opened, but when I chose the sample I couldn't open it. Opening the .uproject directly just says "The following modules are missing or built with a different engine version: HandsTrainSample. Would you like to rebuild them now?", and if I choose Yes I get a Missing Modules error with the message "Engine modules are out of date, and cannot be compiled while the engine is running. Please build through your IDE." I guess I have to find and fix errors in VS, but I don't know where to get the .sln file for this project so I can see the errors there. I tried "Generate Visual Studio project files" on HandsTrainSample.uproject, but no .sln file was generated in the same directory, although I got no error. This is my first time building the engine from source and I might be misunderstanding something. I hope someone can help me with it.
OVRGrabber and OVRGrabbable for hand tracking instead of controllers

Hi guys, I'm trying to get the OVRGrabber and OVRGrabbable scripts working in Unity with hand tracking instead of the controllers, but it seems they were made for controllers only. Is there any example of how I can grab simple objects (cubes) by using or modifying these scripts? Any clue is welcome! :smile:
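One way around this, rather than modifying OVRGrabber itself, is a small pinch-driven grabber that uses `OVRHand.GetFingerIsPinching` as the "grip" signal. A rough sketch (`PinchGrabber` is hypothetical, not part of the SDK); it assumes a trigger collider on the same object so Unity's trigger messages can report grabbable candidates:

```csharp
using UnityEngine;

// Hypothetical pinch-based grabber: parents a rigidbody to the hand
// while the index finger is pinching, then releases it.
[RequireComponent(typeof(OVRHand))]
public class PinchGrabber : MonoBehaviour
{
    OVRHand hand;
    Rigidbody held;
    Rigidbody candidate;

    void Awake() { hand = GetComponent<OVRHand>(); }

    // Requires a trigger collider on this object and a collider
    // plus rigidbody on each grabbable cube.
    void OnTriggerEnter(Collider other)
    {
        var rb = other.attachedRigidbody;
        if (rb != null) candidate = rb;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.attachedRigidbody == candidate) candidate = null;
    }

    void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching && held == null && candidate != null)
        {
            held = candidate;
            held.isKinematic = true;           // stop physics while held
            held.transform.SetParent(transform);
        }
        else if (!pinching && held != null)
        {
            held.transform.SetParent(null);
            held.isKinematic = false;
            held = null;
        }
    }
}
```

`GetFingerPinchStrength` could replace the boolean check if you want a threshold you control.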
Skeleton physical colliders pose not updated during hand tracking

I ran the HandsInteractionTrainScene on Quest with Unity and found that the skeleton's physics colliders are not updated with the hand tracking pose. I only turned on Render Physics Capsules and enabled Hand Tracking Support on the OVRCameraRig. Does anyone know what's going on? You can see the footage in the attached zip file.
Hand Tracking Pointer Pose & Reset orientation and position [RESOLVED]

Hi, I am having some issues with the Pointer Pose and Reset Orientation and Position. Whenever I reset the orientation and position, the pointer pose is no longer aligned with my forward direction; it keeps pointing in the same direction as before calling Reset Orientation and Position. I have tried all sorts of workarounds and solutions, but no matter what, the pointer pose always points in the direction of the VR room space. Is this to be expected, or is there a solution? Thanks
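I'm not certain this is the intended behaviour, but one workaround I've seen suggested is to treat `OVRHand.PointerPose` as a tracking-space pose and convert it to world space through the rig's TrackingSpace transform yourself, so a recenter is taken into account. An untested sketch under that assumption:

```csharp
using UnityEngine;

// Hypothetical workaround: express the pointer pose in world space via
// the rig's TrackingSpace so it follows a recenter.
public class PointerPoseFix : MonoBehaviour
{
    [SerializeField] OVRHand hand;
    [SerializeField] Transform trackingSpace; // OVRCameraRig/TrackingSpace

    public bool TryGetWorldPointer(out Vector3 position, out Quaternion rotation)
    {
        position = default;
        rotation = default;
        if (hand == null || !hand.IsPointerPoseValid)
            return false;

        // Assumes PointerPose's local pose is reported in tracking space.
        Transform p = hand.PointerPose;
        position = trackingSpace.TransformPoint(p.localPosition);
        rotation = trackingSpace.rotation * p.localRotation;
        return true;
    }
}
```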