Adding ray pointer from Oculus Hands to interact with UI
I am trying to add a ray pointer from Oculus Hands to interact with UI. I managed to get the ray as shown in the screen capture below. However, as you can see, the laser pointer starts at the wrist and is angled to the right instead of pointing straight ahead. I would appreciate any suggestions on how to make it point forward. Also, this laser pointer appears only on the right hand. Is there a way to change this? Please see my Unity settings below.

Unity - Hand tracking disappears forever when adding OVRGrabber
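Not an official fix, but one thing worth trying: the Oculus Integration's OVRHand component exposes a system-provided pointer pose intended specifically for UI rays, which is already oriented forward rather than along the wrist. A minimal sketch (the component and field names here are mine) that drives a LineRenderer from that pose:

```csharp
using UnityEngine;

// Hypothetical helper: drives a laser from OVRHand's system pointer pose
// rather than from the hand anchor/wrist transform.
public class HandRayPointer : MonoBehaviour
{
    public OVRHand hand;          // assign the OVRHandPrefab instance
    public LineRenderer laser;    // assign a 2-point LineRenderer
    public float maxLength = 5f;

    void Update()
    {
        // PointerPose is only meaningful while the system reports it valid.
        bool show = hand.IsTracked && hand.IsPointerPoseValid;
        laser.enabled = show;
        if (!show) return;

        Transform pose = hand.PointerPose;
        laser.SetPosition(0, pose.position);
        laser.SetPosition(1, pose.position + pose.forward * maxLength);
    }
}
```

Attaching one of these per hand, each with its own OVRHand reference, should also cover the left hand, since the left OVRHandPrefab publishes its own pointer pose.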
So I was trying to implement a grabber with hand tracking, but my hands kept disappearing at some point during development. Then I realized that when I added OVRGrabber to the hands, they disappear from the scene, although they are still there, because I can touch the Guardian boundary with them. Even after deleting OVRGrabber, the hands stayed invisible. I'm sure OVRGrabber is causing this, because I was building step by step to catch where the hands disappeared. By the way, I'm using just the OVRCameraRig prefab, the HandsManager, and the OVRHandPrefab. EDIT: I figured out that you can solve this by deleting the OVRHandPrefab and dragging it under the controller anchor again, and that fixes it... but I don't know why you can't put OVRGrabber on the OVRHandPrefab.

Hand Tracking with Oculus Link
So, I know you're able to use hand tracking in the Unity editor via Oculus Link, but how come you're not able to use hand tracking in a build targeted for Windows? It's clearly possible, considering you can have hand tracking in the editor; surely this is not something Oculus would want to keep from developers? I'm currently working on a project where one user is on the PC while another is on the Oculus Quest 2 (to save networking), and they can interact with each other. Hand tracking is also needed in this project, and I'm not sure there's any way around it other than networking a sideloaded application on the Quest 2... Any suggestions?

Unity Custom Mesh Oculus Quest Hand Tracking
Hi everyone! For days I've been looking for documentation on replacing the default Oculus hand model with a custom rigged one. All I can get is a weird result: fingers bending in the wrong direction, bending with the wrong magnitude, or, even worse, not bending at all. Has anyone already solved this problem, and do you have any tips to share? Thanks

Hand Tracking Teleport
Hi, when using hand tracking to select an object, similar to selecting the windmills in the train example, is there a way to make the selection point larger? I.e. if I have a huge game object to select, I have to aim my raycast cursor at the middle of the object to select it, not just anywhere on the collider. Is there an easy way to set this up? Thank you

Is it possible to switch fingers while using Oculus Quest hand tracking in Unity?
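One common way to make selection more forgiving (not specific to the Oculus samples) is to replace a thin Physics.Raycast with a Physics.SphereCast, which effectively gives the cursor a radius. A rough sketch, with made-up class and field names:

```csharp
using UnityEngine;

// Hypothetical selector: a sphere cast gives the ray a "thickness",
// so large objects can be hit without aiming at their exact centre.
public class FatRaySelector : MonoBehaviour
{
    public Transform rayOrigin;       // e.g. the hand's pointer pose
    public float cursorRadius = 0.15f;
    public float maxDistance = 20f;

    public bool TrySelect(out RaycastHit hit)
    {
        // Sweeps a sphere of cursorRadius along the ray direction.
        return Physics.SphereCast(rayOrigin.position, cursorRadius,
                                  rayOrigin.forward, out hit, maxDistance);
    }
}
```

The other common trick is simply enlarging the target's collider relative to its visible mesh; both approaches can be combined.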
Hi everyone! I am developing an application for the Oculus Quest using Unity and the official SDK. One of the things I want to do in the application is trick the user a bit by switching finger correspondence. For example, when they move their index finger, the virtual model moves the middle finger in exactly the same way. Can I do that? I have tried switching the bones in the hand tracking prefab, but it just defaults to the regular ones when I run the application. I also tried reading through the scripts to see if I could change something there, but nothing looked obvious (maybe I missed it?). My last resort would be to change things in LateUpdate, but that is not what I am aiming for. Is there another solution? Thank you very much!

Oculus Link + Hand Tracking limitations?
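Since OVRSkeleton applies the tracked pose during its own update, the LateUpdate route mentioned above is probably the least invasive option, even if it wasn't the preferred one: after the SDK has posed the hand for the frame, swap the local rotations of the two finger chains. A hedged sketch, assuming you fill in the correct bone indices for your rig:

```csharp
using UnityEngine;

// Hypothetical sketch: after OVRSkeleton has posed the hand this frame,
// swap the local rotations of two finger chains in LateUpdate.
public class FingerSwapper : MonoBehaviour
{
    public OVRSkeleton skeleton;  // the hand's OVRSkeleton
    // Indices into skeleton.Bones for the proximal/middle/distal joints
    // of the two fingers being swapped (fill in for your skeleton).
    public int[] indexJoints;
    public int[] middleJoints;

    void LateUpdate()
    {
        if (!skeleton.IsDataValid) return;
        for (int i = 0; i < indexJoints.Length; i++)
        {
            Transform a = skeleton.Bones[indexJoints[i]].Transform;
            Transform b = skeleton.Bones[middleJoints[i]].Transform;
            Quaternion tmp = a.localRotation;
            a.localRotation = b.localRotation;
            b.localRotation = tmp;
        }
    }
}
```

This leaves the SDK's own data untouched, so pinch detection and confidence values still refer to the real fingers; only the rendered pose is swapped.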
Are there any hardware (or other) limitations that would prevent hand tracking from working over Oculus Link in the Unity editor? I've been using it for a month or two now, but a teammate of mine with a very similar machine and setup is unable to make it work. He's had zero issues using Link aside from hand tracking, and troubleshooting it is becoming very counterproductive. Both of us are working on the same project, using the same headset firmware and the same Oculus software version. I suspect the only actual difference is the cable being used. He's had no other issues using Unity + Link aside from this, so I'm curious whether hand tracking (in Link or the Unity editor) is somehow limited to the official Oculus Link Cable, or whether there's a similar limitation that isn't mentioned in any tutorials.

Hands/controllers showing up but not moving
So I have hand tracking basically working in the main menu of my app: the hands show up, track nicely, and I can switch back and forth between hands and controllers. Transitioning into a level works as well, and I can switch between hands and controllers in the level. However, when I go back to the main menu, things break for some reason. The loading icon stays visible even though it has been deactivated in the hierarchy, and although I can still switch between hands and controllers and the fingers still track, neither the hands nor the controllers will move from their spots. The controllers show up where I last placed them, and the hands show up in a strange horizontal position, one on top of the other with the palms facing each other; sometimes in other positions or locations, though. Has anyone seen this problem before? I am using Unity version 2017.4.17f1, with what I believe is the newest version of the Oculus SDK.

Hand tracking with custom model - fingers bending the wrong way
Hi there! For the last couple of days I've been trying to figure out how to apply the Quest's hand tracking to a custom model. The OVRCustomSkeleton script seems to work great, but only for the right hand. The left-hand bones are inverted, so the fingers move up when they should move down and left when they should go right, which looks rather unpleasant. https://imgur.com/a/ZYV9iCb As far as I understand, OVRCustomSkeleton.cs maps the custom bones, while OVRHand.cs is responsible for supplying the bone rotations. I'm pretty sure lines 164 - 177 are where I should be looking, but this is where I get stuck; every attempt results in either nothing changing or everything breaking.

    OVRSkeleton.SkeletonPoseData OVRSkeleton.IOVRSkeletonDataProvider.GetSkeletonPoseData()
    {
        var data = new OVRSkeleton.SkeletonPoseData();
        data.IsDataValid = IsDataValid;
        if (IsDataValid)
        {
            data.RootPose = _handState.RootPose;
            data.RootScale = _handState.HandScale;
            data.BoneRotations = _handState.BoneRotations;
            data.IsDataHighConfidence = IsTracked && HandConfidence == TrackingConfidence.High;
        }
        return data;
    }

Any help would be much appreciated!

Quest and Unity built-in XR vs XR Plugin - detecting Quest hands as XRDevice
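Rather than editing OVRHand.cs itself, one diagnostic worth running is to mirror the left hand's bone rotations after the skeleton has been posed, to check whether the symptom is purely an axis-convention mismatch between the SDK's left-hand skeleton and the custom rig. This is an experiment, not a known fix; which quaternion components to negate depends entirely on your rig's bone axes:

```csharp
using UnityEngine;

// Experimental: mirror each bone's local rotation on the left hand to test
// whether the wrong-way bending is an axis-convention mismatch.
// The (x, -y, -z, w) pattern is only a starting point to experiment with;
// try other sign combinations until the motion looks correct for your rig.
public class LeftHandMirrorTest : MonoBehaviour
{
    public OVRCustomSkeleton skeleton;  // the left hand's custom skeleton

    void LateUpdate()
    {
        foreach (var bone in skeleton.Bones)
        {
            Quaternion q = bone.Transform.localRotation;
            bone.Transform.localRotation = new Quaternion(q.x, -q.y, -q.z, q.w);
        }
    }
}
```

If one sign combination fixes the bending, the cleaner long-term fix is usually to re-export the left-hand mesh with bone axes matching the SDK's convention rather than correcting at runtime.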
Hi! I'm currently looking to update my project from the Unity built-in XR system to the new XR plugin system. I'm using Unity 2019.3.13f1 with all the latest respective packages (as of this week). With the built-in XR system, I can use the UnityEngine.XR.InputDevices.deviceConnected / deviceDisconnected callbacks, which successfully indicate when a Touch device connects or disconnects. They also respond with a device when a user's hand is detected; the device is still named "Oculus Quest Controller - (left/right)", but I can then query OVRPlugin to see whether the device is a Touch controller or a hand using OVRPlugin.GetControllers(). After switching to the XR plugin system, I've found that I get callbacks for the Quest Touch controllers, but the callbacks don't fire for hands. In fact, there's no XRDevice for the hands as far as Unity is concerned; UnityEngine.XR.InputDevices.GetDevices(...) only shows "Oculus Quest" (the HMD) as a device. Is this expected behaviour, or have I missed a setting or config somewhere that should make the hands show up as a Unity XRDevice? As far as I can tell, everything else is the same as my (previously working) built-in XR version, i.e. the manifest is the same, etc. Any help you good folks can provide would be awesome.
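I don't know whether the XR plugin system is supposed to surface hands as XRDevices at all, but as a workaround you can bypass the device callbacks and poll the Oculus Integration directly, synthesizing your own connect/disconnect events from OVRHand's tracking state. A sketch with invented class and event names:

```csharp
using System;
using UnityEngine;

// Workaround sketch: poll OVRHand.IsTracked each frame and raise our own
// (hypothetical) events, instead of relying on XR-plugin device callbacks.
public class HandPresenceWatcher : MonoBehaviour
{
    public OVRHand leftHand;
    public OVRHand rightHand;

    public event Action<OVRHand> HandDetected;
    public event Action<OVRHand> HandLost;

    bool _leftWasTracked, _rightWasTracked;

    void Update()
    {
        Check(leftHand, ref _leftWasTracked);
        Check(rightHand, ref _rightWasTracked);
    }

    void Check(OVRHand hand, ref bool wasTracked)
    {
        bool tracked = hand != null && hand.IsTracked;
        if (tracked == wasTracked) return;  // no state change this frame
        wasTracked = tracked;
        (tracked ? HandDetected : HandLost)?.Invoke(hand);
    }
}
```

This keeps the rest of the project on the XR plugin path while hand presence comes straight from the Oculus runtime, so nothing breaks if a later plugin version does start reporting hands as devices.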