Adding ray pointer from Oculus Hands to interact with UI
I am trying to add a ray pointer from Oculus Hands to interact with UI. I managed to get the ray, as shown in the screen capture below. However, as you can see, the laser pointer starts from the wrist position and is aligned towards the right instead of pointing straight ahead. I'd appreciate any suggestions on how I can correct this so it points forward. Also, this laser pointer appears only on the right hand. Is there a way to change this? Please see my settings in Unity below.

Unity - Hand tracking disappears forever when adding OVRGrabber
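For the ray-pointer question above, one thing worth trying is driving the beam from OVRHand's PointerPose (the SDK's system pointer, which is already aimed forward from the hand) instead of from the wrist/root transform, which is a common cause of the sideways-from-the-wrist beam. A minimal sketch, assuming the Oculus Integration package; the class and field names (HandRayPointer, hand, laser, maxLength) are illustrative, not part of the SDK:

```csharp
using UnityEngine;

// Sketch: drive a LineRenderer from OVRHand's pointer pose instead of the wrist.
public class HandRayPointer : MonoBehaviour
{
    public OVRHand hand;          // the OVRHand component on OVRHandPrefab
    public LineRenderer laser;    // a 2-point LineRenderer for the beam
    public float maxLength = 3f;

    void Update()
    {
        // PointerPose is the runtime's pointing ray for this hand,
        // oriented "forward" rather than along the wrist bone.
        bool valid = hand != null && hand.IsTracked && hand.IsPointerPoseValid;
        laser.enabled = valid;
        if (!valid) return;

        Transform p = hand.PointerPose;
        laser.SetPosition(0, p.position);
        laser.SetPosition(1, p.position + p.forward * maxLength);
    }
}
```

Attaching one of these per hand (with its own OVRHand reference) would also give a beam on the left hand, not just the right.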
So I was trying to implement a grabber with hand tracking, but my hands were disappearing at some point during development. Then I realized that when I added the OVRGrabber to the hands, they disappear from the scene, but they are still there, because I can touch the Guardian system with my hands. Even after deleting the OVRGrabber, the hands stayed invisible. I'm sure the OVRGrabber is causing this, because I was rebuilding step by step to catch where the hands disappeared. By the way, I'm using just the OVRCameraRig prefab, the HandsManager, and the OVRHandPrefab.

EDIT: I figured out that you can solve this problem by deleting the OVRHandPrefab and dragging it back under the controller anchor, and that fixes it... but I don't know why you can't put the OVRGrabber on the OVRHandPrefab...

Hand Tracking with Oculus Link
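Not a confirmed explanation for the disappearing hands above, but OVRGrabber does expect a specific physics setup on the object it lives on, and a non-kinematic Rigidbody can let physics move the hand object out of view. A sketch of a setup check, with illustrative names (HandGrabSetup and the collider radius are assumptions, not SDK values):

```csharp
using UnityEngine;

// Sketch of the physics setup OVRGrabber generally expects on a hand object.
public class HandGrabSetup : MonoBehaviour
{
    void Awake()
    {
        // OVRGrabber moves via a kinematic Rigidbody; a dynamic one with
        // gravity can fling the hand object away, making it seem to vanish.
        var rb = gameObject.GetComponent<Rigidbody>();
        if (rb == null) rb = gameObject.AddComponent<Rigidbody>();
        rb.isKinematic = true;
        rb.useGravity = false;

        // A trigger collider serves as the grab volume.
        var col = gameObject.GetComponent<SphereCollider>();
        if (col == null) col = gameObject.AddComponent<SphereCollider>();
        col.isTrigger = true;
        col.radius = 0.05f;
    }
}
```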
So, I know you're able to use hand tracking in the Unity editor via Oculus Link, but how come you're not able to use hand tracking with a build targeted for Windows? It's clearly possible, considering you can have hand tracking in the editor; surely this is not something Oculus would not want developers to be able to utilise? I'm currently working on a project where one user is on the PC while another is using the Oculus Quest 2 (to save networking), and they can interact with each other. Hand tracking is also needed in this project, and I'm not sure there's any way around it other than networking a sideloaded application on the Quest 2... Any suggestions?

Unity Custom Mesh Oculus Quest Hand Tracking
Hi everyone! I've spent days looking for documentation on replacing the default Oculus hand model with a custom rigged one. All I can get is weird results: fingers bending in the wrong direction, bending with the wrong magnitude, or, even worse, not bending at all. Has anyone already solved this problem, and do you have any tips to share? Thanks

Hand Tracking Teleport
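For the custom-mesh question above, wrong-direction or wrong-magnitude bending usually means the custom rig's bone axes differ from the Oculus skeleton's, so a per-rig correction rotation is needed when copying poses. A sketch of retargeting OVRSkeleton rotations onto a custom rig; the bone pairing and the axisCorrection value are assumptions you must fill in for your own model:

```csharp
using UnityEngine;

// Sketch: retarget OVRSkeleton bone rotations onto a custom rigged hand.
public class CustomHandRetarget : MonoBehaviour
{
    public OVRSkeleton skeleton;       // on the OVRHandPrefab
    public Transform[] customBones;    // your rig, in the same order as skeleton.Bones
    public Quaternion axisCorrection = Quaternion.identity; // per-rig axis fix-up

    void LateUpdate()
    {
        if (skeleton == null || !skeleton.IsDataValid) return;

        var bones = skeleton.Bones;
        int n = Mathf.Min(bones.Count, customBones.Length);
        for (int i = 0; i < n; i++)
        {
            // Copy local rotation, then correct for differing bone axes;
            // a wrong axisCorrection is what makes fingers bend "backwards".
            customBones[i].localRotation = bones[i].Transform.localRotation * axisCorrection;
        }
    }
}
```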
Hi, when using hand tracking to select an object, similar to selecting the windmills in the train example, is there a way to make the selection point larger? I.e., if I have a huge game object to select, I have to aim my raycast cursor at the middle of the object to be able to select it; just hitting the collider isn't enough. Any easy way to set this up? Thank you

Is it possible to switch fingers while using Oculus Quest hand tracking in Unity?
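For the selection-point question above, one common approach is to swap the plain Physics.Raycast for a Physics.SphereCast, which sweeps a sphere along the ray so glancing hits near a collider's edge still register. A minimal sketch; the class name, radius, and layer mask are illustrative choices:

```csharp
using UnityEngine;

// Sketch: a "fat" selection cursor via SphereCast instead of Raycast.
public class FatRaySelector : MonoBehaviour
{
    public Transform pointer;         // e.g. the hand's pointer transform
    public float radius = 0.15f;      // selection "thickness" in meters
    public float maxDistance = 10f;
    public LayerMask selectableMask = ~0;

    public GameObject CurrentTarget { get; private set; }

    void Update()
    {
        CurrentTarget = null;
        if (pointer == null) return;

        RaycastHit hit;
        // SphereCast widens the ray, so large objects select without
        // having to aim at their exact center.
        if (Physics.SphereCast(pointer.position, radius, pointer.forward,
                               out hit, maxDistance, selectableMask))
        {
            CurrentTarget = hit.collider.gameObject;
        }
    }
}
```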
Hi everyone! I am developing an application for the Oculus Quest using Unity and the official SDK. One of the things I want to do in the application is trick the user a bit by switching finger correspondence. For example, when they move their index finger, the virtual model moves the middle finger in exactly the same way. Can I do that? I have tried switching the bones in the Hand Tracking prefab, but it just defaults to the regular ones when I run the application. I also tried reading through the scripts to see if I could change something there, but nothing looked obvious (maybe I missed it?). My last resort would be to change things through LateUpdate, but that is not what I am aiming for. Is there another solution? Thank you very much!

Oculus Link + Hand Tracking limitations?
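For the finger-switching question above: editor-side bone swaps get overwritten because the SDK rewrites the pose from tracking data every frame, which is why the LateUpdate route the post mentions is the one that tends to stick. A sketch of that approach, swapping index and middle proximal/intermediate/distal rotations after OVRSkeleton has updated (class name FingerSwapper is illustrative):

```csharp
using UnityEngine;

// Sketch: after OVRSkeleton writes the tracked pose, swap index and middle
// finger rotations so the virtual hand mirrors the "wrong" finger.
public class FingerSwapper : MonoBehaviour
{
    public OVRSkeleton skeleton;

    static readonly OVRSkeleton.BoneId[] IndexBones =
        { OVRSkeleton.BoneId.Hand_Index1, OVRSkeleton.BoneId.Hand_Index2, OVRSkeleton.BoneId.Hand_Index3 };
    static readonly OVRSkeleton.BoneId[] MiddleBones =
        { OVRSkeleton.BoneId.Hand_Middle1, OVRSkeleton.BoneId.Hand_Middle2, OVRSkeleton.BoneId.Hand_Middle3 };

    void LateUpdate()   // runs after the skeleton's Update each frame
    {
        if (skeleton == null || !skeleton.IsDataValid) return;

        for (int i = 0; i < IndexBones.Length; i++)
        {
            Transform a = Find(IndexBones[i]);
            Transform b = Find(MiddleBones[i]);
            if (a == null || b == null) continue;

            Quaternion tmp = a.localRotation;
            a.localRotation = b.localRotation;
            b.localRotation = tmp;
        }
    }

    Transform Find(OVRSkeleton.BoneId id)
    {
        foreach (var bone in skeleton.Bones)
            if (bone.Id == id) return bone.Transform;
        return null;
    }
}
```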
Are there any hardware (or other) limitations that would prevent hand tracking from working over Oculus Link in the Unity editor? I've been using it for a month or two now, but a teammate of mine with a very similar machine/setup is unable to make it work. He's had zero issues using Link aside from hand tracking, and trying to troubleshoot it is becoming very counterproductive. Both of us are working on the same project, using the same headset firmware and the same Oculus software version. I suspect the only actual difference is the cable being used. He's had no other issues using Unity + Link aside from this, so I'm curious whether hand tracking (in Link or the Unity editor) is somehow limited to the official Oculus Link cable, or whether there's a similar limitation that isn't mentioned in any tutorials.

Hands/controllers showing up but not moving
So I have hand tracking basically working in the main menu of my app: the hands show up, track nicely, and I can switch back and forth between hands and controllers in the main menu. Transitioning into a level works as well, and I can switch between hands and controllers in the level. However, when I go back to the main menu, things break for some reason. The loading icon stays visible even though it has been deactivated in the hierarchy, and although I can still switch between hands and controllers and the fingers still track, neither the hands nor the controllers will move from their spots. The controllers show up where I placed them last, and the hands show up in a strange horizontal position, one on top of the other with palms facing each other, though sometimes in other positions/locations. Has anyone seen this problem before? I am using Unity version 2017.4.17f1 with, I believe, the newest version of the Oculus SDK.

Front Facing Cameras and Hand Tracking
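Not a confirmed diagnosis for the frozen hands/controllers above, but one pattern that produces exactly this symptom is hand/controller objects surviving a scene transition while the OVRCameraRig (and its tracking anchors) did not, so nothing updates their transforms anymore even though finger tracking still animates. A sketch that re-parents them under the current rig's anchors on every scene load; the class and field names are illustrative:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: re-attach persistent hand objects to the active rig's anchors
// after each scene load, so tracking drives them again.
public class ReattachHands : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;

    void OnEnable()  { SceneManager.sceneLoaded += OnSceneLoaded; }
    void OnDisable() { SceneManager.sceneLoaded -= OnSceneLoaded; }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        var rig = FindObjectOfType<OVRCameraRig>();
        if (rig == null) return;

        // Keep local pose (false) so the hands sit at the anchor origin.
        if (leftHand != null)  leftHand.SetParent(rig.leftHandAnchor, false);
        if (rightHand != null) rightHand.SetParent(rig.rightHandAnchor, false);
    }
}
```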
Hi, I have a few questions about the front-facing cameras (FFC) and hand tracking. Our app is made for the Oculus Quest, and we are using Unity to develop it. The core question is: can we access the FFC images to improve the hand tracking, or to implement our own hand tracking entirely? The hand tracking that comes with the Unity plugin is not cutting it for us, and we have had to create workarounds. My questions:

1) When the hands go on top of each other and the cameras lose sight of them, they end up with the wrong transform.position. How can I access the hand-tracking data from inside Unity to see what is happening just before this occurs? I can't seem to find the right place inside OVR to get hand-tracking data other than controller data. What points do the front-facing cameras map, and where can I access them?

2) Can we access the FFCs in Unity? This was fairly easy on the HTC Vive Pro, but I cannot find any way to grab the images from the front-facing cameras. I would like to get the OVR data to implement my own code for hand tracking, environment mapping, and finding the QR codes in our test area.

I would really appreciate it if you could point me in the right direction or answer me directly.

Best Regards
Jagi
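On question 1 above: as far as the public Unity plugin goes, the raw camera images are not exposed; what is exposed is the tracked result on OVRHand (overall and per-finger confidence) and OVRSkeleton (bone transforms). Logging confidence each frame shows when the tracker is about to lose the hands, just before the bad transform.position appears. A sketch, with an illustrative class name:

```csharp
using UnityEngine;

// Sketch: probe the hand-tracking data the plugin does expose.
public class HandDataProbe : MonoBehaviour
{
    public OVRHand hand;
    public OVRSkeleton skeleton;

    void Update()
    {
        if (hand == null || skeleton == null) return;

        // Overall tracking confidence for this hand (High or Low).
        OVRHand.TrackingConfidence conf = hand.HandConfidence;

        // Per-finger confidence, e.g. for the index finger.
        var indexConf = hand.GetFingerConfidence(OVRHand.HandFinger.Index);

        if (!hand.IsTracked || conf == OVRHand.TrackingConfidence.Low)
            Debug.Log("Hand tracking degraded: " + conf + ", index: " + indexConf);

        // Bone positions the skeleton currently reports (drawn in the editor).
        if (skeleton.IsDataValid)
        {
            foreach (var bone in skeleton.Bones)
                Debug.DrawRay(bone.Transform.position, Vector3.up * 0.005f);
        }
    }
}
```

Watching HandConfidence drop to Low when the hands overlap is a practical way to decide when to freeze or discard the reported positions instead of trusting them.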