Oculus Link + Windows Build + Handtracking + Unity
Hello! I know hand tracking works inside the Unity editor, but is it possible to: 1) build the app (just for the Windows platform) with hand tracking support in the project, 2) connect the Quest via Oculus Link, and 3) run the app and have hand tracking? I have been testing and experimenting, and it does not seem to work. Interestingly, if I move my hand outside the guardian area, the hand is detected. I guess it is not supported yet, or never will be. Does anyone know anything about this?
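A quick way to see whether the build is receiving hand-tracking data at all is to poll `OVRHand` each frame. A minimal probe sketch, assuming the Oculus Integration's `OVRHand` component (its `IsTracked` and `HandConfidence` members are part of that SDK; the script itself is illustrative):

```csharp
using UnityEngine;

// Attach to a hand anchor that carries an OVRHand component.
// Logs whether hand tracking is actually delivering data in a
// Windows (Link) build.
[RequireComponent(typeof(OVRHand))]
public class HandTrackingProbe : MonoBehaviour
{
    OVRHand hand;

    void Awake() => hand = GetComponent<OVRHand>();

    void Update()
    {
        // If Link is not forwarding hand tracking to the build,
        // IsTracked stays false even while the hand is in view.
        Debug.Log($"tracked={hand.IsTracked} confidence={hand.HandConfidence}");
    }
}
```

If `IsTracked` never turns true in the standalone Windows build while it does in the editor, the data is being dropped before it reaches the app rather than by your own code.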
Oculus Integration hand grab not working

I added a Rigidbody, BoxCollider, HandGrabInteractable, Grabbable, InteractableGroupView, and InteractableDebugVisual to a cube. When my hand reaches the cube and grabs it, the cube's color changes, which means the grab is detected correctly. However, the cube does not move. The newest Integration version has no Grabbable field in HandGrabInteractable, so the Grabbable component is never referenced and seems useless. Something is probably wrong. How can I fix this? (In the old version, you could drag the Grabbable component onto the corresponding field of HandGrabInteractable, and the cube moved when grabbed.)
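For reference, a hedged sketch of how the setup above might be wired at runtime. The component names come from the question; the namespaces and the `Inject*` call follow the Interaction SDK's dependency-injection pattern but should be verified against the installed SDK version:

```csharp
using UnityEngine;
using Oculus.Interaction;          // namespace per installed SDK version
using Oculus.Interaction.HandGrab; // namespace per installed SDK version

// Illustrative setup sketch only — exact wiring differs between
// Oculus Integration versions.
public class GrabCubeSetup : MonoBehaviour
{
    void Start()
    {
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        var rb = cube.AddComponent<Rigidbody>();
        var grabbable = cube.AddComponent<Grabbable>();
        var interactable = cube.AddComponent<HandGrabInteractable>();

        // In recent versions the Grabbable is reached through the
        // pointer-event chain rather than a serialized field, so it
        // must sit on the same GameObject or be injected explicitly.
        // Assumption: the Inject* API exposes this hook.
        interactable.InjectOptionalPointableElement(grabbable);
        interactable.InjectRigidbody(rb);
    }
}
```

The color change without movement usually means the interactable fires its events but nothing translates them into motion, which is exactly the Grabbable's job, so confirming that link is the first thing to check.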
Unable to use the hand tracking menu function

I found a few posts asking about this, but none of them went anywhere or solved the issue. Your left tracked hand has a menu function that loads on a small circle. Is there any way to actually use that in-app? Is there a callback or a button press to detect for that input? I would like to open a menu using that interface, but I cannot find an example anywhere of someone using it properly. I have gesture detection working for pinching and other things, but I would like to hook into the Meta interface to get a delayed input for opening the menu. Any help would be much appreciated. I am on Unity 2020 using the OVR v1.70 SDK.
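One handle the SDK does expose for this is `OVRHand.IsSystemGestureInProgress`. A sketch, assuming that property (it is part of the Oculus Integration) and treating the delayed-open logic and `OpenMenu` hook as illustrative:

```csharp
using UnityEngine;

// Polls the left hand's system-gesture state and opens a custom menu
// after the gesture has been held for a short delay.
public class SystemGestureMenu : MonoBehaviour
{
    public OVRHand leftHand;       // assign the left OVRHand in the Inspector
    public float holdTime = 1.0f;  // delay before opening the menu
    float timer;
    bool menuOpen;

    void Update()
    {
        if (leftHand.IsSystemGestureInProgress)
        {
            timer += Time.deltaTime;
            if (timer >= holdTime && !menuOpen)
            {
                menuOpen = true;
                OpenMenu(); // hypothetical hook — show your own menu here
            }
        }
        else
        {
            timer = 0f;
            menuOpen = false;
        }
    }

    void OpenMenu() { /* your menu logic */ }
}
```

Note the gesture itself is also consumed by the system, so this only works for a menu layered on top of, not replacing, the OS behaviour.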
[Hand Tracking] FingerTipPokeTool Index does not follow my index

Hello everyone. I created a simple scene with hand tracking and added an "Interactable Tool Creator". It creates finger-tip poke tools so I can interact with buttons, but the tip does not follow my index fingertip. Here is a video recorded on the Oculus Quest and a screenshot of my project scene. Video: https://imgur.com/a/snM3ms6
How to get the poked point in OVR handtracking?

I want to add a screen (actually a cube) to the scene, and when my hand pokes the screen, it should store the poked coordinate, in the cube's local coordinate system, in a variable. How do I get the coordinate of the point that was poked? I can't find any API to do this. Background: I'm going to make a VR remote desktop, and I'm using the Oculus Integration SDK.
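Once the hand has physics capsules, no Oculus-specific API is needed for this: standard Unity collision callbacks give the contact point in world space, and `InverseTransformPoint` converts it into the cube's local system. A minimal sketch:

```csharp
using UnityEngine;

// Attach to the "screen" cube (needs a Collider; the poking hand needs
// colliders and a Rigidbody somewhere in its hierarchy).
public class PokeTarget : MonoBehaviour
{
    public Vector3 lastPokeLocal; // poked coordinate, in the cube's local space

    void OnCollisionEnter(Collision collision)
    {
        // First contact point of this poke, in world coordinates.
        Vector3 world = collision.GetContact(0).point;

        // Convert into this cube's local coordinate system.
        lastPokeLocal = transform.InverseTransformPoint(world);
        Debug.Log($"poked at {lastPokeLocal} (local)");
    }
}
```

For a remote-desktop texture, the local X/Y of `lastPokeLocal` can then be remapped to pixel coordinates on the displayed surface.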
How to avoid pass-through of OVR hands in Oculus Integration?

I applied non-kinematic Rigidbodies to the hand visual and to the objects, and I also modified the HandPhysicsCapsules script to set isKinematic to false on the Rigidbody components of the generated capsule colliders. However, when I ran the demo, the hands themselves did not pass through one another, but they could pass through objects (though the objects also interact with the hands, i.e. they are hit and moved). And when I put my hands on the table and moved them down, they just passed through the table instead of stopping on it.
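The usual cause is that the hand transform is still being teleported to the tracked pose every frame, which overrides whatever the physics solver decides. A common workaround is to drive a separate non-kinematic "physics hand" toward the tracked pose with velocities instead. A sketch under that assumption (the script and names are illustrative, not part of the SDK):

```csharp
using UnityEngine;

// Drives a non-kinematic physics hand toward the tracked (ghost) hand
// with velocities, so collisions with a table can actually stop it.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsHandFollower : MonoBehaviour
{
    public Transform trackedHand; // the tracked hand pose to follow
    Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
        rb.isKinematic = false;
        rb.useGravity = false;
        // Fast-moving hands tunnel through thin colliders without this:
        rb.collisionDetectionMode = CollisionDetectionMode.ContinuousDynamic;
    }

    void FixedUpdate()
    {
        // Velocity that would close the gap in one physics step; the
        // solver clamps it when the hand presses against geometry.
        rb.velocity = (trackedHand.position - rb.position) / Time.fixedDeltaTime;
    }
}
```

With this split, the visual/tracked hand can still clip into the table while the physical hand (and anything it holds) stops on the surface.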
Handtracking app - true recenter API for Unity

Hi all! I'm making a hand-tracking game as a seated or standing experience. While standing, the user sometimes moves around a bit, and after a while that offset causes the game to act weird, for obvious reasons. While recentering with a controller is very easy and intuitive, doing so with hand tracking is pretty complex, so I'm doing an automatic recenter when an offset is detected. It works with Link but not on the Quest 2. Is there a way to do a recenter in Unity without moving all the game objects in the scene? In Unity, this line doesn't work on the Quest 2 (it does work with Link): OVRManager.display.RecenterPose(); With controllers it's fine, I guess, but for hand tracking I believe we need a better solution than disabling this option. Alternatively, is there a way to mark the experience as seated/standing and make this line work on the Quest 2? Many thanks!
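When `RecenterPose()` is ignored on-device, a manual recenter can be done by moving only the rig's tracking space rather than every object in the scene. A sketch, assuming the standard `OVRCameraRig` with its `trackingSpace` and `centerEyeAnchor` transforms (both part of the Oculus Integration); the method itself is illustrative:

```csharp
using UnityEngine;

public static class ManualRecenter
{
    // Re-seats the user so the head ends up at `target` (position + yaw),
    // without touching any other scene objects.
    public static void Recenter(OVRCameraRig rig, Transform target)
    {
        Transform space = rig.trackingSpace;
        Transform head = rig.centerEyeAnchor;

        // Cancel the yaw difference by rotating the tracking space
        // around the head, so the head does not swing sideways.
        float yaw = head.eulerAngles.y - target.eulerAngles.y;
        space.RotateAround(head.position, Vector3.up, -yaw);

        // Then translate the tracking space so the head sits at the target.
        space.position += target.position - head.position;
    }
}
```

Because only the tracking-space transform moves, this works the same whether the input is controllers or hands, and it sidesteps the seated/standing flag entirely.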
OVRGrabber and OVRGrabbable for hand tracking instead of controllers

Hi guys, I'm trying to get the OVRGrabber and OVRGrabbable scripts working in Unity with hand tracking instead of the controllers, but it seems that they were made for controllers only. Is there any example of how I can grab simple objects (cubes) by using or modifying these scripts? Any clue is welcome! :smile:
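OVRGrabber reads controller trigger state, but the same idea can be rebuilt on hand tracking using `OVRHand.GetFingerIsPinching`, which is part of the Oculus Integration. A minimal pinch-grab sketch (not the stock OVRGrabber; everything besides the pinch query is illustrative):

```csharp
using UnityEngine;

// Parents the nearest overlapping Rigidbody to the hand while the
// index finger pinches, and releases it when the pinch ends.
[RequireComponent(typeof(OVRHand))]
public class PinchGrabber : MonoBehaviour
{
    public float grabRadius = 0.08f; // metres around the hand anchor
    OVRHand hand;
    Rigidbody held;

    void Awake() => hand = GetComponent<OVRHand>();

    void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching && held == null)
        {
            foreach (var c in Physics.OverlapSphere(transform.position, grabRadius))
            {
                if (c.attachedRigidbody != null)
                {
                    held = c.attachedRigidbody;
                    held.isKinematic = true;             // carry without physics fighting back
                    held.transform.SetParent(transform); // follow the hand
                    break;
                }
            }
        }
        else if (!pinching && held != null)
        {
            held.transform.SetParent(null);
            held.isKinematic = false;
            held = null;
        }
    }
}
```

Attach it to the hand anchor that carries the OVRHand component; cubes just need a Collider and a Rigidbody.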
How to build multiplayer hand tracking with Oculus Quest?

Hi, I would appreciate it if someone could point me to a tutorial or documentation about multiplayer hand tracking with the Oculus Quest. Daniel's video on SideQuest showed the only app with a similar implementation that I have seen; however, I did not find any tutorials or documentation on this. I am a bit stuck on this task and couldn't find help. I see good tutorials on how to use VR controllers over the network, but not on using Oculus hand tracking in a network setup. Any insight on this would be highly appreciated. Thanks heaps.
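The controller tutorials mostly apply unchanged; the extra step for hands is serializing the skeleton pose each frame and replaying it on a remote replica. A transport-agnostic sketch, assuming `OVRSkeleton.Bones` from the Oculus Integration (the send/receive plumbing would come from whatever networking layer you use — Photon, Netcode, etc.):

```csharp
using UnityEngine;

public static class HandPoseSync
{
    // Capture the local hand's bone rotations for sending over the network.
    public static Quaternion[] CaptureHandPose(OVRSkeleton skeleton)
    {
        var pose = new Quaternion[skeleton.Bones.Count];
        for (int i = 0; i < pose.Length; i++)
            pose[i] = skeleton.Bones[i].Transform.localRotation;
        return pose; // ~24 bones × 16 bytes raw — compress/quantize before sending
    }

    // Apply a received pose to a remote replica hand skeleton.
    public static void ApplyHandPose(OVRSkeleton replica, Quaternion[] pose)
    {
        for (int i = 0; i < pose.Length && i < replica.Bones.Count; i++)
            replica.Bones[i].Transform.localRotation = pose[i];
    }
}
```

Sending local bone rotations (plus the wrist's position/rotation separately) keeps the payload small and lets each client interpolate between received poses, just as the controller tutorials do for a single transform.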