Forum Discussion
Theformand · Protege · 6 years ago
Handtracking and Unity UI
Hey everyone.
Since we don't have hand tracking in-editor via Link (guys, please tell me this is coming; I don't see how hand-tracking development will take off without it), I'm working on a small tool app to record and serialize hand poses for pose recognition in a game.
So, I would really like a way to interact with Unity UI. Raycast and click, that's all I need. I tried making my own HandTrackingRaycaster to use with a Canvas, but it doesn't seem to be working. Is there really no built-in component for this?
13 Replies
Replies have been turned off for this discussion
- Theformand (Protege): Figured it out. On startup, find the OVRInputModule and set its rayTransform to OVRHand.PointerPose. Also find the OVRRaycaster and set its .pointer to OVRHand.PointerPose. Now you can interact with UI.
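That fix can be sketched as a small setup component. This is a sketch, not the poster's exact code; it assumes the Oculus Integration package's OVRHand, OVRInputModule, and OVRRaycaster classes, and the serialized field names are illustrative:

```csharp
using UnityEngine;

// Sketch: wires a hand-tracking pointer pose into Unity UI raycasting on startup.
// Assumes the Oculus Integration package (OVRHand, OVRInputModule, OVRRaycaster).
public class HandUIPointerSetup : MonoBehaviour
{
    [SerializeField] private OVRHand hand;           // e.g. the right OVRHandPrefab
    [SerializeField] private OVRRaycaster raycaster; // the OVRRaycaster on the world-space Canvas

    private void Start()
    {
        // The OVRInputModule lives on the scene's EventSystem
        // (replacing the default Standalone Input Module).
        var inputModule = FindObjectOfType<OVRInputModule>();
        inputModule.rayTransform = hand.PointerPose;
        raycaster.pointer = hand.PointerPose.gameObject;
    }
}
```

Note that PointerPose is driven by the hand-tracking runtime, so it may not point anywhere meaningful until tracking has actually started.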
- QuacklinM (HCP Member): Can you give me a hint where to look for OVRInputModule? I tried adding it to the OVRCameraRig, but when I try to set its rayTransform to OVRHand.PointerPose I get an error.
- Theformand (Protege): When you create a Unity UI, it will create an EventSystem for you in the scene. On the EventSystem, remove the Standalone Input Module and replace it with the OVR Input Module.
- QuacklinM (HCP Member): Okay, I have added the OVR Input Module to the EventSystem and deactivated the Standalone Input Module. Then I added the OVR Raycaster to my world-space Canvas and deactivated the Graphic Raycaster.
When you wrote to assign OVRHand.PointerPose to the OVRInputModule's rayTransform, I thought you meant the HandRight or HandLeft GameObject that gets spawned by the Hands prefab. The Hands prefab does have a Hand.cs script, but it doesn't provide a PointerPose transform variable.
So I need the OVRHand.cs script somewhere, with the left or right hand selected, but I have no idea where to put it.
- AltairEON (Explorer):
I was using the Hands prefab back in the day, since I only had the old Oculus plugin and was using Unity 2018. The goal was to override the ray transform property of the OVRInputModule (which is on the EventSystem) and the ray transform of the OVRGazePointer (which you can access via its static instance). With the old Hands prefab, we need to do this by code.
You need to get the ray transform from the ray tool. The ray tool is spawned dynamically by the InteractableToolsSDKDriver prefab. I marked the right ray tool with a tag so I can access it later.
RayTool.cs:

```csharp
public void Initialize()
{
    // ...
    if (IsRightHandedTool)
    {
        gameObject.tag = "RightRayTool";
    }
}
```

I then expose the target ray from RayToolView.cs, which can be accessed from RayTool.cs.
RayToolView.cs:

```csharp
public Transform GetTargetTransform()
{
    return _targetTransform;
}
```

RayTool.cs:

```csharp
public Transform GetRay()
{
    return _rayToolView?.GetTargetTransform();
}
```

Then I use it in my overrider class accordingly.
Overrider.cs (in Awake(); make sure the tag exists in the tag asset):

```csharp
var rayTool = GameObject.FindGameObjectWithTag("RightRayTool")?.GetComponent<RayTool>();

// In a coroutine, some time later when the hands are ready:
OVRGazePointer pointer = OVRGazePointer.instance;
pointer.rayTransform = rayTool.GetRay();
ovrInput.rayTransform = rayTool.GetRay(); // NOTE: ovrInput is referenced from the component on the EventSystem object
```
- Theformand (Protege): I'm using the OTHER set of hand prefabs (it's a totally messy release by Oculus) that have OVRHand.cs attached to them. I'm NOT using the Hands prefab to spawn them.
- QuacklinM (HCP Member): Got it. But I think I have to learn some fundamentals about UI raycasting first. I thought I could use the ray tools like in the train demo scene.
- 17786578401 (Honored Guest): Can you be more specific? I want to see the whole process.
- QuacklinM (HCP Member): I started using the Windows Mixed Reality Toolkit, since there is an integration for the Quest. Native Unity UI interaction is already included, plus a lot of useful features like grabbing, scaling, and rotating: github.com/provencher/MRTKExtensionForOculusQuest
- Anonymous: Hi guys, I'm a little new to Unity, but not to programming, so I'm doing pretty well. Anyway, I would really appreciate it if you could explain this a bit more specifically. Pleaaaaase. This thread is exactly what I was looking for, but I can't seem to make it work, or I don't understand how. Thanks, guys!
- MarsAstro (Honored Guest): I've got an EventSystem with the OVRInputModule and a Canvas with the OVRRaycaster, and on my right OVRHandPrefab I've put a script with this code:

```csharp
void Start()
{
    Hand = GetComponent<OVRHand>();
    InputModule = FindObjectOfType<OVRInputModule>();
    Raycaster = FindObjectOfType<OVRRaycaster>();
    InputModule.rayTransform = Hand.PointerPose;
    Raycaster.pointer = Hand.PointerPose.gameObject;
}
```

However, it doesn't seem like I can interact with the buttons on the Canvas. Any idea what I'm doing wrong?