11-24-2023 06:47 AM
I can see that on Quest, /interaction_profiles/microsoft/hand_interaction is supported, and this lets me detect the actions /input/squeeze/value and /input/select/value.
What I can't seem to do is detect when the hand is turned towards the user such that the "menu" button is visible.
What is the "correct" way to detect this, so I can handle the menu button with OpenXR?
02-14-2024 03:39 PM
Seriously? Nothing?
05-14-2024 06:08 AM
With the Quest 3, I got the menu action working using "/interaction_profiles/khr/simple_controller", binding the two "/user/hand/{left,right}/input/menu/click" paths.
06-03-2024 01:53 AM
In the Wolvic browser we detect that gesture when we stop getting valid values for the aim pose. For devices that don't use the FB aim extension, we use the data from the hand joints to detect it instead. We haven't found a better way to do it.