Forum Discussion
rvkennedy
2 years ago · Explorer
With OpenXR and the MSFT hand interaction extension, what's the correct way to detect "Menu" button?
I can see that on Quest, /interaction_profiles/microsoft/hand_interaction is supported, and this lets me detect the actions /input/squeeze/value and /input/select/value.
What I can't seem to do is detect when the hand is turned towards the user such that the "menu" button is visible.
What is the "correct" way to detect this, so I can handle the menu button with OpenXR?
3 Replies
- rvkennedy · Explorer
Seriously? Nothing?
- cube1993 · Honored Guest
With the Quest 3, I got that action working using "/interaction_profiles/khr/simple_controller", binding the two "/user/hand/{left,right}/input/menu/click" paths.
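For reference, these are the paths cube1993's approach binds. The actual binding calls (xrStringToPath, xrSuggestInteractionProfileBindings) need a live XrInstance, so this sketch only assembles the path strings; treat it as an illustration of the binding layout, not a complete OpenXR program:

```python
# Interaction profile from cube1993's reply.
PROFILE = "/interaction_profiles/khr/simple_controller"

def menu_click_paths():
    """Build the two menu/click binding paths, one per hand.

    In real code each string would be converted with xrStringToPath
    and passed as XrActionSuggestedBinding entries to
    xrSuggestInteractionProfileBindings for PROFILE above.
    """
    return [f"/user/hand/{side}/input/menu/click" for side in ("left", "right")]
```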
- pelouro · Explorer
In the Wolvic browser we detect that gesture when we stop getting valid values for aim. For devices that don't support the FB aim extension, we use the hand-joint data to detect it instead. We haven't found a better way to do it.
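For anyone implementing pelouro's joint-based fallback, a minimal sketch of the usual heuristic: check whether the palm normal points toward the head. The function names and vector inputs here are hypothetical; in a real app the palm pose would come from XR_EXT_hand_tracking joint data and the head pose from the VIEW reference space:

```python
import math

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def palm_faces_head(palm_pos, palm_normal, head_pos, threshold=0.7):
    """Return True when the palm roughly faces the user's head.

    palm_pos / head_pos: world-space (x, y, z) positions.
    palm_normal: unit vector pointing out of the palm, derived from
    the palm joint's orientation.
    threshold: cosine cutoff; 0.7 accepts roughly a 45-degree cone.
    """
    to_head = _normalize(tuple(h - p for h, p in zip(head_pos, palm_pos)))
    return _dot(_normalize(palm_normal), to_head) > threshold
```

A production version would also smooth the result over a few frames to avoid flicker at the threshold boundary.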