With OpenXR and the MSFT hand interaction extension, what's the correct way to detect "Menu" button?
I can see that on Quest, interaction_profiles/microsoft/hand_interaction is supported, and this lets me detect the actions /input/squeeze/value and /input/select/value. What I can't seem to do is detect when the hand is turned towards the user such that the "menu" button is visible. What is the "correct" way to detect this, so I can handle the menu button with OpenXR?
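For reference, one workaround I've considered (not something the microsoft/hand_interaction profile exposes, as far as I can tell) is to use XR_EXT_hand_tracking directly and test whether the palm joint faces the head pose, roughly the condition under which the menu affordance shows up. Below is a minimal sketch of that idea; IsPalmFacingUser and RotateByQuat are hypothetical helpers, thresholdDegrees is an arbitrary tolerance, and it assumes the palm joint and the VIEW-space pose are located in the same reference space and that +Y of each hand joint points out of the back of the hand (so the palm faces along -Y).

```cpp
// Sketch: decide whether the palm faces the user's head. Assumes
// XR_EXT_hand_tracking is enabled and both poses were located in the
// same reference space at the same predicted time.
#include <openxr/openxr.h>
#include <cmath>

// Rotate a vector by a quaternion (q * v * q^-1), no math library assumed.
static XrVector3f RotateByQuat(const XrQuaternionf& q, const XrVector3f& v) {
    XrVector3f u{q.x, q.y, q.z};
    // t = 2 * cross(q.xyz, v)
    XrVector3f t{2.0f * (u.y * v.z - u.z * v.y),
                 2.0f * (u.z * v.x - u.x * v.z),
                 2.0f * (u.x * v.y - u.y * v.x)};
    // v' = v + q.w * t + cross(q.xyz, t)
    return {v.x + q.w * t.x + (u.y * t.z - u.z * t.y),
            v.y + q.w * t.y + (u.z * t.x - u.x * t.z),
            v.z + q.w * t.z + (u.x * t.y - u.y * t.x)};
}

// palmJoint: XR_HAND_JOINT_PALM_EXT location from xrLocateHandJointsEXT.
// headPose:  the VIEW space pose located in the same reference space.
// thresholdDegrees: how directly the palm must face the head (assumed value).
bool IsPalmFacingUser(const XrHandJointLocationEXT& palmJoint,
                      const XrPosef& headPose,
                      float thresholdDegrees = 60.0f) {
    if (!(palmJoint.locationFlags & XR_SPACE_LOCATION_ORIENTATION_VALID_BIT) ||
        !(palmJoint.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT)) {
        return false;
    }
    // Palm normal: -Y of the palm joint, per the hand-tracking joint convention.
    XrVector3f palmNormal =
        RotateByQuat(palmJoint.pose.orientation, {0.0f, -1.0f, 0.0f});

    // Normalized direction from the palm towards the head.
    XrVector3f toHead{headPose.position.x - palmJoint.pose.position.x,
                      headPose.position.y - palmJoint.pose.position.y,
                      headPose.position.z - palmJoint.pose.position.z};
    float len = std::sqrt(toHead.x * toHead.x + toHead.y * toHead.y +
                          toHead.z * toHead.z);
    if (len < 1e-4f) return false;
    toHead = {toHead.x / len, toHead.y / len, toHead.z / len};

    // Palm counts as "facing the user" when the angle between the palm
    // normal and the palm-to-head direction is under the threshold.
    float cosAngle = palmNormal.x * toHead.x + palmNormal.y * toHead.y +
                     palmNormal.z * toHead.z;
    return cosAngle > std::cos(thresholdDegrees * 3.14159265f / 180.0f);
}
```

This only approximates the runtime's own heuristic, so if there is a proper action or extension that reports the menu gesture directly, I'd prefer that.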