With OpenXR and the MSFT hand interaction extension, what's the correct way to detect "Menu" button?

rvkennedy
Explorer

I can see that on Quest, /interaction_profiles/microsoft/hand_interaction is supported, and this lets me detect the actions /input/squeeze/value and /input/select/value.

What I can't seem to do is detect when the hand is turned towards the user such that the "menu" button is visible.

What is the "correct" way to detect this, so I can handle the menu button with OpenXR?


rvkennedy
Explorer

Seriously? Nothing?

cube1993
Honored Guest

With the Quest 3, I got that action working using "/interaction_profiles/khr/simple_controller", binding the two "/user/hand/{left,right}/input/menu/click" paths.
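To illustrate, here is a minimal (untested) sketch of what that binding suggestion could look like in C. It assumes an `XrInstance` and a boolean `XrAction` (hypothetically named `menu_action`, created with `XR_ACTION_TYPE_BOOLEAN_INPUT`) already exist; error-code checking is omitted for brevity.

```c
#include <openxr/openxr.h>

/* Sketch: suggest menu/click bindings on the KHR simple controller
   profile, which (per the reply above) the Quest runtime also drives
   from hand tracking. */
void suggest_menu_bindings(XrInstance instance, XrAction menu_action)
{
    XrPath profile, left_menu, right_menu;
    xrStringToPath(instance, "/interaction_profiles/khr/simple_controller", &profile);
    xrStringToPath(instance, "/user/hand/left/input/menu/click", &left_menu);
    xrStringToPath(instance, "/user/hand/right/input/menu/click", &right_menu);

    XrActionSuggestedBinding bindings[2] = {
        { menu_action, left_menu },
        { menu_action, right_menu },
    };
    XrInteractionProfileSuggestedBinding suggested = {
        .type = XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING,
        .interactionProfile = profile,
        .countSuggestedBindings = 2,
        .suggestedBindings = bindings,
    };
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```

After this, the action is read each frame with xrGetActionStateBoolean as usual; whether the runtime actually fires menu/click from the palm-up hand gesture is runtime behaviour you'd want to verify on-device.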

pelouro
Explorer

In the Wolvic browser we detect that gesture when we stop getting valid values for the aim pose. For devices that don't support the FB aim extension, we use the hand-joint data to detect it instead. We haven't found a better way to do it.