With OpenXR and the MSFT hand interaction extension, what's the correct way to detect the "Menu" button?
I can see that on Quest, interaction_profiles/microsoft/hand_interaction is supported, and this lets me detect the actions /input/squeeze/value and /input/select/value. What I can't seem to do is detect when the hand is turned towards the user such that the "menu" button is visible. What is the "correct" way to detect this, so I can handle the menu button with OpenXR?
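For context, the binding setup described above looks roughly like this. This is a minimal sketch, not verbatim from my app: it assumes an existing XrInstance and XrActionSet, the action name hand_select is illustrative, and error handling is omitted.

```c
#include <string.h>
#include <openxr/openxr.h>

// Sketch: suggest bindings for the MSFT hand interaction profile.
// `instance` and `action_set` are assumed to be created elsewhere.
void suggest_hand_interaction_bindings(XrInstance instance, XrActionSet action_set) {
    // A float action driven by the pinch strength of either hand.
    XrActionCreateInfo action_info = { XR_TYPE_ACTION_CREATE_INFO };
    action_info.actionType = XR_ACTION_TYPE_FLOAT_INPUT;
    strcpy(action_info.actionName,          "hand_select");
    strcpy(action_info.localizedActionName, "Hand Select");
    XrAction select_action;
    xrCreateAction(action_set, &action_info, &select_action);

    // Per the XR_MSFT_hand_interaction spec, the profile only exposes
    // select/squeeze values and aim/grip poses -- there is no bindable
    // path for the palm-up "menu" gesture.
    XrPath profile_path, left_select, right_select;
    xrStringToPath(instance, "/interaction_profiles/microsoft/hand_interaction", &profile_path);
    xrStringToPath(instance, "/user/hand/left/input/select/value",  &left_select);
    xrStringToPath(instance, "/user/hand/right/input/select/value", &right_select);

    XrActionSuggestedBinding bindings[] = {
        { select_action, left_select  },
        { select_action, right_select },
    };
    XrInteractionProfileSuggestedBinding suggested = { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
    suggested.interactionProfile     = profile_path;
    suggested.suggestedBindings      = bindings;
    suggested.countSuggestedBindings = sizeof(bindings) / sizeof(bindings[0]);
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```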
OpenXR Hand Tracking EXT
So I've got code that uses the OpenXR hand tracking extension, and it works perfectly with HoloLens 2 and the Leap Motion layer. On Quest, I can see the extension, xrGetSystemProperties reports it as available, I have the manifest settings configured, and the app does not ask me to switch to controllers, but XrHandJointLocationsEXT::isActive always reports false when calling xrLocateHandJointsEXT! Any thoughts as to what I'm missing here? Is there some additional configuration I may have missed, or is this perhaps still buggy? The engine is custom (StereoKit), so it's native code. I'm using the 2020-12-04 (OpenXR 1.0.12) release of the Oculus OpenXR Mobile SDK.
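For reference, the call pattern that's failing looks roughly like this. It's a minimal sketch assuming a valid XrInstance, XrSession, XrSpace, and frame time, with XR_EXT_hand_tracking enabled at instance creation; error checks are omitted.

```c
#include <openxr/openxr.h>

// Sketch: create a hand tracker and locate its joints each frame.
// `instance`, `session`, `base_space`, and `time` are assumed valid,
// and XR_EXT_hand_tracking was enabled when creating the instance.
void check_hand_tracking(XrInstance instance, XrSession session,
                         XrSpace base_space, XrTime time) {
    // Extension functions aren't exported by the loader; load them manually.
    PFN_xrCreateHandTrackerEXT create_hand_tracker = NULL;
    PFN_xrLocateHandJointsEXT  locate_hand_joints  = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&create_hand_tracker);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction*)&locate_hand_joints);

    XrHandTrackerCreateInfoEXT create_info = { XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT };
    create_info.hand         = XR_HAND_LEFT_EXT;
    create_info.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT tracker;
    create_hand_tracker(session, &create_info, &tracker);

    XrHandJointLocationEXT  joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = { XR_TYPE_HAND_JOINT_LOCATIONS_EXT };
    locations.jointCount     = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locate_info = { XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT };
    locate_info.baseSpace = base_space;
    locate_info.time      = time; // predicted display time from xrWaitFrame

    locate_hand_joints(tracker, &locate_info, &locations);
    // locations.isActive is what's always XR_FALSE on Quest here.
}
```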