Sorry to bump an old thread, but I managed to get hello_xr running on Quest. It took me a while since I couldn't find any detailed, up-to-date instructions and I'm pretty new to Android. I wrote up the process in case anyone finds it useful:
We have added documentation to the Oculus developer site on how to use hello_xr: https://developer.oculus.com/documentation/native/android/mobile-build-run-hello-xr-app/
Hopefully these docs will be useful for some folks.
If it's of any use to anyone out there, I forked the official OpenXR hello_xr sample to build against the OpenXR loader library (.so files) for standalone Quest, as well as against the OVR SDK libraries on PC.
I also added rudimentary implementations of other Meta OpenXR extensions, such as Link Sharpening, local dimming (Quest Pro), eye tracking (Quest Pro), and body tracking (all Quests, including on PC, though not yet working there, as you know).
Some of these are Android-only (like the compositor flags for sharpening, or the frame-end struct that turns on local dimming), but hopefully this helps others out there who may be curious about cross-platform (PC vs. Android) native C++ OpenXR functionality. In the CMake files I set the include dirs for the Meta-specific extension headers from a system environment variable, OCULUS_OPENXR_MOBILE_SDK (e.g. D:\ovr_openxr_mobile_sdk_46.0 on my machine).
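For reference, the environment-variable trick looks roughly like this in CMakeLists.txt. This is a sketch, not the fork's actual build script: the subdirectory layout under the SDK root is an assumption and may differ between Mobile SDK versions.

```cmake
# Sketch: locate the Meta OpenXR Mobile SDK via an environment variable so the
# Meta-specific extension headers can be found at configure time.
if(DEFINED ENV{OCULUS_OPENXR_MOBILE_SDK})
    set(OCULUS_OPENXR_MOBILE_SDK "$ENV{OCULUS_OPENXR_MOBILE_SDK}")
    # NOTE: these subdirectories are an assumption; adjust for your SDK version.
    include_directories(
        "${OCULUS_OPENXR_MOBILE_SDK}/OpenXR/Include"
        "${OCULUS_OPENXR_MOBILE_SDK}/3rdParty/khronos/openxr/OpenXR-SDK/include"
    )
else()
    message(WARNING
        "OCULUS_OPENXR_MOBILE_SDK is not set; Meta extension headers will not be found")
endif()
```

Using an environment variable keeps machine-specific SDK paths out of the repo, at the cost of every contributor having to set it themselves.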
I also pre-built hello_xr.exe and pushed it to the root folder for anyone who wants to try eye tracking on PC with their Quest Pro. There are pics of this working in the repo. I'm just rendering various cubes to show the gaze poses/directions, using each eye's view as a local-to-world transform applied to the gaze direction returned by the XR extension. For body tracking, I just render a single small cube at each joint. It works pretty well for the hands, elbows, and waist, but the feet appear to be fused together and there are no knees yet (from what I could tell).

I know you guys are working on this stuff and it's all experimental, but it's very exciting! It'd be great if all Quests could get decent and, most importantly, reliable full-body poses extracted just from the controllers (or hand tracking) plus the HMD pose. Then we could all build apps that rely on it as a base assumption, and that would push the entire VR industry forward tremendously. Great job so far, btw! Looking forward to the next steps.
I am curious about Local Dimming through Link / AirLink: is that enabled by default? Or will it ever be? LD isn't exposed as an OpenXR extension through Link yet, but honestly, I'd rather it weren't, and were instead simply toggleable through the debug tool, like Link Sharpening is, so we can all use it (or not) in every game.