Forum Discussion
HankCN
5 years ago · Honored Guest
How To Build Hello_Xr sample with Android Version
Hi, I want to learn how to write a VR app with the OpenXR SDK.
I found that Oculus has published an OpenXR Mobile SDK, but I don't know how to build the hello_xr sample in the OpenXR SDK for Android.
Does anyone know how to build this sample with the Oculus OpenXR Mobile SDK?
I appreciate your help.
13 Replies
- truedatiotone · Honored Guest
I am also looking into this; it's not clear why there isn't a more complete sample. I will post once I have a chance to re-review the instructions provided with the SDK.
- bebop_dev · Protege
I also ran into issues setting up the hello_xr sample project. I noticed that hello_xr is missing Android build instructions in BUILDING.md:
https://github.com/KhronosGroup/OpenXR-SDK-Source/blob/master/BUILDING.md
From the project layout of hello_xr, it appears you need the Visual Studio Mobile Development with C++ tools, but this isn't documented anywhere that I could find. I installed them and tried to set up an Android native project using the hello_xr source code, but I was getting build errors.
- Anonymous
Sorry to bump an old thread but I managed to get hello_xr running on Quest. It took me a while since I couldn't find any detailed/up-to-date instructions around and I'm pretty new to Android. I wrote up the process in case anyone finds it useful:
https://gayanediriweera.github.io/code/2021/04/06/how-to-run-helloxr-on-oculus-quest.html
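For anyone who just wants the shape of the process, it boils down to something like the following (the Gradle task name and APK path here are guesses based on the stock OpenXR-SDK-Source layout and may have changed between versions, so treat this as a sketch rather than exact instructions):

```shell
# Clone the Khronos OpenXR SDK sources (hello_xr lives under src/tests).
git clone https://github.com/KhronosGroup/OpenXR-SDK-Source.git
cd OpenXR-SDK-Source/src/tests/hello_xr

# hello_xr ships a Gradle build with OpenGLES and Vulkan flavors;
# it needs the Android SDK/NDK installed and ANDROID_HOME set.
../../../gradlew assembleOpenGLESDebug

# Install on a Quest connected over adb (the exact APK path under
# build/outputs/apk may differ by SDK version).
adb install -r build/outputs/apk/OpenGLES/debug/hello_xr-OpenGLES-debug.apk
```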
- MrGoodKill · Explorer
Thanks a lot, no way I would have gotten the sample working on the quest without your blog post!
- shram86 · Honored Guest
Thanks so much Lepton!
- BattleAxeVR · Adventurer
Over a year later and the official maintainers of that sample still haven't fixed it. SMH.
- johnkearney · Meta Employee
Hi all,
We have added documentation to the Oculus developer site for how to use hello_xr: https://developer.oculus.com/documentation/native/android/mobile-build-run-hello-xr-app/
Hopefully these docs will be useful for some folks.
John
- BattleAxeVR · Adventurer
If it's of any use to anyone out there, I forked the official OpenXR hello_xr sample to build against the OpenXR loader library (.so files) for standalone Quest, as well as against the OVR SDK libraries on PC.
I also added basic implementations of other Meta OpenXR extensions, such as Link Sharpening, local dimming (Pro), eye tracking (Pro), and body tracking (all Quests, including on PC, though not yet working there, as you know).
Some of these are Android-only (such as the compositor flags for sharpening, or the frame-end struct that turns on local dimming), but hopefully this helps others who are curious about cross-platform PC vs. Android native C++ OpenXR functionality. In the CMake build, I pick up the include dirs for the Meta-specific extensions from the OCULUS_OPENXR_MOBILE_SDK environment variable (for example, D:\ovr_openxr_mobile_sdk_46.0 on my machine).
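The environment-variable approach looks roughly like this in CMake (the subdirectory names under the SDK root are assumptions and may differ per SDK version, so check your local unzip):

```cmake
# Sketch: pull in Meta OpenXR Mobile SDK headers from an environment
# variable. Directory names below are assumptions, not documented paths.
if(DEFINED ENV{OCULUS_OPENXR_MOBILE_SDK})
    set(OCULUS_OPENXR_MOBILE_SDK $ENV{OCULUS_OPENXR_MOBILE_SDK})
    target_include_directories(hello_xr PRIVATE
        ${OCULUS_OPENXR_MOBILE_SDK}/OpenXR/Include
        ${OCULUS_OPENXR_MOBILE_SDK}/3rdParty/khronos/openxr/OpenXR-SDK/include)
else()
    message(WARNING
        "OCULUS_OPENXR_MOBILE_SDK not set; Meta-specific extensions disabled")
endif()
```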
https://github.com/BattleAxeVR/OpenXR-SDK-Source
- BattleAxeVR · Adventurer
I also pre-built hello_xr.exe and pushed it to the root folder for anyone who wants to try eye tracking on PC with their Pro; there are pictures of it working in the repo. I'm just using various cubes to show the poses and directions of the gazes, treating each eye's view as a local-to-world transform applied to the gaze directions returned by the XR extension.
For body tracking, I just render a single small cube at each joint. It works pretty well for hands, elbows, and waist, but the feet appear to be fused together and there are no knees yet, from what I could tell. I know you guys are working on this stuff and it's all experimental, but it's very exciting. It would be great if all the Quests could extract decent, and most importantly reliable, full-body poses from just the controllers (or hand tracking) plus the HMD pose. Then we could all build apps that rely on that as a base assumption, and that would push the entire VR industry forward tremendously. Great job so far, by the way; looking forward to the next steps.
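For anyone curious, the "cube along the gaze" placement is just basic pose math: rotate the pose's forward axis (-Z in OpenXR) by the gaze orientation and step along it. A minimal sketch (the structs and helper names here are mine, not from the sample):

```cpp
#include <cassert>
#include <cmath>

struct Quat { float x, y, z, w; };
struct Vec3 { float x, y, z; };

// Rotate a vector by a unit quaternion: v' = v + w*t + q.xyz x t,
// where t = 2 * (q.xyz x v).
Vec3 Rotate(const Quat& q, const Vec3& v) {
    Vec3 t{2.0f * (q.y * v.z - q.z * v.y),
           2.0f * (q.z * v.x - q.x * v.z),
           2.0f * (q.x * v.y - q.y * v.x)};
    return Vec3{v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
                v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
                v.z + q.w * t.z + (q.x * t.y - q.y * t.x)};
}

// In OpenXR a pose's "forward" is -Z; place a marker cube a given
// distance along the gaze ray, in the same space as the input pose.
Vec3 PointAlongGaze(const Vec3& origin, const Quat& orientation, float dist) {
    Vec3 fwd = Rotate(orientation, Vec3{0.0f, 0.0f, -1.0f});
    return Vec3{origin.x + fwd.x * dist,
                origin.y + fwd.y * dist,
                origin.z + fwd.z * dist};
}
```

The joint cubes are the same idea with zero distance: just render at each joint pose's position and orientation.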
I am also curious about local dimming through Link / Air Link: is it enabled by default, and will it ever be? Local dimming isn't exposed as an OpenXR extension through Link yet, but honestly I'd rather it weren't, and were instead simply toggleable through the debug tool, like Link Sharpening is, so we can all use it (or not) in every game.
- johnkearney · Meta Employee
BattleAxeVR, thanks for sharing this sample with the community.
I haven't looked at it in great detail (it's a little hard to see your changes over the stock OpenXR hello_xr), but I think it is worth saying that `XR_FB_eye_tracking_social` has a very specific use-case: supplying an app with eye orientations for the purpose of social interactions. For interactions with objects in the environment, it is best to use `XR_EXT_eye_gaze_interaction`. For eye-tracked foveated rendering, it is best to use `XR_META_foveation_eye_tracked`.
The release notes for our v46 SDK includes the following description:
> Introduced XR_FB_eye_tracking_social extension to enable eye tracking in social applications, distinct from the existing XR_EXT_eye_gaze_interaction extension which is meant for enabling interaction with eye gaze. The XR_FB_eye_tracking_social extension provides you with the gaze direction of each of the eyes along with their respective positions. Because social applications rely on fixating the gaze on a point/person in space, the gaze output from both eyes is temporally smoothed and necessarily converges in front of the user.
In summary, while these extensions are all ultimately based on the same raw data, the data presented to the application is optimized for each extension's use-case.
- BattleAxeVR · Adventurer
Thanks for the tip. The extension I used works fine for my actual purpose, which is to shoot objects out of my eyes and interact with the world, select objects, etc. But for foveated rendering (which I will implement next), it's better and cleaner to get the screen-space X,Y coordinates of each eye directly, saving a few instructions (and probably some error) over projecting the gaze poses from local space back to 2D screen space. I'll definitely implement that other extension, thanks!
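That local-to-screen-space projection can be sketched as below, assuming a symmetric perspective projection for simplicity (real XR views use the asymmetric per-eye FOVs returned by xrLocateViews, and all names here are mine, not from any SDK):

```cpp
#include <cassert>
#include <cmath>

struct ScreenUV { float u, v; };

// Project a view-space gaze point into normalized [0,1] screen
// coordinates. View space looks down -Z, so z must be negative
// (i.e. the point is in front of the eye).
ScreenUV ProjectToScreen(float x, float y, float z,
                         float vfovRadians, float aspect) {
    float f = 1.0f / std::tan(vfovRadians * 0.5f);  // focal scale
    float ndcX = (f / aspect) * x / -z;             // [-1, 1]
    float ndcY = f * y / -z;                        // [-1, 1]
    // Map NDC to [0, 1] UV space.
    return ScreenUV{(ndcX + 1.0f) * 0.5f, (ndcY + 1.0f) * 0.5f};
}
```

A point straight ahead of the eye lands at (0.5, 0.5), the center of the view, which is a handy sanity check before feeding the result into a foveation setup.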
- Starkium · Protege
Just wanted to pop in here and say I'm taking a look at the hello_xr demo as well. It's 2025, so let's see if all the build instructions are still valid.