How to build the hello_xr sample for Android

HankCN
Level 2
Hi, I want to learn how to write a VR app with the OpenXR SDK.
I found that Oculus has published the OpenXR Mobile SDK.
However, I don't know how to build the hello_xr sample from the OpenXR SDK for Android.
Does anyone know how to build this sample with the Oculus OpenXR Mobile SDK?
I appreciate your help.
13 REPLIES

@BattleAxeVR - thanks for sharing this sample with the community.

I haven't looked at it in great detail (it's a little hard to see your changes over the stock OpenXR hello_xr), but it is worth saying that `XR_FB_eye_tracking_social` has a very specific use case: supplying an app with eye orientations for the purpose of social interaction. For interacting with objects in the environment, it is best to use `XR_EXT_eye_gaze_interaction`. For eye-tracked foveated rendering, it is best to use `XR_META_foveation_eye_tracked`.

The release notes for our v46 SDK include the following description:

> Introduced XR_FB_eye_tracking_social extension to enable eye tracking in social applications, distinct from the existing XR_EXT_eye_gaze_interaction extension, which is meant for enabling interaction with eye gaze. The XR_FB_eye_tracking_social extension provides you with the gaze direction of each of the eyes along with their respective positions. Because social applications rely on fixating the gaze on a point/person in space, the gaze output from both eyes is temporally smoothed and necessarily converges in front of the user.

In summary, while these extensions are all ultimately based on the same raw data, the data presented to the app is optimized for each extension's use case.

Thanks for the tip. The extension I used works fine for my actual purpose, which is to shoot objects out of my eyes to interact with the world, select objects, and so on. For foveated rendering (which I will implement next), though, it would be better and cleaner to get the screen-space X,Y coordinates of each eye directly: that saves a few instructions (and probably some error) projecting the gaze poses back from local space to screen space in 2D. I will definitely implement that other extension, thanks!
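For reference, that projection step is a small amount of math. The sketch below is illustrative only, assuming OpenXR view-space conventions (+X right, +Y up, -Z forward) and the asymmetric per-eye half-angles as found in `XrFovf`; the function name and parameters are not part of any SDK:

```python
import math

def gaze_to_screen_uv(gaze_dir, angle_left, angle_right, angle_up, angle_down):
    """Project a view-space gaze direction onto normalized [0, 1] screen UVs.

    gaze_dir: (x, y, z) in view space, -Z forward (OpenXR convention).
    angle_*:  per-eye frustum half-angles in radians, as in XrFovf
              (angle_left and angle_down are typically negative).
    """
    x, y, z = gaze_dir
    if z >= 0.0:
        raise ValueError("gaze points away from the image plane")
    # Tangent-space coordinates on the z = -1 image plane.
    tx = x / -z
    ty = y / -z
    tan_l, tan_r = math.tan(angle_left), math.tan(angle_right)
    tan_u, tan_d = math.tan(angle_up), math.tan(angle_down)
    # Map into [0, 1]; v = 0 at the top of the image.
    u = (tx - tan_l) / (tan_r - tan_l)
    v = (tan_u - ty) / (tan_u - tan_d)
    return u, v
```

With a symmetric 90° FOV, a straight-ahead gaze of (0, 0, -1) lands at the image center, (0.5, 0.5).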

Hi John, I tried following your suggestions to use the XR_EXT_eye_gaze_interaction or XR_META_foveation_eye_tracked extensions, but they aren't showing up in the list of extensions over Link / Air Link. Will I be able to access the unfiltered / raw eye coordinates so I can plug them into my engine's foveated renderer?

Here are all the extensions reported by the latest v47 firmware on a Quest Pro, together with the public-beta-channel Oculus Home app:

[20:48:31.440][Info ] Available Extensions: (46)
[20:48:31.440][Info ] Name=XR_KHR_D3D11_enable SpecVersion=9
[20:48:31.441][Info ] Name=XR_KHR_D3D12_enable SpecVersion=9
[20:48:31.441][Info ] Name=XR_KHR_opengl_enable SpecVersion=10
[20:48:31.441][Info ] Name=XR_KHR_vulkan_enable SpecVersion=8
[20:48:31.441][Info ] Name=XR_KHR_vulkan_enable2 SpecVersion=2
[20:48:31.441][Info ] Name=XR_KHR_composition_layer_depth SpecVersion=6
[20:48:31.441][Info ] Name=XR_KHR_win32_convert_performance_counter_time SpecVersion=1
[20:48:31.441][Info ] Name=XR_KHR_convert_timespec_time SpecVersion=1
[20:48:31.441][Info ] Name=XR_KHR_composition_layer_cube SpecVersion=8
[20:48:31.441][Info ] Name=XR_KHR_composition_layer_cylinder SpecVersion=4
[20:48:31.442][Info ] Name=XR_KHR_composition_layer_equirect SpecVersion=3
[20:48:31.442][Info ] Name=XR_KHR_visibility_mask SpecVersion=2
[20:48:31.442][Info ] Name=XR_KHR_composition_layer_color_scale_bias SpecVersion=5
[20:48:31.442][Info ] Name=XR_EXT_win32_appcontainer_compatible SpecVersion=1
[20:48:31.442][Info ] Name=XR_OCULUS_recenter_event SpecVersion=1
[20:48:31.442][Info ] Name=XR_OCULUS_audio_device_guid SpecVersion=1
[20:48:31.442][Info ] Name=XR_FB_color_space SpecVersion=3
[20:48:31.442][Info ] Name=XR_FB_display_refresh_rate SpecVersion=1
[20:48:31.443][Info ] Name=XR_OCULUS_ovrsession_handle SpecVersion=1
[20:48:31.443][Info ] Name=XR_OCULUS_perf_stats SpecVersion=1
[20:48:31.443][Info ] Name=XR_EXT_hand_tracking SpecVersion=4
[20:48:31.443][Info ] Name=XR_FB_hand_tracking_aim SpecVersion=2
[20:48:31.443][Info ] Name=XR_FB_hand_tracking_capsules SpecVersion=3
[20:48:31.443][Info ] Name=XR_FB_hand_tracking_mesh SpecVersion=3
[20:48:31.443][Info ] Name=XR_FB_body_tracking SpecVersion=1
[20:48:31.443][Info ] Name=XR_FB_eye_tracking_social SpecVersion=1
[20:48:31.444][Info ] Name=XR_FB_face_tracking SpecVersion=1
[20:48:31.444][Info ] Name=XR_FB_keyboard_tracking SpecVersion=1
[20:48:31.444][Info ] Name=XR_FB_passthrough SpecVersion=3
[20:48:31.444][Info ] Name=XR_FB_triangle_mesh SpecVersion=2
[20:48:31.444][Info ] Name=XR_FB_render_model SpecVersion=3
[20:48:31.444][Info ] Name=XR_FBX1_plane SpecVersion=1
[20:48:31.444][Info ] Name=XR_FB_spatial_entity_container SpecVersion=2
[20:48:31.445][Info ] Name=XR_FB_scene SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_spatial_entity SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_spatial_entity_storage SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_spatial_entity_query SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_spatial_entity_user SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_spatial_entity_storage_batch SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_spatial_entity_sharing SpecVersion=1
[20:48:31.445][Info ] Name=XR_FB_haptic_amplitude_envelope SpecVersion=1
[20:48:31.446][Info ] Name=XR_FB_haptic_pcm SpecVersion=1
[20:48:31.446][Info ] Name=XR_FB_touch_controller_pro SpecVersion=1
[20:48:31.446][Info ] Name=XR_FB_touch_controller_proximity SpecVersion=1
[20:48:31.446][Info ] Name=XR_FBX1_touch_controller_extras SpecVersion=1
[20:48:31.446][Info ] Name=XR_EXT_debug_utils SpecVersion=4
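As a sketch of the gating logic this log feeds into: once the runtime's extension list has been enumerated (via `xrEnumerateInstanceExtensionProperties` in a real app), checking for a required extension is a simple name lookup. The helper below is illustrative and works on plain name strings, with a shortened slice of the list above:

```python
def missing_extensions(required, available):
    """Return the required extension names absent from the runtime's list."""
    available_set = set(available)
    return [name for name in required if name not in available_set]

# A few of the names reported over Link in the log above.
available = [
    "XR_FB_eye_tracking_social",
    "XR_EXT_hand_tracking",
    "XR_FB_passthrough",
]
required = ["XR_EXT_eye_gaze_interaction", "XR_META_foveation_eye_tracked"]
```

Run against the Link list, both requested extensions come back missing, matching the behavior described in the post.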

If Meta has no plans to let me use the raw eye-tracking data on PC, I need to know before mid-January, because I will return my Quest Pro for a refund. I develop for both PC VR and Quest, and I can't justify keeping this device if I can't access all of its features through code, including on PC. I bought the Quest Pro specifically for eye tracking, and while the social eye-tracking extension works, if its output is filtered and thus not suitable for implementing foveated rendering (as you said yourself), I can't justify keeping it, sorry. It also appears that the Quest 3 may not support eye tracking at all, so it seems pointless to bother with the Quest Pro, to be honest. As much as I enjoy the headset, at this price I need it to work fully.

To be clear about what I need: just the 2D screen-space X,Y coordinates, per eye, of where each eye is currently focused. I know I can project the gaze vectors from the social extension to get those coordinates, but as you say, if the data is filtered for social use cases, that will reduce its suitability for real-time foveated rendering. I'm also not using rasterization; I need the raw 2D coordinates so I can feed them into my path tracer. I don't want or need any underlying platform-level, rasterization-based foveated rendering, just the data. But the XR_EXT_eye_gaze_interaction and XR_META_foveation_eye_tracked extensions are not listed, and those are what I need, I think (or just XR_META_foveation_eye_tracked; I'm not sure).
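For the path-tracing use case, one common approach (a generic technique, not anything Meta-specific) is to scale per-pixel sample counts by distance from the gaze point in UV space. The falloff shape and all constants below are assumptions for illustration:

```python
def samples_per_pixel(pixel_uv, gaze_uv, base_spp=8, min_spp=1,
                      inner_radius=0.1, outer_radius=0.4):
    """Full sample count inside inner_radius of the gaze point,
    falling off linearly to min_spp at outer_radius and beyond."""
    du = pixel_uv[0] - gaze_uv[0]
    dv = pixel_uv[1] - gaze_uv[1]
    d = (du * du + dv * dv) ** 0.5
    if d <= inner_radius:
        return base_spp
    if d >= outer_radius:
        return min_spp
    t = (d - inner_radius) / (outer_radius - inner_radius)
    return max(min_spp, round(base_spp - t * (base_spp - min_spp)))
```

The gaze UV fed in here would come from projecting the per-eye gaze pose to screen space, which is exactly why having the runtime hand back 2D coordinates directly would be cleaner.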
