How to mix usage of ARFoundation and Meta XR SDKs?

Honored Guest

I'm developing a new app that needs to run on Apple Vision Pro, Meta Quest Pro, and Meta Quest 3.

I've got the application running successfully on each of the platforms using ARFoundation APIs and the Meta OpenXR SDK.

Now I'm looking to incorporate Quest Pro's eye and face tracking capabilities. Is it possible to do that without switching over the existing features (head tracking, hand tracking, interactions) from OpenXR to the `com.meta.xr.sdk` equivalents?

I'm particularly confused because the official Movement sample repo says "Unity-Movement is a package that uses OpenXR's tracking layer APIs to expose Meta Quest Pro's Body Tracking (BT), Eye Tracking (ET), and Face Tracking (FT) capabilities," which sounds like I should be able to get the tracking data through OpenXR APIs. In practice, though, it seems I can't, and that I have to use the `com.meta.xr.sdk.interaction` APIs instead.
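To make the question concrete, here's a rough sketch of the setup I'm hoping is possible: the existing ARFoundation rig stays untouched, and the Meta-specific tracking is layered on as extra components from `com.meta.xr.sdk.core` (`OVRFaceExpressions` and its `FaceExpression` blendshapes), guarded with null checks so the same scene still runs on Vision Pro where those components don't exist. The component wiring below is my assumption of how this would look, not something I've confirmed works:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class MixedTrackingRig : MonoBehaviour
{
    // Existing cross-platform pieces (ARFoundation over OpenXR) stay as-is.
    [SerializeField] private ARSession arSession;

    // Quest-Pro-only piece (Meta XR Core SDK). Left unassigned on
    // non-Meta platforms so the guard below skips it entirely.
    [SerializeField] private OVRFaceExpressions faceExpressions;

    void Update()
    {
        // ValidExpressions is false when face tracking isn't available,
        // e.g. on Vision Pro or when the user declined the permission.
        if (faceExpressions != null && faceExpressions.ValidExpressions)
        {
            // Read one blendshape weight (0..1) via the indexer,
            // e.g. to drive an avatar's jaw.
            float jawDrop =
                faceExpressions[OVRFaceExpressions.FaceExpression.JawDrop];
            // ...apply jawDrop to a SkinnedMeshRenderer blendshape here.
        }
    }
}
```

Is this kind of side-by-side composition supported, or do the two SDKs fight over the XR loader/session?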

What's the right way to integrate Meta-specific SDKs on top of a common ARFoundation core?
