Forum Discussion
lou_d
2 years ago · Honored Guest
How to mix usage of ARFoundation and Meta XR SDKs?
I'm developing a new app that needs to run on Apple Vision Pro, Meta Quest Pro, and Meta Quest 3.
I've got the application running successfully on each of the platforms using ARFoundation APIs and the Meta OpenXR SDK.
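For reference, the shared cross-platform layer is just plain ARFoundation, driven by the Meta OpenXR provider (`com.unity.xr.meta-openxr`) on Quest. A simplified sketch of that core (the component names are the real ARFoundation 5.x ones, but the script itself is only illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative only: the shared cross-platform layer is standard ARFoundation,
// backed by the Meta OpenXR provider on Quest devices.
public class SharedARCore : MonoBehaviour
{
    [SerializeField] ARSession session;           // standard ARFoundation session
    [SerializeField] ARPlaneManager planeManager; // plane / scene understanding

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // The same callback runs on Quest and, in principle, other OpenXR targets.
        Debug.Log($"Planes added: {args.added.Count}, session state: {ARSession.state}");
    }
}
```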
Now I'm looking to incorporate Quest Pro's eye and face tracking capabilities. Is it possible to do that without switching over the existing features (head tracking, hand tracking, interactions) from OpenXR to the `com.meta.xr.sdk` equivalents?
I'm particularly confused because the official Movement sample repo says "Unity-Movement is a package that uses OpenXR’s tracking layer APIs to expose Meta Quest Pro’s Body Tracking (BT), Eye Tracking (ET), and Face Tracking (FT) capabilities.", which sounds like I should be able to get the tracking data via OpenXR APIs. In practice, though, it seems I can't, and instead have to use the `com.meta.xr.sdk.interaction` APIs.
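Concretely, what I'd hoped would work is the plain OpenXR eye-gaze route: enable the Eye Gaze Interaction Profile under the OpenXR feature settings and read the gaze pose through the Input System, roughly like the untested sketch below. I don't know whether the Quest Pro runtime actually exposes that extension alongside the Meta OpenXR feature group, and as far as I can tell it would only give a gaze pose anyway, not the per-expression face weights the Movement SDK provides.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Untested sketch: reading the OpenXR eye-gaze pose via the Input System,
// assuming the "Eye Gaze Interaction Profile" feature is enabled in
// Project Settings > XR Plug-in Management > OpenXR.
public class EyeGazeReader : MonoBehaviour
{
    InputAction gazePosition;
    InputAction gazeRotation;

    void OnEnable()
    {
        gazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazePosition.Enable();
        gazeRotation.Enable();
    }

    void OnDisable()
    {
        gazePosition.Disable();
        gazeRotation.Disable();
    }

    void Update()
    {
        var pos = gazePosition.ReadValue<Vector3>();
        var rot = gazeRotation.ReadValue<Quaternion>();
        Debug.Log($"Gaze pose: {pos} {rot}");
    }
}
```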
What's the right way to integrate Meta-specific SDKs on top of a common ARFoundation core?
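If the answer is that the expression data is only surfaced through the Meta packages, I assume the mix would end up looking something like this, with an OVR component queried for face-tracking weights while ARFoundation keeps driving everything else. This is an untested sketch based on `OVRFaceExpressions` from the Meta XR Core SDK, and I'm not sure how cleanly it coexists with the ARFoundation rig:

```csharp
using UnityEngine;

// Hypothetical mix: ARFoundation keeps driving the session/camera, while
// OVRFaceExpressions (Meta XR Core SDK) is queried only for Quest Pro
// face-tracking weights. Untested.
public class QuestProFaceTracking : MonoBehaviour
{
    [SerializeField] OVRFaceExpressions faceExpressions;

    void Update()
    {
        if (faceExpressions == null || !faceExpressions.FaceTrackingEnabled)
            return;

        if (faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.JawDrop, out float jawDrop))
        {
            // Drive a blendshape, avatar rig, etc. from the weight.
            Debug.Log($"Jaw drop weight: {jawDrop}");
        }
    }
}
```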
1 Reply
Replies have been turned off for this discussion
- lou_d (Honored Guest)
Cross-posted to the Unity forums here https://forum.unity.com/threads/how-to-mix-usage-of-arfoundation-and-meta-xr-sdks.1596330/