
Question about Meta Quest 3 SDK data usage

Daxter
Explorer

Hello guys!

Before anything, let me give you some context: I'm currently working on my master's thesis, which involves using the Meta Quest 3 headset together with some external tools.

In this thesis I'm researching the possibility of including a VR headset in a SLAM task. Basically, this means using the headset's hardware/software to recreate a real room as a virtual map. In addition, since I'm in the robotics field, I will later use that information to localize a robot within the room.

To be clearer, an approximation of what I'm trying to explain is what cleaning robots (Roombas) usually do to localize themselves in a house: they first "scan" the house using SLAM with the help of some cameras and sensors.

The next image is an example of what a final virtual map would look like as a result of using SLAM.

 

[Image: maxresdefault.jpg — example of a SLAM-generated map]

In this image, a tool called ROS has been used to help achieve the task. To summarize, ROS is something like an "environment" for helping robot developers with their robot software. It also allows them to reuse code written by other people, making it easier to implement hard tasks like SLAM.

Basically, I have been trying to stream some data locally out of the headset in order to use it in ROS. From what I have been researching, I'm aware that data privacy on the headset is enforced very strictly, so that's why I'm reaching out to you guys.
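For context, the ROS end of the pipeline is not the hard part: packages like the Unity Robotics ROS-TCP-Connector already bridge Unity and ROS. Here is a minimal sketch of that side, assuming that package (the topic name is hypothetical); the missing piece is the sensor data itself.

```csharp
// Hedged sketch of the ROS side, assuming the Unity Robotics
// ROS-TCP-Connector package; "/quest/head_pose" is an illustrative
// topic name, not anything defined by the SDK.
using RosMessageTypes.Geometry;
using Unity.Robotics.ROSTCPConnector;
using UnityEngine;

public class HeadsetPosePublisher : MonoBehaviour
{
    private const string Topic = "/quest/head_pose"; // hypothetical topic
    private ROSConnection ros;

    private void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<PoseStampedMsg>(Topic);
    }

    private void Update()
    {
        // NOTE: Unity is left-handed Y-up, ROS is right-handed Z-up;
        // a real pipeline must convert frames (e.g. with the package's
        // ROSGeometry helpers). Omitted here for brevity.
        Transform head = Camera.main.transform;
        var msg = new PoseStampedMsg();
        msg.header.frame_id = "map";
        msg.pose.position = new PointMsg(head.position.x, head.position.y, head.position.z);
        msg.pose.orientation = new QuaternionMsg(
            head.rotation.x, head.rotation.y, head.rotation.z, head.rotation.w);
        ros.Publish(Topic, msg);
    }
}
```

On the ROS side, the matching ROS-TCP-Endpoint node has to be running for these messages to arrive.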

I have been doing my tests in Unity, and I haven't managed to find any functionality in the SDK that gives me any kind of access to data from the lidar-like depth sensor or to the camera's depth output.

So, do you guys know if what I'm trying to achieve is possible? Or should I reconsider using the headset?

Thank you very much for your time and attention.

5 REPLIES

jtriveri
Expert Protege

You can get depth using the Depth API.

I know you can get these depth frames on PC too through Oculus Link. 

Here are the (Unity) docs for it.

https://developer.oculus.com/documentation/unity/unity-depthapi/ 

I’m not sure where the native OpenXR docs are. 
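For a quick sanity check in Unity, here is a minimal sketch assuming the Meta XR Core SDK's EnvironmentDepthManager component and the _EnvironmentDepthTexture shader global described in those docs; both names may differ across SDK versions, so verify against your installed package.

```csharp
// Hedged sketch, assuming the Meta XR Core SDK's EnvironmentDepthManager
// (namespace Meta.XR.EnvironmentDepth) and the _EnvironmentDepthTexture
// shader global it publishes; check both against your SDK version.
using Meta.XR.EnvironmentDepth;
using UnityEngine;

public class DepthProbe : MonoBehaviour
{
    [SerializeField] private EnvironmentDepthManager depthManager;

    private void Start()
    {
        // Environment depth is only available on supported devices (Quest 3).
        if (!EnvironmentDepthManager.IsSupported)
        {
            Debug.LogWarning("Environment depth is not supported here.");
            return;
        }
        depthManager.enabled = true; // starts publishing the depth texture
    }

    private void Update()
    {
        // The SDK exposes depth to shaders as a global array texture
        // (one slice per eye); it is not handed over as a CPU-side buffer.
        var depthTex = Shader.GetGlobalTexture("_EnvironmentDepthTexture");
        if (depthTex != null)
            Debug.Log($"Depth texture on GPU: {depthTex.width}x{depthTex.height}");
    }
}
```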

Thanks for your reply, I will try this.

Did you find a solution to this? I'm looking to get the depth data using native OpenXR.

Hi, this is what I have found related to exporting the room mesh scan:
This app made with WebXR: https://jasonharron.github.io/

This demo using the Scene API for Unity: https://github.com/oculus-samples/Unity-Phanto

I haven't really looked into the native OpenXR Scene API, but here are the docs for it: https://developer.oculus.com/documentation/native/android/mobile-scene-api-ref/

Hope this helps!
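For a rough idea of the Unity route, here is a hedged sketch assuming the OVRSceneManager workflow from the Meta XR SDK (load the Space Setup scene model, then walk the resulting anchors for their meshes); component and event names may vary by SDK version.

```csharp
// Hedged sketch, assuming the OVRSceneManager workflow from the Meta XR
// SDK: load the scene model captured during Space Setup, then walk the
// resulting anchors and read out any mesh geometry for export.
using UnityEngine;

public class SceneMeshDump : MonoBehaviour
{
    [SerializeField] private OVRSceneManager sceneManager;

    private void Start()
    {
        // Requires the spatial-data permission (com.oculus.permission.USE_SCENE)
        // to be granted on the headset.
        sceneManager.SceneModelLoaded += OnSceneModelLoaded;
        sceneManager.LoadSceneModel();
    }

    private void OnSceneModelLoaded()
    {
        // Walls, floor, furniture and the global room mesh each end up
        // as an OVRSceneAnchor, possibly with a MeshFilter attached.
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            var mf = anchor.GetComponent<MeshFilter>();
            if (mf != null && mf.sharedMesh != null)
                Debug.Log($"{anchor.name}: {mf.sharedMesh.vertexCount} vertices");
        }
    }
}
```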


I've looked at them; however, they don't quite fit my needs, nor what you originally suggested, which was streaming depth sensor data. Using the Depth API, either for Unity or native, ReadPixels() should copy the framebuffer data from the GPU to the CPU, so in principle it should be possible to obtain the depth data stored there; however, only a black (native) or grey (Unity) image is retrieved. As you commented, I suspect it's something related to privacy and that Meta doesn't allow developers to access the sensors directly.
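One way to narrow down where it fails is to skip ReadPixels() and request an asynchronous GPU readback of the depth texture directly. A minimal sketch, assuming the texture is still published under the _EnvironmentDepthTexture shader global:

```csharp
// Hedged sketch: instead of ReadPixels() (which reads the currently
// active RenderTexture and can come back blank for array/protected
// textures), request an async GPU readback of the depth texture with
// a conversion to RFloat. Whether real values come back depends on
// whether the platform allows this texture to be read back at all.
using UnityEngine;
using UnityEngine.Rendering;

public class DepthReadbackTest : MonoBehaviour
{
    private bool requested;

    private void Update()
    {
        if (requested) return;
        var depthTex = Shader.GetGlobalTexture("_EnvironmentDepthTexture");
        if (depthTex == null) return;

        requested = true;
        AsyncGPUReadback.Request(depthTex, 0, TextureFormat.RFloat, OnReadback);
    }

    private void OnReadback(AsyncGPUReadbackRequest req)
    {
        if (req.hasError)
        {
            Debug.LogWarning("GPU readback failed (texture may not be readable).");
            return;
        }
        var data = req.GetData<float>(); // layer 0 = left eye, if it's an array
        Debug.Log($"First depth values: {data[0]}, {data[1]}, {data[2]}");
    }
}
```

If the request itself errors out, the texture simply isn't CPU-readable on the platform, which would point to a deliberate restriction rather than a usage mistake.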

If anyone has further information on this topic it would be much appreciated.