Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Daxter
Explorer
1 year ago

Question about Meta Quest 3 SDK data usage

Hello guys!

First, let me give you some context: I'm currently working on my master's thesis, which involves using the Meta Quest 3 headset together with some external tools.

In this thesis I'm researching the possibility of using a VR headset in a SLAM task. Basically, this means using the headset's hardware/software to recreate a real room as a virtual map. In addition, since I work in the robotics field, I will later use that information to localize a robot in the room.

To be clearer, a rough analogy is what cleaning robots (Roombas) do to localize themselves in a house: they first "scan" the house using SLAM with the help of cameras and other sensors.

The image below is an example of what a final virtual map looks like as a result of running SLAM.


In this image, a tool called ROS (Robot Operating System) was used to help achieve the task. To summarize, ROS is something like an "environment" that helps robot developers build their robot software. It also lets them reuse code written by other people, which makes it much easier to implement hard tasks like SLAM.

Basically, I have been trying to stream some data locally out of the headset in order to use it in ROS. From what I have found while searching, I'm aware that data privacy on the headset is really strict, so that's why I'm reaching out to you.
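To make my goal concrete, the kind of bridge I have in mind on the ROS side looks something like the sketch below. This is just a minimal illustration under my own assumptions: the wire format (7 little-endian floats for position + quaternion) and the loopback sender are made up for the demo, they are not part of any Meta SDK; on the real setup the sender would be a Unity script on the headset and the receiver would feed a ROS publisher.

```python
import socket
import struct

# Hypothetical wire format: 7 little-endian float32 values
# (position x, y, z followed by quaternion x, y, z, w).
POSE_FORMAT = "<7f"

def encode_pose(position, rotation):
    """Pack a pose the way a hypothetical Unity-side sender might."""
    return struct.pack(POSE_FORMAT, *position, *rotation)

def decode_pose(data):
    """Unpack the 7 floats back into (position, rotation) tuples."""
    vals = struct.unpack(POSE_FORMAT, data)
    return vals[:3], vals[3:]

def main():
    # Loopback demo: in practice the sender runs on the headset
    # and this receiver runs on the machine hosting ROS.
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("127.0.0.1", 0))          # let the OS pick a free port
    port = recv.getsockname()[1]

    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send.sendto(encode_pose((1.0, 0.5, -2.0), (0.0, 0.0, 0.0, 1.0)),
                ("127.0.0.1", port))

    data, _ = recv.recvfrom(1024)
    pos, rot = decode_pose(data)
    # Here one would fill a geometry_msgs/PoseStamped and publish it.
    print(pos, rot)

if __name__ == "__main__":
    main()
```

Getting headset pose out this way seems feasible; my actual problem (below) is getting depth/sensor data to send in the first place.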

I have been doing my tests in Unity, and I haven't managed to find any functionality in the SDK that gives me access to any kind of data from the depth sensor or the cameras.

So, do you know if what I'm trying to achieve is possible? Or should I rethink using the headset at all?

Thank you very much for your time and attention.

6 Replies