Export space mesh scanned on quest 3?

tedats
Explorer

Hey,

To cut to the chase, I want to scan my entire house, then export the space so I can load it into a 3D modeling program.

This is so I can take that low-poly version of my house interior and build my own 3D model around it (which would be super simple walls, etc.).

I could go around my house measuring and make the model from scratch, but having the scan to trace from would be way too good. I don't have an iPhone to do LiDAR scanning, and even then I wouldn't want the images or a huge, complex file. What the space scan produces is exactly what I need.

I don't know if I'm asking a question or... does anyone have any advice for me? Maybe the community could advise me on how I'd go about making an app that lets us do this? Maybe we can already do this by finding the file? I couldn't find anything that seemed like it would be the file for the space.

 

16 REPLIES

jbartlett777
Honored Guest

I’d like to know this too.

MetaQuestSupport
Community Manager

Hey there @tedats and @jbartlett777, the Meta Quest 3 does not currently support exporting this kind of information for external use. However, you may be able to use developer tools and rig something up yourself if you know how to code. Regardless, this sounds like a great idea for 3D modelers, and we would love to see a request/suggestion made on our Ideas forum!
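For anyone who does try to rig something up: once your own app can get at the geometry at all (for example, through the Scene API in a development build), writing it out in a portable format is the easy part. Below is a minimal sketch in Python of a Wavefront OBJ export; the vertices and triangles values are placeholders for whatever data your app manages to extract, since Meta provides no official export path.

# Minimal sketch: dump a triangle mesh as Wavefront OBJ.
# `vertices` and `triangles` stand in for whatever your app extracts;
# there is no official Quest export path, so how you obtain them is up to you.
def write_obj(path, vertices, triangles):
    """vertices: list of (x, y, z); triangles: list of (i, j, k) 0-based indices."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for i, j, k in triangles:
            f.write(f"f {i + 1} {j + 1} {k + 1}\n")  # OBJ faces are 1-based

# Example: one quad (two triangles) standing in for real scan data.
write_obj(
    "room.obj",
    vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    triangles=[(0, 1, 2), (0, 2, 3)],
)

The resulting room.obj opens directly in Blender or any other modeling tool, which covers the "trace over it" workflow from the original post.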

virtual_frog
Protege

This type of thing is called "photogrammetry" or "structure from motion". Since Meta will not allow access to their spatial data or cameras, it is not possible to do this with their headsets. You can, however, use a regular camera and a third-party computer program to generate a 3D model, which can then be imported into a VR game. Reality Capture is a popular photogrammetry program, and Meshroom is the leading open-source alternative.
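If you go the photogrammetry route, Meshroom can also be run headlessly, which is handy for batch-processing a whole house room by room. Here is a sketch of driving it from Python, assuming the meshroom_batch launcher that ships with Meshroom is on your PATH (flags vary between releases, so check --help for your version):

# Sketch: run Meshroom's default photogrammetry pipeline on a folder of photos.
# Assumes the `meshroom_batch` launcher shipped with Meshroom is on PATH.
import subprocess

subprocess.run(
    [
        "meshroom_batch",
        "--input", "photos_of_room/",   # overlapping, well-lit photos
        "--output", "reconstruction/",  # textured mesh is written here
    ],
    check=True,
)

Expect to feed it dozens of overlapping photos per room, and note the output mesh will be far denser than the Quest's scene scan, so you may want to decimate it before tracing over it.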

ryu3382
Honored Guest

You can scan your room if you have an iPhone with a LiDAR sensor. My question is how to align the physical and virtual worlds together.

You will have to do this manually in your app. Unfortunately Meta does not expose detailed spatial data for us to use. You might be able to use the low-resolution depth model to provide an initial guess, but the user will have to fine-tune the alignment.
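For the manual alignment itself, a standard approach is to have the user mark a few corresponding points, say three or four room corners, in both the physical space (controller positions) and the imported model, then solve for the single best-fit rigid transform with the Kabsch algorithm. A minimal NumPy sketch with illustrative point data:

# Sketch: best-fit rigid transform (Kabsch algorithm) from point pairs,
# e.g. room corners tapped with a controller vs. the same corners in the model.
import numpy as np

def kabsch(src, dst):
    """Return R, t such that dst ~ R @ src + t. src, dst: (N, 3), N >= 3."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Illustrative data: the same four corners expressed in both spaces.
model_pts   = np.array([[0, 0, 0], [4, 0, 0], [4, 0, 3], [0, 2.5, 0]])
headset_pts = np.array([[1, 0, 1], [1, 0, 5], [-2, 0, 5], [1, 2.5, 1]])
R, t = kabsch(model_pts, headset_pts)
print(np.allclose(headset_pts, model_pts @ R.T + t))  # True

With three or more non-collinear pairs this gives a closed-form rotation and translation; the low-resolution depth data mentioned above could seed an initial guess, with this as the fine-tune step.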

Why does Meta not allow developers access to spatial data? That seems like an obvious thing to build with, given the hardware. I can't find an official statement saying "we don't do this because of a really good reason". Could Meta clarify why this feature isn't available, and perhaps what could be done to make it available?

Privacy concerns about having data on the inside of your home. Same reason they don't allow camera access.

But we do this all the time. The app informs you that it will be collecting data, and you can choose to allow or deny access. Meta even asks if you would like to share point cloud data with them to enable things like local multiplayer. When a developer downloads a development package to, say, generate their own point cloud, it doesn't mean the data is automatically shared with Meta.

This data could power all sorts of apps. I agree that a shipped app would need to inform the user what it intends to use the data for and whether it shares it with third parties. But I don't understand the privacy argument while developing an app at all. Could you (or perhaps Meta?) go into more detail? Maybe I am not understanding the argument fully.

I agree. Maybe they just haven't developed the SDK for it yet. They have a lot of new features to build, and this is a very low priority. The low-resolution scene mesh is enough for most MR experiences, I guess.