Forum Discussion
tedats · Explorer · 2 years ago
Export space mesh scanned on quest 3?
Hey,
To cut to the chase: I want to scan my entire house, then export the space so I can load it into a 3D modeling program.
This is so I can take that low poly version of my house interior, and make my own 3D model around it. (Which would be super simple walls, etc)
I could go around my house measuring and make the model from scratch but having the scan to kind of trace from would be way too good. I don't have an iphone to do lidar scanning but even then I wouldn't want the images or a huge complex file. What the space scan does is insanely perfect.
I don't know if I'm asking a question or...does anyone have any advice for me? Maybe the community could advise me how I'd go about making an app that lets us do this? Maybe we can do this already by finding the file? I couldn't find anything that seemed like it would be the file for the space.
16 Replies
- jbartlett777 · Honored Guest
I’d like to know this too.
- MetaStoreHelp · Community Manager
Hey there tedats and jbartlett777, the Meta Quest 3 does not currently support exporting this kind of information for external use. However, you may be able to use developer tools and rig something up if you know how to code it. Regardless, this sounds like a great idea for 3D modelers, and we would love to see a request/suggestion made on our Ideas forums!
- virtual_frog · Protege
This type of thing is called "photogrammetry" or "structure from motion". Since Meta does not allow access to its spatial data or cameras, it is not possible to do this with their headsets. You can use a regular camera and a third-party computer program to generate a 3D model, which can then be imported into a VR game. RealityCapture is a popular photogrammetry program, and Meshroom is the leading open-source alternative.
- butterfly557 · Explorer
Why does Meta not allow developers access to spatial data? That seems like an obvious thing to develop with, given the hardware. I can't find an official statement saying "We don't do this because of a really good reason". Could Meta clarify why this feature isn't available, and perhaps what could be done to make it available?
- virtual_frog · Protege
Privacy concerns about having data on the inside of your home. Same reason they don't allow camera access.
- OCog1616 · Honored Guest
This is not photogrammetry. The Quest 3 uses depth sensors to capture point-cloud data and create a 3D visualization of a space. Photogrammetry takes a series of photographs, overlays them on each other, and meshes them to produce a 3D model.
- ryu3382 · Honored Guest
You can scan your room if you have an iPhone with a lidar sensor; my question is how to align the physical and virtual worlds together.
- virtual_frog · Protege
You will have to do this manually in your app. Unfortunately Meta does not expose detailed spatial data for us to use. You might be able to use the low-resolution depth model to provide an initial guess, but the user will have to fine-tune the alignment.
Actually - the Meta Unity SDK *does* include a way to get a triangle mesh from the Quest 3 room scans, if that's all you need. It's really cool, actually - I tried it for the same reason as OP: to pull it into a modeller (Blender?) and use the triangle mesh as a guide. Unfortunately, though, it only gives triangles - no color or camera scan data. I'm also not sure how you would go about getting your whole HOUSE scanned, but getting just your immediate scene/environment is simple.
There is a script in the Unity Meta SDK that returns a prefab for each scene primitive that has been set up in the Meta Scene setup - each window, door, table, lamp, and even generic volumes or planes. One of the primitives you can get a prefab returned for is called something like '???_TRIMESH' - and if the prefab is set up correctly, it will be returned containing a MeshRenderer and Mesh whose buffers contain a triangle mesh from the scene scan. You can then use a script to pull the vertices and triangle indices from the mesh.
Let me know if this sounds like what you want - I'll be happy to refresh my memory on the exact scripts and functions if you can't find them.
- zach.actually · Honored Guest
Hey Jeff, just stumbled on this thread, and this sounds exactly like what I'm looking for for my own personal project. Any chance you can let me know those scripts? I'd really appreciate it!
- Here's a simple high-level overview, and a link to the Meta documentation where I learned it.
  1. Add an OVRSceneManager script to an empty (or any) game object. This script maps the scene primitives (lamps, windows, walls, floors, room mesh, generic volumes, etc.) to prefabs that you create (attach Meshes and MeshRenderers if you want them visible, or create them with just colliders if you only need them for physics collisions). For meshes: if you keep them a unit cube or unit square (1 meter in size) and set the parent hierarchy correctly, the Meta SDK will scale them when it instantiates them to match the size of the real object/feature. This does not apply to the GLOBAL_MESH - its triangles are inherently scaled in the vertex data. Also, all prefabs will need an OVRSceneAnchor component added.
  2. Add an OVRSceneModelLoader script to the same object. This script will automatically instantiate one of the prefabs you created for each scene primitive/type set up on the device for your current location when the program starts up.
  3. For the two generic primitives (Plane and Volume), you can, but don't have to, define prefabs and drag them onto the script entries. Any generic volume or plane you set up in the room will be represented by these prefabs.
  4. For specific primitives (FLOOR, WINDOW, LAMP, TABLE, GLOBAL_MESH, etc.), create a prefab, pick which primitive type it represents, and add it to the "Prefab Overrides" array. These specific primitives will be replaced in your scene with those prefabs. Note that the GLOBAL_MESH primitive has a few special components (related to meshes and colliders) that you must include in the prefab to have the triangle mesh data instantiated on it by the OVRSceneModelLoader script - these are outlined toward the end of the article linked below, under the heading "Scene Mesh".
  Specifics and more details can be found here:
  Hope that gets ya started. I'm happy to help further if you have more questions.
- skibur · Honored Guest
Currently, there's no API that exposes video frames directly. However, using the Scene Mesh API, you can export the triangles it produces as an exportable mesh from your application.
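Once you've pulled the vertex positions and triangle indices out of the scene mesh (e.g. from the Unity `Mesh` buffers described above) and dumped them to disk, turning them into something Blender can open is a few lines of code. Here's a minimal sketch that writes a Wavefront OBJ file from flat vertex/triangle buffers - the function name and the axis-flip convention are my own, not part of any Meta SDK:

```python
# Minimal sketch: write a Wavefront OBJ from flat vertex/triangle buffers,
# e.g. data dumped from a Unity Mesh (mesh.vertices / mesh.triangles).

def write_obj(path, vertices, triangles):
    """vertices: list of (x, y, z) tuples; triangles: flat index list, 3 per face."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            # Unity is left-handed, most OBJ consumers are right-handed;
            # negating x (plus the winding flip below) converts between them.
            f.write(f"v {-x} {y} {z}\n")
        for i in range(0, len(triangles), 3):
            a, b, c = triangles[i], triangles[i + 1], triangles[i + 2]
            # OBJ indices are 1-based; b and c are swapped to flip the winding
            # so faces stay front-facing after the x negation.
            f.write(f"f {a + 1} {c + 1} {b + 1}\n")

# Example: a single triangle
write_obj("room.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [0, 1, 2])
```

The resulting `room.obj` imports directly into Blender (File > Import > Wavefront). If the mesh comes in mirrored or inside-out, adjust the axis negation and winding to taste.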
- darryn.465 · Explorer
https://jasonharron.github.io/ has a JSON export which you can load in the three.js editor.
That can then be exported to a whole series of other 3D formats, based on a quick perusal of the editor interface.