05-22-2024 07:25 PM
When you open a mixed reality app on the Quest 3, it has you take a scan of your space so the game knows where to place things in the 'real world'. When I saw this happening for the first time I was mesmerized. The room I was in was very quickly translated into untextured polygons, and the mesh only got more accurate the more I looked around. For example, my fan went from a large cylinder to an extremely accurate render as I moved above and around it. I would love to see whether the Quest 3 could support a full photogrammetry app, since its depth sensors could make seriously accurate models in real time. If such an app existed, I personally wouldn't change a thing about the mesh-generation part, but adding a texture-mapping step on top would be so cool. This is just an idea since I don't know how to code, but I'm not above learning if someone could point me in the right direction, because I have no idea where I'd even start with making software for the Quest 3.
05-23-2024 08:29 AM
Meta doesn't give app developers access to the Quest 3's RGB cameras, so a photogrammetry app on the headset itself is a non-starter. Even so, you can get much higher-quality results just using your phone. Niantic's Scaniverse and Epic's RealityScan are both decent phone-scanning apps, though the former is iOS-only. For even better results, take normal photos on your phone and process them later in Epic's RealityCapture, which recently became free for anyone making under $1 million a year.
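If you're curious what those apps are doing under the hood, the core step is structure-from-motion: match features between overlapping photos, recover the relative camera pose, and triangulate 3D points. Below is a rough two-view sketch in Python with OpenCV. To be clear, this isn't how any of those apps are actually implemented, and the image paths and the intrinsics matrix K are placeholders you'd replace with your own; real pipelines run this across hundreds of photos and refine everything with bundle adjustment.

import cv2
import numpy as np

# Placeholder inputs: two overlapping photos and a guessed camera
# intrinsics matrix K (focal length in pixels, principal point near center).
img1 = cv2.imread("photo1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[3000.0, 0.0, 2000.0],
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])

# 1. Detect and describe local features in both images.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# 2. Match descriptors, keeping only clearly-best matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# 3. Estimate the essential matrix with RANSAC, then recover the
#    relative rotation R and translation t between the two cameras.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 4. Triangulate the inlier matches into a sparse 3D point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at origin
P2 = K @ np.hstack([R, t])                         # second camera's pose
inliers = mask.ravel() != 0
pts4d = cv2.triangulatePoints(P1, P2, pts1[inliers].T, pts2[inliers].T)
pts3d = (pts4d[:3] / pts4d[3]).T                   # dehomogenize

print(f"Reconstructed {len(pts3d)} 3D points from {len(good)} matches")

Tools like RealityCapture then layer dense matching, meshing, and texturing on top of this, but the sparse geometry recovery starts out more or less like that.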