06-29-2024 07:56 PM
Hi, I'm fairly new to working with Unity and certainly new to using it for developing VR/MR apps, but I wanted to dip my toe in the water with a VERY simple, almost proof-of-concept app that would run on the Quest 3, and I'm running into a MOUNTAIN of trouble. I tried using Meta AI to help fill in the gaps, but as I'm sure we all know, it's very limited in the help it can give.
The short version is that I want to make an MR app that can look at surfaces like walls and tables and detect a color on them. Say I take a laser pointer and draw a quick line: I want the app to see that red light on the surface and react to it. I've gone through some startup tutorials and have a very basic app that pulls surface data, but the "reading color from the camera" part is proving exceedingly difficult to even get started on.
From what I understand, I'd need to attach a script to the main camera object that grabs an image from the camera in Update(), parses it looking for whatever color I choose, and then stores the location on the surface where it found it (and then draws a line there in MR, or applies a texture). I've been told you can't really pull full raw camera data, because Meta hasn't worked out the kinks yet and considers it a privacy issue even if the entire app is local and nothing is ever sent out to the internet. However, supposedly I should be able to just pull individual camera screenshots on every Update() call to get this done.
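For context, here's roughly the per-frame color scan I had in mind. This is just a sketch: sourceTexture, targetColor, and tolerance are placeholder names, and it assumes the camera image could somehow be read into a readable Texture2D in the first place.

using UnityEngine;

// Sketch only: scan a texture each frame for pixels close to a target color.
// sourceTexture is a placeholder for wherever the camera frame would come from.
public class ColorSpotDetector : MonoBehaviour
{
    public Texture2D sourceTexture;          // assumed to hold the camera frame (must be readable)
    public Color targetColor = Color.red;    // e.g. the red of a laser pointer dot
    public float tolerance = 0.2f;           // how close a pixel must be to count as a match

    void Update()
    {
        if (sourceTexture == null) return;

        Color32[] pixels = sourceTexture.GetPixels32();
        int width = sourceTexture.width;

        for (int i = 0; i < pixels.Length; i++)
        {
            Color p = pixels[i]; // Color32 converts implicitly to Color (0..1 floats)
            float dr = p.r - targetColor.r;
            float dg = p.g - targetColor.g;
            float db = p.b - targetColor.b;

            if (dr * dr + dg * dg + db * db < tolerance * tolerance)
            {
                int x = i % width;
                int y = i / width;
                // Found a matching pixel; this is where I'd raycast through (x, y)
                // onto the detected surface and mark or texture that spot.
                Debug.Log($"Match at pixel ({x}, {y})");
                return; // stop at the first hit to keep the sketch simple
            }
        }
    }
}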
Any ideas from you more experienced developers out there? Did I pick some massively difficult thing to do as a beginner MR app?
06-30-2024 11:55 AM
The application can't see the camera feed at all. The regions of the app that show the real world are, to the app, just blank areas. The app's output and the camera feed are composited together outside the application by Meta's OS, so any capture made inside the app won't include the camera view.
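If you want to see this for yourself, a quick test along these lines (exact ScreenCapture behavior may vary a bit by Unity version and setup) will only ever contain what the app itself rendered; the passthrough regions come back blank:

using System.Collections;
using UnityEngine;

// Rough check: capture what the app itself renders at the end of a frame.
// In a passthrough scene, the areas showing the real world won't contain the
// camera image, because that feed is composited in by the OS after the app
// has finished rendering.
public class InAppCaptureTest : MonoBehaviour
{
    IEnumerator Start()
    {
        // Screen captures are only valid at the end of a frame.
        yield return new WaitForEndOfFrame();

        Texture2D shot = ScreenCapture.CaptureScreenshotAsTexture();

        // Sample the center pixel; in a passthrough region this is just
        // whatever the app drew there, not the real-world view.
        Color center = shot.GetPixel(shot.width / 2, shot.height / 2);
        Debug.Log($"Center pixel of in-app capture: {center}");

        Destroy(shot);
    }
}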
06-30-2024 03:50 PM
Thanks for the info. However, that's really frustrating, and it rules out a lot of potential applications and functionality for MR... Do you know if/when they're planning on fixing this oversight?
06-30-2024 10:59 PM
Not any time soon, that I know of. This is pretty set in stone.
07-02-2024 06:41 PM
Ah, that's really terrible news... it rules out so many amazing experiences that could be crafted, and it gives other headsets a lot of room to beat out the Quest. But thanks for the help.