Forum Discussion
I saw some people posting here about using the camera feed for vision accessibility. Here is an interesting device working on this, the Lumen Glasses: https://www.dotlumen.com/glasses
I know the Lumen Glasses are a dedicated accessibility device and probably have a very good LiDAR or ToF sensor, but hopefully this is another example that shows the importance of access to raw sensor data and nudges Meta to give developers access to the camera feed.
- ginestopo · 2 years ago · Protege
Thanks a lot for the support! Blind users are going to benefit the most from this feature. I hope all these comments are taken into consideration, but unfortunately there seems to be no response from Meta at the moment.
- seeingwithsound · 2 years ago · Protege
Thanks for mentioning the .lumen glasses. I similarly need to process the raw passthrough camera feed for blind accessibility, in order to synthesize a sonic overlay in the form of "soundscapes" as a general sensory substitution technology for totally blind people; see https://www.seeingwithsound.com/android-glasses.htm Several commercially available smart glasses provide pixel-level access to third-party apps like mine, but the Meta Quest 2 and 3 headsets do not, even though that would be great for, for instance, training at home. I can install The vOICe APK on my Quest 2 and it runs, but it cannot connect to the passthrough camera view and therefore gives up after 30 seconds.
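(For anyone curious what "cannot connect to the passthrough camera view" looks like at the API level, here is a minimal sketch of my own, not part of The vOICe or any Meta SDK, that simply probes for camera devices through the standard Android Camera2 API. The assumption, consistent with the behaviour described above, is that on Quest 2/3 the returned list is empty because the passthrough feed is not exposed to third-party apps.)

```kotlin
import android.content.Context
import android.hardware.camera2.CameraManager
import android.util.Log

// Illustrative probe only: logs which camera devices (if any) the standard
// Camera2 API exposes to this app. On most Android smart glasses this lists
// at least one camera; on Quest 2/3 it is expected to come back empty.
fun logAvailableCameras(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val ids = manager.cameraIdList
    if (ids.isEmpty()) {
        // Presumed Quest 2/3 case: no camera IDs visible, so an app like
        // The vOICe never receives frames and eventually gives up.
        Log.w("CameraProbe", "No camera devices visible to this app")
    } else {
        ids.forEach { id -> Log.i("CameraProbe", "Camera available: $id") }
    }
}
```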