Forum Discussion

Jeffrey_z04
Honored Guest
11 months ago

Ray-Ban feature suggestions for visually impaired users

I am a visually impaired user who got a pair of Ray-Ban glasses a month ago. The glasses have been very useful in my everyday life, but there are a few things that could improve them and encourage more visually impaired people to purchase them.

Barcode recognition: It would be cool if we could use Meta AI to scan product barcodes so we could ask questions about the item we are holding, such as the price, nutritional facts, and cooking instructions. A similar feature exists in an app called SeeingAI, but implementing it on the Ray-Bans would help many blind people get more information while they shop, without having to hold an accessibility device in their hands.
This last thing isn’t a feature suggestion but more of an issue that I have with the glasses. I constantly try to use Meta AI to read documents to me, but it always summarizes what’s on the page instead of reading it. I find that I have to change my wording and ask follow-up questions to get it to do what I want. This isn’t a Meta-specific issue, as I have had the same problem with other AIs. For glasses that weren’t designed to aid the visually impaired, they have been very beneficial!

2 Replies

  • My name is Magali Rossi and I’ve been using the glasses since September 2024. When I bought them, all the functions were in English, but I knew the product could still be useful for me even though my English is very poor, since I could ask the artificial intelligence to translate the responses. I have to mention several things about this, because Meta’s artificial intelligence doesn’t work in Argentina, and I have to use a VPN as if I were in the United States. That’s a limitation, since not everyone can afford a VPN, but I’m still making an effort to keep using the glasses.

    In terms of independence, they give me a lot, since WhatsApp video calls help me solve certain situations on the street, for example, at a difficult crossing that I can’t do with my guide dog because of traffic. If there’s no one else around, I can call someone and have them assist me. Also, with the “look and tell me what you see” function, I get quite a bit of information about what’s in front of me, like reading signs, describing shop windows, landscapes, even the colors of clothes, and reading product labels. I’d like it to read expiration dates accurately (there are some errors there), but the glasses are still very useful. Also, when I ask it to summarize a text, I can sometimes figure out which legal papers I need to organize at work. I wish this were available as OCR and in Spanish so I could do it more comfortably. I’d also like it to be able to read money, since handling cash here is sometimes very difficult due to accessibility issues.

    The glasses are also great for keeping my hands free when making phone calls, calling a Be My Eyes volunteer, or sending WhatsApp messages to groups. I really enjoy using them as a sound device, like headphones. I also create a lot of content for social media, since I work in communications, and as a visually impaired person who shares how we carry out our daily activities, they help me a lot. Before, I couldn’t record my walks with my guide dog or film myself cooking, and now I can, which is really important.

    I’d also like the glasses to have a memory function to remember objects like bills or the face of someone we care about, and to be able to read full pages of books, since academically, we sometimes don’t have access to books in digital format, including classic literature. I hope this product continues to improve with universal design, with access to all features in Spanish in Latin American countries and Spain, and with unrestricted AI. The glasses aren’t unaffordable, and for those who can’t buy specialized glasses for blind people, they’re a great option. I also hope integration opens up with other assistive apps, like Oko for traffic lights and Navilens for orientation codes, or even Cash Reader, since using the glasses’ camera would be amazing.

    I hope everything keeps improving and that Meta glasses keep changing the lives of people with visual disabilities.  
    Best regards,  
    Magali Rossi

    • magali.rossi.438769
      Honored Guest

      I hope Meta will listen to the users, in Spanish and in English, who are interested in improving this product.