Forum Discussion

Team.Thunderbear's avatar
30 days ago

A discussion on visual accessibility

I would like to have a discussion on how we can bridge the gap between VR and real life from a visual accessibility standpoint. Based on my experience with the Quest 3 and many games, I believe that the needs of people with visual disabilities are often not taken into consideration when designing experiences. This is especially notable when it comes to menus and other UI elements.

Anticipating the needs of someone whose reality you don't fully understand is very difficult, so my aim here is to provide some insight and perhaps move the needle in the right direction. I know people with major visual impairments represent a relatively small percentage of the population, but they are no less deserving of being able to engage in these experiences. Additionally, making your projects accessible to them would generate some extra revenue for what I think we'll find is minimal effort. Simply a willingness to understand, and to make decisions based on that understanding, can be enough.

First, what are some of the major hurdles that people with visual disabilities need to overcome? While there are many types of vision loss, many can be aided in similar ways. The ability to order prescription lenses or wear your glasses is great and sorts out the majority of people. However, there is a large group for whom corrective lenses offer little to no benefit: cases such as macular degeneration, glaucoma, or various corneal or retinal issues. I myself have a form of macular degeneration and had an unfortunate accident a few years ago that took what was my better eye.

In these cases where corrective lenses are not a viable option, there are a few things that help. One major aid is being able to magnify things, or simply being able to move closer to them. This is where VR (at least currently) differs from something like a monitor. I can get closer to my monitor, or any object; I can hold it right up to my face to see it. In VR, however, this ability seems to be consciously limited. In many cases menus and UI elements will just go white or disappear altogether when you get near them. In other instances they are not static and will move away from you if you try to move towards them. This behavior is completely unnecessary and makes it very difficult for people like myself to navigate menus and UI elements. The removal of this one behavior would be a major step forward in accessibility.

Now, you might ask: if you can't see the menus, how can you play the game? Recognizing text and perceiving the broader world are handled differently by your eyes and your brain. Text recognition is almost entirely performed by your central vision. So for people with conditions like macular degeneration, which affects your central vision, reading can become very difficult while the rest of the experience is still doable.

As you can imagine, not being able to read the menus is a problem. It makes it very difficult to navigate or choose levels, and can make any game or app that requires you to input something completely unusable. While text to speech can aid this in the Quest UI, it does not work in games, and even then using it is not an enjoyable experience. I don't even use it on my PC; I much prefer to zoom in or put my nose up to the screen than fumble around with text to speech.

Having a global zoom feature in the Quest UI would retroactively aid this issue in existing titles. It would be a game-changing feature for many people, but I don't think it should be where we end. There should be an accessibility framework for in-game/app menus and UI elements. Let people grab the menus and bring them up to their face, and/or make them static so someone can just walk up to them. Don't make menus go white or disappear when you get close to them; this is only a hindrance. Make menus that function with text to speech. Allow for larger fonts and different contrasts.
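For developers wondering what "don't hide the UI up close" might look like in code, here is a minimal sketch in TypeScript. Everything in it (the `keepUIReadableUpClose` flag and the `menuOpacity` helper) is hypothetical, not part of any real Meta or engine API; the point is only that the near-distance fade should be skippable via a user-facing accessibility setting.

```typescript
// Hypothetical accessibility option a user could toggle in a settings menu.
interface AccessibilityOptions {
  keepUIReadableUpClose: boolean; // if true, never fade or hide UI near the face
}

// Typical "fade the menu when the head gets close" logic, with an opt-out.
// `distance` is meters from the user's head to the UI panel; returns 0..1 opacity.
function menuOpacity(distance: number, opts: AccessibilityOptions): number {
  const fadeStart = 0.5; // begin fading inside half a meter
  const fadeEnd = 0.2;   // fully invisible at 20 cm (the behavior described above)

  if (opts.keepUIReadableUpClose) {
    return 1; // accessibility: stay fully visible no matter how close
  }
  if (distance >= fadeStart) return 1;
  if (distance <= fadeEnd) return 0;
  return (distance - fadeEnd) / (fadeStart - fadeEnd); // linear fade in between
}
```

With the flag on, someone can bring the panel right up to their face; with it off, other users keep the default fade. One boolean, which is the kind of minimal effort I mean above.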

The easiest way to aid accessibility is simply to make VR more like real life, without adding artificial barriers that only serve to hinder. Let people interact with your world as they would in real life; some of us need to stick our face right beside something to see it. Also, zoom, please, Meta; a global zoom function would help more than I can express. Let me throw on a pair of binoculars or pull out a magnifying glass, it could be life changing.

That brings me to my final point here. The Quest 3 has the potential to change lives. Its mixed reality capabilities are very impressive, and having better accessibility features could be life altering for many people. Looking at a lit display can overcome some visual limitations by simply beaming more photons into your eyes. But now imagine being able to zoom in on things you are working on, change contrasts, and invert colors: you can see your measuring tape, read the food container, work your appliance, help your kid with their homework.

This all can begin with just getting out of your own way as a developer and letting people interact with your UI elements the way they want instead of the way you want. But it could grow into something truly amazing.

I would love to discuss this further and would be happy to provide my own insights and experience. I would also be happy to test or provide feedback for any projects interested in being more accessible or aiming to function as an accessibility aid.

P.S. I've noticed there is no accessibility tag, would it be possible to have one added?

5 Replies

  • steve_40's avatar
    steve_40
    Expert Consultant

    It's a problem with short-sighted Meta design (pun intended). Take the Meta AI Display glasses, for example. The display is on the right eye only and there is no left eye or dual eye version. Maybe they did it that way because about 70% of people have a dominant right eye. So what happens if you are blind or poor-sighted in your right eye?

    • Team.Thunderbear's avatar
      Team.Thunderbear
      Protege

That is a very good point. I suspect they did it to lower cost or for feasibility reasons, but your hypothesis about the side is likely correct. It alienates a portion of the population from being able to use their device and also removes the revenue they would have generated from them. But perhaps the computational power and/or desired battery life to drive two screens wasn't viable in that form factor.

    • steve_40's avatar
      steve_40
      Expert Consultant

      Btw, Meta has assigned your feedback ideas to "Investigating" status (sorry, the image might be hard to read):

      • Team.Thunderbear's avatar
        Team.Thunderbear
        Protege

Thank you for the update, Steve, that seems like a good development. I would be happy to provide them with any sort of input they would be interested in.
