01-17-2024 08:31 AM - edited 01-23-2024 06:08 AM
Hello everyone and especially the people who are in charge of the technical development of the Quest 3,
We're developing a non-commercial application for visually impaired people. The goal is to render the passthrough image in the Quest 3 the way a visually impaired person actually sees the world. With this application, that person can show their employer, and let them experience, what their real visual limitations are. The app is almost done, but there is one big hurdle we can't cross: we can't figure out how to create a visual acuity filter for the Quest 3, one that is adjustable to the degree of impairment of the person in question. We don't have access to the camera software, and we have no way to do this in post-processing. We desperately need some help and advice. This application has a noble purpose, and your help would benefit thousands of visually impaired people around the world!!! Many thanks in advance, Bart.🙏
01-18-2024 12:52 AM
It's not possible to simulate visual acuity on the Quest 3 because, as a developer, you don't have access to the camera data. To simulate a degradation in acuity you would apply a blur filter (e.g. Gaussian) to the user's view of the environment, but no blur filter can work without access to the camera data. So at the moment this isn't possible in AR mode. You can still simulate it in a VR environment, though.
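For the VR-only approach described above, the core of an adjustable acuity filter is a Gaussian blur whose strength scales with the simulated impairment. Here is a minimal, engine-agnostic sketch in Python; the acuity-to-sigma mapping and all function names are illustrative assumptions, not a clinical model or a Meta API. In a real app you would implement the same kernel in a post-processing shader applied to the rendered eye buffers.

```python
import math

def acuity_to_sigma(decimal_acuity, max_sigma=8.0):
    """Map a decimal visual acuity (1.0 = normal, 0.1 ~ 20/200) to a
    Gaussian blur sigma in pixels. This linear mapping is a hypothetical
    illustration only, not a validated optical model."""
    decimal_acuity = max(0.05, min(1.0, decimal_acuity))
    return max_sigma * (1.0 - decimal_acuity)

def gaussian_kernel(sigma):
    """Build a normalized 1D Gaussian kernel with radius = 3 * sigma.
    A 2D blur is done separably: one horizontal pass, one vertical."""
    if sigma <= 0:
        return [1.0]  # no blur: identity kernel
    radius = max(1, int(math.ceil(3 * sigma)))
    kernel = [math.exp(-(x * x) / (2 * sigma * sigma))
              for x in range(-radius, radius + 1)]
    total = sum(kernel)
    return [k / total for k in kernel]

def blur_row(row, kernel):
    """Convolve one image row with the kernel, clamping at the edges."""
    radius = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(row) - 1)
            acc += row[idx] * k
        out.append(acc)
    return out
```

The same idea ports directly to a fragment shader: compute sigma once from the user's acuity setting, then sample the rendered frame with the separable kernel. The adjustability the original post asks for is just exposing `decimal_acuity` as a UI slider.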
01-18-2024 04:13 AM
Thanks for the input, Joits. We hope that it will be possible very soon, because this is a very important feature for visually impaired people, enabling them to show their specific difficulties to others (in AR mode). Hopefully the Meta developers will join this thread and offer us solutions.
Kind regards,
Bart
01-29-2024 04:54 AM
Still no response from the Meta developers' side?