Forum Discussion
dmacswee
Honored Guest · 13 years ago
Helping glasses wearers in software?
Firstly, please forgive my ignorance: I am a good programmer, but I have never really had to work with the rendering/shader end. This was an idea that just flashed through my brain after reading some of the other posts in this section.
If the optical distortion is just a matrix multiplication of two in-world camera objects, would it be possible to compensate for common vision problems purely in software?
I am a glasses wearer, and to sum up what a lot of tests said, one of my eyes sees further ahead than the other.
Long- and short-sightedness, where both eyes see further or closer, are even more common.
I know there will be ophthalmic specialists screaming "there is much more to it than that!" and I'm sure there is.
But imagine that rather than trying to squeeze (and scratch) your glasses under your Rift, you simply type the figures from your optician's report into a standard profile setting, and the software cameras and distortions compensate accordingly.
Often the biggest leaps forward happen when two unrelated fields cross, so is it time for the programmers to talk to the opticians?
What do you think?
10 Replies
- geekmaster · Protege
- RandyWilliams · Honored Guest
I just preordered the Oculus minutes ago. Since I'm almost an optometrist myself (finishing, finally!) and I also have some "interesting" eye conditions, I'm quite interested in the matter. As soon as I get the unit and see how it works exactly, I will try to implement "something" to adapt the Oculus for myopia, hypermetropia, astigmatism, phorias and other fusion problems, if possible. Who knows, maybe it's even possible to use the Oculus for rehabilitation purposes! (And that's how we get carried away...)
My main aim with the oculus is quite different, so I won't complain if someone else does it ;)
Josh.
- RandyWilliams · Honored Guest
Also, geekmaster, that's a really smart way of solving it. Cool shit!
- geekmaster · Protege
"RandyWilliams" wrote:
Also, geekmaster, that's a really smart way of solving it. Cool shit!
The method I posted requires a two-layer LCD, similar to what is used in some 3D LCD computer monitors. The downside is that although the resulting image appears sharp and readable, it is very low contrast. Useful for text, but perhaps not so good for FPS games.
Perhaps more LCD layers could improve image contrast?
It also requires a high pixel density, so it may not work well when spreading the pixels over a large FoV as done in the Rift DK.
- RandyWilliams · Honored Guest
Well, when it comes to pixel density, you don't need such a high count. Note that the method you posted plays a lot with interference phenomena, which gives a higher "virtual" pixel density, if you understand a pixel as a point. Also, you only need that amount of resolution in the central FoV, so you could probably redistribute the pixels optically (i.e. use more actual pixels but project them into a narrower space, and do the opposite in the peripheral FoV).
Anyway, it looks like the Oculus allows better solutions than the one you posted. The fact that the natural position of the eyes is relaxed, looking forward and parallel (or so they say), implies that the optics are projecting the rays to form a virtual image on the retina of the eye. If that's true (if anyone has an Oculus already, please confirm), you could simply adjust that projection so that the virtual image lands somewhere other than the normal retina, thus compensating for the ametropia. Depending on how much of the focus is controlled via software, it may be possible to do what the post suggests.
- MSVision · Honored Guest
"RandyWilliams" wrote:
I just preordered the Oculus minutes ago. Since I'm almost an optometrist myself (finishing, finally!) and I also have some "interesting" eye conditions, I'm quite interested in the matter. As soon as I get the unit and see how it works exactly, I will try to implement "something" to adapt the Oculus for myopia, hypermetropia, astigmatism, phorias and other fusion problems, if possible. Who knows, maybe it's even possible to use the Oculus for rehabilitation purposes! (And that's how we get carried away...)
Josh or Randy, this is exactly what I was wondering about the Rift and other VR devices!
Please excuse my wording; such technical conversations easily give away that I am not a native English speaker, but I'll try to explain my train of thought without using (and misusing or confusing) too many technical terms.
I am unable to experience real 3D, as my eyeballs are not aligned to focus on a common sweet spot at a distance. I was born cross-eyed; a surgery at age 5 corrected that condition, but not to a degree where my brain could figure out on its own how to fuse the two different images into one solid representation of the world. My eyes still diverge, even more so when I get tired. Where a lot of people would experience double vision, I developed a dynamic master/slave setup, where one eye takes the leading role and the other just extends the range of the visible field. Which eye takes the dominant part changes either willfully or simply depending on which eye is best suited for that distance, as they have different diopters.
That used to prevent me from using any stereoscopic image technology like shutter glasses or anaglyph 3D.
Half a year ago I stumbled across "Fixing my gaze" (dot com) by Susan Barry and her TED talk. Against all odds and classical medical opinion, she managed to gain 3D vision by training her eyes to fixate on a shared focus point with the help of Brock strings and other means. The feedback from those exercises helped her brain figure out how to move her eyes towards that common goal and blend the images.
So what I am hoping for is an option to adjust the vectors of the Rift's two virtual cameras to compensate for my biological misalignment of the eyes. By presenting independent images to both eyes at an adapted angle, I hope to break the barrier of habit, overcome the lazy eye and start seeing the three dimensions I am absolutely certain to have moved in all my life. Once the sensation of 3D vision is known to my brain, I would love to tweak the angles again, making them approach the more common setup used as a default for everyone else, and let my brain figure out how to move the eyes accordingly, so that I might finally be able to see 3D even without the help of VR devices.
On Thursday I approached the Oculus booth staff at GamesCom, skipping the waiting lines just to find out whether such tampering was already possible with the SDK. They were great and answered my questions, so please extend a big thank-you on my behalf to all of them.
As it turns out, there is no default configuration for your biological setup in the central driver software for the Rift. There is a bit of tweaking of the physical device on your head, and the rest is up to the individual application you are running. So it might not be possible to hook into the drivers and fool the application into delivering non-default angles for both eyes, but there certainly is a demand for specialised demo environments to help with visual therapy, and to help at least part of the approximately 5% of the population who have a condition that has so far prevented them from seeing 3D. Maybe start off as simply as implementing that Brock string experiment.
So Josh (or Randy), if you are going to explore medical applications, I have a personal interest in wishing you the best of luck. I hope you succeed, and I will keep an eye on this topic.
- tomf · Explorer
You could certainly modify the Tuscany or TinyRoom demos to display this sort of image. Currently we apply a translation to the virtual cameras to model the left and right eyes, but if you wished you could also apply a rotation (outwards, in your case). It's a fairly simple change - probably only a line or two of code.
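As a rough illustration of the translation-plus-rotation idea (a minimal sketch, not the actual Oculus SDK code - `EyeViewAdjust`, `eyeSign` and the angle parameter are all names invented here):

```cpp
#include <cassert>
#include <cmath>

// Sketch: each eye's view transform gets the usual half-IPD translation,
// plus an optional outward yaw rotation about the vertical axis.
struct Mat4 {
    float m[4][4] = {{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}};
};

// eyeSign: -1 for the left eye, +1 for the right; angle in radians.
Mat4 EyeViewAdjust(int eyeSign, float ipdMeters, float outwardYawRad) {
    Mat4 v;
    float a = eyeSign * outwardYawRad;        // rotate each eye outward
    v.m[0][0] =  std::cos(a);  v.m[0][2] = std::sin(a);
    v.m[2][0] = -std::sin(a);  v.m[2][2] = std::cos(a);
    v.m[0][3] = -eyeSign * ipdMeters * 0.5f;  // usual half-IPD translation
    return v;
}
```

With `outwardYawRad` set to zero this reduces to the plain stereo translation; a small nonzero value rotates each camera outward, which is the "line or two" change described above.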
However, this does require access to the source code. It's done in the application itself, or in the SDK code compiled into the application - it is not part of a driver or DLL, so it's not something Oculus can "slide in" underneath an existing app.
- tomf · Explorer
"dmacswee" wrote:
If the optical distortion is just a matrix multiplication of two in-world camera objects, would it be possible to compensate for common vision problems purely in software?
First, distortion is not just a matrix multiply. It's a radial scaling based on a polynomial. But it's still a fairly simple operation.
But that's distortion - which is just a warping of the image. Your eye does have distortion, but your brain compensates - and has been doing so all your life, so you don't notice it. Distortion is where, essentially, the "pixels" in the image are just moved around. It's not too difficult to do the inverse, which is to move them around in the opposite direction.
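A "radial scaling based on a polynomial" of the kind described can be sketched as follows; the coefficient values are illustrative placeholders, not the Rift's actual calibration:

```cpp
#include <cassert>
#include <cmath>

// Sketch: a point at normalised position (x, y) from the lens centre is
// pushed out along its own ray by a scale factor that grows with the
// squared radius. Coefficients here are made up for illustration.
void DistortPoint(float x, float y, float& outX, float& outY) {
    const float k0 = 1.0f, k1 = 0.22f, k2 = 0.24f, k3 = 0.0f;
    float r2 = x * x + y * y;  // squared distance from the lens centre
    float scale = k0 + r2 * (k1 + r2 * (k2 + r2 * k3));
    outX = x * scale;          // purely radial: the "pixel" moves along
    outY = y * scale;          // the same ray; nothing gets blurred
}
```

Note how this only moves samples around; inverting it is just scaling in the opposite direction, which is why distortion is tractable in software while defocus blur is not.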
The problem with long/short-sightedness is focus. When an image is out of focus, one "pixel" gets smudged into its neighbors. It is a very difficult problem to un-smudge pictures - or, what you really want, to pre-smudge them in a way that when they then get smudged by an out-of-focus eye lens, they come out looking correct. Geekmaster's links above are really cool, but not yet practical for most purposes.
Note that long-sighted people do not need glasses to see the Rift. Everything is focused at infinity, so if you can see mountains without glasses, you can use the Rift without glasses. For short-sighted people we have swappable lenses that may help. However, if you have significant astigmatism, then you will still need glasses or contacts, sorry.
- MSVision · Honored Guest
"tomf" wrote:
You could certainly modify the Tuscany or TinyRoom demos to display this sort of image. Currently we apply a translation to the virtual cameras to model the left and right eyes, but if you wished you could also apply a rotation (outwards, in your case). It's a fairly simple change - probably only a line or two of code.
However, this does require access to the source code. It's done in the application itself, or in the SDK code compiled into the application - it is not part of a driver or DLL, so it's not something Oculus can "slide in" underneath an existing app.
Thanks for your reply, tomf, that confirms the information I got from your booth staff.
My condition would only require some kind of advanced option for the X and Y axes on each side (only one is actually needed, but allowing both cameras to be adjusted could help, too). But as a typical chicken-and-egg problem, it would help to confirm that this actually enables someone with my kind of visual disability to see stereoscopic 3D in the first place; that would spark developers' interest in allowing such customisations. I guess any optician/optometrist should be able to provide those aberrations, so once you know your values you could enter them into any application that supports them.
How would you suggest presenting those corrections: lat/long notation, degrees, radians, percentages?
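For illustration only, a per-eye profile along these lines could be as simple as the following (all names are hypothetical; degrees are assumed because that is how an optician would normally quote angular measurements):

```cpp
#include <cassert>

// Hypothetical sketch of a per-eye correction profile an application
// could choose to support; none of these types exist in any real SDK.
struct EyeCorrection {
    float yawDeg   = 0.0f;  // horizontal (X-axis) camera rotation, this eye
    float pitchDeg = 0.0f;  // vertical (Y-axis) camera rotation, this eye
};

struct VisionProfile {
    EyeCorrection leftEye;
    EyeCorrection rightEye;
};
```

An application would read such a profile once and apply the angles to its two virtual cameras before rendering each frame.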
Just out of curiosity:
- Are there any developers working on a project in the vicinity of Cologne who would let me test-drive their creation to see how this works out? I am a technical guy who has worked in IT for about 20 years, so I'm used to NDAs and such; you don't have to be afraid I'd leak info about your baby. But I am certainly not able to develop software at this scale on my own, or willing to spend money on a device I might not be able to use at all.
- Are there any developers working on projects targeted at a vision-therapy audience?
Please let me know.
Thanks in advance
Manuel
- Qosmius · Honored Guest
No, there is no software that can do this. You need a physical object - a lens or glasses - to make your eyes able to focus the light rays on your fundus/fovea and get a perfect image.
Also, long-sighted people should have a lens that is stronger than normal, because it is very tiring for the eyes to accommodate for hours when looking at near objects.