I'm fairly new to the Oculus forums, and I hope I'm posting this in the right place. This is a question about a possible fix for the incorrect perception of distances on the Rift.
Distances in VR look shorter than they actually are. As far as I'm aware, this is an established fact (though it may vary between individuals and headsets). When I use the Rift, things that are only a few meters away can look a little too close, and the effect is more striking with distant objects. A related phenomenon is that things can sometimes look too small (unless they're very close to the user).
This led me to the conclusion that something must be wrong with the geometry of the images fed to each eye. In other words, the images that ultimately fall on the retinas are geometrically different from what they would be in real life. (In more technical terms, there is an incorrect mapping of retinal positions to directions in the environment.)
I experimented with the interpupillary distance (IPD) slider, and realized that when I set the lenses farther apart, distant objects did look more distant. However, the distance of nearby objects didn't seem right anymore, and the wider IPD setting made me feel queasy. I concluded that with a fairly wide IPD setting, things at medium distance looked right, and with an even wider setting, things far away looked right. But things never looked right overall.
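To make that intuition concrete, here is a back-of-the-envelope sketch (just a toy model that ignores the lens optics and every depth cue except vergence, and the numbers are made up by me): if the virtual cameras used for rendering are a different distance apart than the viewer's actual eyes, the vergence cue rescales all perceived distances by roughly the ratio of the two.

```python
import math

def perceived_distance(true_distance_m, rendered_ipd_m, physical_ipd_m):
    """Toy vergence model: the renderer places its two virtual cameras
    rendered_ipd_m apart, so an object at true_distance_m produces a
    vergence half-angle of atan((rendered_ipd_m / 2) / true_distance_m).
    Eyes that are physical_ipd_m apart but converge by that same angle
    localize the object at the distance returned here."""
    half_angle = math.atan((rendered_ipd_m / 2) / true_distance_m)
    return (physical_ipd_m / 2) / math.tan(half_angle)

# Example: cameras rendered 62 mm apart, eyes actually 66 mm apart
print(perceived_distance(10.0, 0.062, 0.066))  # ~10.6 m: looks farther than 10 m
print(perceived_distance(10.0, 0.066, 0.062))  # ~9.4 m: looks closer than 10 m
```

That would explain why fiddling with the IPD setting shifts where things look "right", but it doesn't explain why no single setting fixed everything at once, which is what pushed me toward the next idea.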
Because of these observations, I started to suspect that field of view (FoV) may be the culprit. For those unfamiliar with the term, the FoV is the angular size of an image; zooming, for example, is just a change of field of view. There are two different FoVs that matter in VR, and when there is a discrepancy between them, things look wrong. On one hand, there is the FoV on the software side: the FoV of the images that the computer renders and sends to the headset. On the other hand, there is the FoV on the hardware side: the FoV of the image that actually reaches the eye through the lenses. This hardware FoV varies depending on the distance between the eye and the lens, so individual face shape influences the FoV of the image the user sees. Since the software FoV is always the same, there is often a slight mismatch between the two FoVs that makes your virtual experience geometrically imperfect.
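Here is another small sketch of what I mean by the mismatch (again only an approximation; I'm treating the lens mapping as linear in tangent space, and the numbers are invented): if the renderer assumes one FoV but the lens and eye-relief combination delivers another, every object's angular size gets scaled by roughly the ratio of the half-angle tangents.

```python
import math

def angular_scale(software_fov_deg, hardware_fov_deg):
    """Rough factor by which the optics stretch (>1) or shrink (<1)
    the rendered image, comparing the half-FoV tangent the renderer
    assumed with the half-FoV tangent the lenses actually present."""
    t_sw = math.tan(math.radians(software_fov_deg) / 2)
    t_hw = math.tan(math.radians(hardware_fov_deg) / 2)
    return t_hw / t_sw

# Example: the renderer assumes 90 degrees, but with my eye relief the
# lenses effectively present about 94 degrees
print(angular_scale(90, 94))  # ~1.07: everything subtends ~7% more angle
```

A uniform angular error like that seems like exactly the sort of thing that could make objects look consistently too close or too far, even when everything else about the image is fine.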
I don't know whether the incorrect distance perception in VR is caused solely by this misalignment of fields of view. However, I think it could help a lot if the user could calibrate the software FoV to match their individual FoV on the hardware side. So my question is: does my reasoning seem correct, and would it be feasible to implement such a feature in a future version of the Rift software?
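In case it helps, this is roughly the kind of adjustment I'm imagining, sketched in Python (the function and parameter names are all made up by me; I'm not claiming the Oculus runtime exposes anything like this): the runtime would keep handing applications the usual projection parameters, just rescaled by a per-user factor obtained from a one-time calibration step.

```python
def calibrated_projection_tangents(default_tangents, user_scale):
    """Hypothetical per-user FoV calibration: rescale the left/right/
    up/down projection half-angle tangents before they are handed to
    the application, so the rendered FoV matches the FoV this user
    actually gets through the lenses.  user_scale would come from a
    calibration step (e.g. lining a virtual object up against a real,
    known-size reference)."""
    return {side: tangent * user_scale
            for side, tangent in default_tangents.items()}

# Made-up defaults for a symmetric 90-degree FoV (tan(45 deg) = 1.0)
defaults = {"left": 1.0, "right": 1.0, "up": 1.0, "down": 1.0}
print(calibrated_projection_tangents(defaults, 1.07))
```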