Forum Discussion
sh0v0r
13 years ago · Protege
Horizontal Lens Separation Observations
The #1 thing, IMO, that affects the quality of the experience for many people is the lens separation, and the fact that for the best experience you want a perfect alignment of eye -> lens -> distortion shader centre. Some people have been modifying their kits to allow this, but without also offsetting the centre of the distortion and/or the position of the quad so that everything lines up, it's still not going to be perfect.
The Consumer Kit promises a way of adjusting the lens separation, but this will also require everyone to add support to their games/demos/apps to match the offset. This is something I want to add to Lunar Flight right now, but I have yet to find a way to offset the quad, or a specific formula that accounts for the different needs depending on the separation.
I built a scale model in 3ds Max and rendered the possible offset extents in an orthographic view. It isn't 100% accurate, but it's a very close approximation based on my measurements. The panel dimensions were based on a 1280x800 screen capture: I created a plane with the same dimensions, then scaled it to the correct width of the panel, which I determined to be 17.8 cm (7 inches).
I measured the lenses (height, radius, and position) and centred them on the panel with the default 65 mm separation.
For reference, this Wiki page has a US Army IPD sample range, which runs from 52 mm to 78 mm.
At 55 mm, the setup would need to accommodate clipping problems on the inner edges.

The default: I'm not 100% sure it's 65 mm, as I keep reading different things. I think the lenses are at 65 mm, but the software camera separation is 64 mm.

At 75 mm you wouldn't need to shift the screen-space quad coordinates; instead you would adjust the centre of the distortion shader to line up with the lens. (This is for illustration purposes only; it is not adjusted in this image.)

13 Replies
- Inscothen (Explorer): If the consumer version has adjustable lens separation, it will be in the SDK and user config profile. This will probably be handled automatically in Rift games if that's the case.
If you look into the SDK you will see where lens separation is: LensSeparationDistance = 0.0635f
Right now I don't know if it reads off the control box or is defaulted; I just looked real quick.
- sh0v0r (Protege):
"Inscothen" wrote:
If the consumer version has adjustable lens separation, it will be in the SDK and user config profile. This will probably be handled automatically in Rift games if that's the case.
If you look into the SDK you will see where lens separation is. LensSeparationDistance
Right now I don't know if it reads off the controlbox or is defaulted. I just looked real quick.
Cool! There it is in OVRDevice; I'll have to look into it there and hook it up. I'm using the Unity SDK, just to be clear.
OVR_GetLensSeparationDistance(ref LensSeparationDistance);
It looks like it retrieves the initial value from the external plugin and OVRCameraController uses it.
// Get the values for both IPD and lens distortion correction shift. We don't normally
// need to set the PhysicalLensOffset once it's been set here.
OVRDevice.CalculatePhysicalLensOffsets(ref LensOffsetLeft, ref LensOffsetRight);
There also appears to be several values that are not being used yet but are initialised, I couldn't find any reference to them.
OVR_GetEyeToScreenDistance(ref EyeToScreenDistance);
OVR_GetEyeOffset (ref LeftEyeOffset, ref RightEyeOffset);
There will no doubt be an official solution in future SDK updates. I'm hoping the headset has something like a dial that is mechanically hooked up to the lenses with some chunky, durable cogs to set the IPD, and everything is calculated off that.
- Inscothen (Explorer): EyeToScreenDistance is in the Oculus SDK.
// Distance from the eye to screen surface, in meters.
// Useful for calculating FOV and projection.
float EyeToScreenDistance;
// Query physical eye-to-screen distance in meters, which combines screen-to-lens
// and lens-to-eye pupil distances. Modifying this value adjusts FOV.
float GetEyeToScreenDistance() const { return HMD.EyeToScreenDistance; }
void SetEyeToScreenDistance(float esd) { HMD.EyeToScreenDistance = esd; DirtyFlag = true; }
Here you can see it looks at the different Rifts: the 5.6" prototype, the 7" devkit, and the 1080p prototype.
if (Contents & Contents_Distortion)
{
memcpy(hmdInfo->DistortionK, DistortionK, sizeof(float)*4);
}
else
{
if (is7Inch)
{
// 7" screen.
hmdInfo->DistortionK[0] = 1.0f;
hmdInfo->DistortionK[1] = 0.22f;
hmdInfo->DistortionK[2] = 0.24f;
hmdInfo->EyeToScreenDistance = 0.041f;
}
else
{
hmdInfo->DistortionK[0] = 1.0f;
hmdInfo->DistortionK[1] = 0.18f;
hmdInfo->DistortionK[2] = 0.115f;
if (HResolution == 1920)
hmdInfo->EyeToScreenDistance = 0.040f;
else
hmdInfo->EyeToScreenDistance = 0.0387f;
}
}
- tomf (Explorer): The Oculus lenses (like most HMD lenses) produce "collimated" light - this is light where all the rays from a specific pixel on the display are parallel to each other - they seem to come from infinity. What this means is that translating the eye does not move the image. This is a very strange phenomenon, but it's easy to see - just display a fixed image on the screen, then move the HMD up and down by a few millimeters (don't rotate it - just translate it). The image will not move in space! It's even easier to see the effect in the Z axis - if you pull the HMD away from your face a few millimeters and move it closer and further - again, the image does not move! (there's distortion around the edges of the lens that does change - that's a different effect)
So what this means in practice is that you DON'T need to move the physical lenses to match your eye separation. It would be a nice thing to have because the image gets sharper at the very center of the lens, but you don't need to do it to get the right image. Also, you should NOT move the images on the display either. The images on the display, and the middle of the distortion applied to generate them, should always be aligned to the physical position of the lenses.
The only thing that DOES need to move along with the user's eyes are the VIRTUAL CAMERA positions. The distance between those needs to match the user's IPD. Otherwise the user will feel larger or smaller than they really are, and that can cause dizziness. But the user's IPD does not change anything about the distortion, or where on the display the image is placed.
This is very counter-intuitive - collimated light is pretty freaky at times. The documentation in this area needs to be improved - I'm working on it :-)
- sh0v0r (Protege):
"tomf" wrote:
The Oculus lenses (like most HMD lenses) produce "collimated" light - this is light where all the rays from a specific pixel on the display are parallel to each other - they seem to come from infinity. What this means is that translating the eye does not move the image. This is a very strange phenomenon, but it's easy to see - just display a fixed image on the screen, then move the HMD up and down by a few millimeters (don't rotate it - just translate it). The image will not move in space! It's even easier to see the effect in the Z axis - if you pull the HMD away from your face a few millimeters and move it closer and further - again, the image does not move! (there's distortion around the edges of the lens that does change - that's a different effect)
So what this means in practice is that you DON'T need to move the physical lenses to match your eye separation. It would be a nice thing to have because the image gets sharper at the very center of the lens, but you don't need to do it to get the right image. Also, you should NOT move the images on the display either. The images on the display, and the middle of the distortion applied to generate them, should always be aligned to the physical position of the lenses.
The only thing that DOES need to move along with the user's eyes are the VIRTUAL CAMERA positions. The distance between those needs to match the user's IPD. Otherwise the user will feel larger or smaller than they really are, and that can cause dizziness. But the user's IPD does not change anything about the distortion, or where on the display the image is placed.
This is very counter-intuitive - collimated light is pretty freaky at times. The documentation in this area needs to be improved - I'm working on it :-)
Thanks for the reply, Tom. It's great to have your input! :)
I'm a little confused, though, because what you are saying correlates with what I am pointing out: the ideal situation is to have everything aligned: eye, lens & distortion centre. In fact, people performing the horizontal mod report a much clearer, sharper image as a result. Maybe they are introducing other artefacts because the distortion doesn't match?
If the lens position isn't offset to match the eye position, then won't people with IPDs that fall outside the average be looking through the edge of the lenses, producing a blurry image? Or does the light bend through the lens curvature and create the 'collimated' light, which reaches their pupils the correct way? I have a rough idea of what it is from your description and this wiki. I happen to have a 64 mm IPD, and I've noticed that if I shift the HMD up or down there is a sweet spot where it is clear, but there isn't a lot of room to move here, so I imagine people with a +/- 5 mm difference are not going to have a good experience with it.
I understand that you also want to ensure the camera positions are set to the same IPD so that the individual's perception of scale is accurate, assuming they have a good real-world object reference. I have played with this a bit in my game Lunar Flight, and pushing the value to either extreme has a pronounced effect, making the cockpit feel cavernous or cramped. Despite this, it does not affect image quality, which is the main focus of this thread.
Ultimately, the reason I started this thread is that I read somewhere that horizontal lens control was planned for the Consumer Kit, and I want to add support to my game now by making IPD settings offset the distortion centre and, in the case of really low IPDs, shift the render targets inward to ensure there is enough viewable area for the lens. I'm making the assumption that the overlap would not be an issue.
This is likely premature, because I imagine you guys will add an integrated solution to do this in the SDK, but I thought I could get something up and running and at least get some of the people who are performing the mod to test it.
- jherico (Adventurer):
"tomf" wrote:
So what this means in practice is that you DON'T need to move the physical lenses to match your eye separation. It would be a nice thing to have because the image gets sharper at the very center of the lens, but you don't need to do it to get the right image. Also, you should NOT move the images on the display either. The images on the display, and the middle of the distortion applied to generate them, should always be aligned to the physical position of the lenses.
That's fascinating. I didn't pick that up from your earlier post on the topic of collimated light. I'd been wondering about the lack of any code to use the IPD when calculating the inputs to the distortion shader (I'd been assuming that somehow it was being dealt with in the scene generation code by modifying the projection matrix), but that makes perfect sense.
- bwhill (Honored Guest): Hey, I'm the guy that did the mod. I will soon make available an attachment, printed per IPD, to address the lack of lens separation adjustment in the Dev Kits (as I'm sure this will no doubt be solved in the final product).
Just to give a feel for what this is like in practice: the only things I noticed changing were that the image became sharper (a result of no longer looking through the curved edge of the lenses) and that there was some warping (not that noticeable) at the edge of my vision.
Using the HMD lens separation setting in HL2 made everything settle and feel right: a clear image with no warping artifacts. It was difficult to see what was changing; I think the whole image, together with its distortion effect, shifted accordingly.
- tomf (Explorer):
"sh0v0r" wrote:
Ultimately the reason why I started this thread is that I read somewhere that horizontal lense control was planned for the Consumer Kit and I want to add support to my game now by making IPD settings offset the distortion centre and in the case of really low IPD's shift the render targets inward to ensure there is enough viewable area for the lense. I'm making the assumption that the overlap would not be an issue.
This is likely premature because I imagine you guys would add an integrated solution to do this in the SDK but I thought I could get something up and running and at least get some people who are performing the Mod to test it.
OK, got it. Yes, if people want to build that sort of lens-moving mod themselves, they will need those controls. But note the SDK already has that value in it - "LensSeparation" - change that and everything should Just Work.
The key is to set this to where the lenses are, NOT where the eyes are. This is a mistake we have seen a few times in the past, which is why I'm a bit gun-shy about it. It's very easy for people to give themselves and others bad experiences without really knowing why.
And yes, if/when we make a devkit where the lenses can be moved, the SDK will handle all that stuff for you. There's nothing you really need to do to "be ready" for it; it will all Just Work.
- Anonymous: Going a little off-topic from IPDs: I think moving lenses would be ideal to solve the issue of the tunnel/binocular effect.
In my particular case, the effect is very prominent... I can see the lenses' black borders even with the A cups really, really close to my eyes, to the point that it hurts my nose. The result is that I don't have a great field of view, and the immersion suffers.
Little knobs on top of the Rift would be amazing for that (and it wouldn't even take extreme position changes, just millimetres to the left or right to line up with the centre of the eyes, like the little rod on top of every pair of binoculars).
- bwhill (Honored Guest):
"tomf" wrote:
OK, got it. Yes, if people want to build that sort of lens-moving mode themselves, they will need those controls. But note the SDK already has that value in it - "LensSeparation" - change that and everything should Just Work.
The key is to set this to where the lenses are, NOT where the eyes are. This is a mistake we have seen a few times in the past, which is why I'm a bit gun-shy about it. It's very easy for people to give themselves and others bad experiences without really knowing why.
And yes, if/when we make a devkit where the lenses can be moved, the SDK will handle all that stuff for you. There's nothing you really need to do to "be ready" for it, it will all Just Work.
I see that LensSeparation is a property of the device itself, and not part of a user profile setting. That makes sense while the device's lens separation cannot change (as with an unmodded devkit). Not having it as part of the user profile/preferences would mean that:
- The device itself would need to report its current lens separation automatically as a user changes it (surely it would be overkill to build the device in such a way that it can electronically detect the change?).
- A setting would need to be saved somewhere, in the same way the user profile settings are saved and referenced.
Unless you are indeed going for automatic detection, does it not make sense to have this setting as part of the user profile config, much like the lens cups? (A, B and C are interchangeable and are properties of the device, not a person, yet which one you use is your preference.) I think that if people are expected to take responsibility for adding their IPD, lenses, height, etc., they should also be trusted to add their lens separation setting.