Forum Discussion
pjenness
9 years ago · Rising Star
Center Eye anchor height not matching profile?
Hi
I'm using the OVR controller, basically, with floor height set. I'm noticing the OVRManager does not seem to set the height value correctly.
For example: I use the Oculus setup tool and enter my accurately measured standing eye height (1.64 m). I then run Unity with the OVR controller, and the center eye anchor goes to about 1.8-1.9 m.
This happens with and without using profile data. If I lie in the setup and say 2 m, the height in Unity goes to 2.3 m.
Also, OVRManager.profile.eyeHeight always seems to give me 1.675 no matter what the setup info is. But even so, the anchor doesn't use that value.
Is there a reason why the anchor values in Unity are not obeying the setup settings?
And is there a way to override the anchor's initial pose on start, since it is getting it wrong, other than just manually applying an offset to the top group transform? I presume the UpdateAnchor function is calculating the position incorrectly?
I feel like the eye anchor should respect the profile information.
Is anyone else getting this issue?
Is anyone else getting this issue?
Cheers
-P
17 Replies
- pjenness (Rising Star): Just wondering if anyone else is getting the same, where the centre anchor sits way higher than the eye height entered in the profile (I would expect it to sit fairly close to a freshly set-up height).
If so, what are people doing to correct it? Or are you getting the anchor at the correct height (in which case I guess the problem is on my end)?
Cheers heaps
-P
- vrdaveb (Oculus Staff): Unfortunately, the calibrated eye height is not reported to the app. It is only used to compute the offset from the sensor to the floor. If you want the user to feel like the floor is at the correct level, please set OVRManager.instance.trackingOriginType = TrackingOrigin.FloorLevel. If you need to use eye level tracking (e.g. because the user is seated), you will need to come up with your own way of estimating (or asking for) the user's height.
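The setting suggested above can be applied from a startup script. A minimal sketch, assuming a scene that already contains an OVRManager (e.g. on an OVRCameraRig); note that in the Oculus Utilities for Unity the TrackingOrigin enum is nested inside OVRManager, so the exact qualification may vary by SDK version:

```csharp
using UnityEngine;

// Sketch: switch to floor-level tracking on startup, per the suggestion above.
// Assumes this component lives in a scene that also contains an OVRManager.
public class SetFloorLevelTracking : MonoBehaviour
{
    void Start()
    {
        // With FloorLevel, y = 0 in tracking space corresponds to the
        // calibrated physical floor, so a rig at (0,0,0) puts the
        // virtual floor at the real floor.
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
    }
}
```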
- pjenness (Rising Star): I'm just wondering, then, what Unity uses in the OVRCameraRig (when in floor-level mode) to set the camera height.
Like you mention, since floor-level mode is intended to put it at the correct height, I'd expect the center anchor to be roughly the calibration height, but it is always quite far out.
It is using the profile data somehow (or through the Oculus API), because if I lie and give it a very tall height, the camera is very high in Unity as expected. It just isn't accurate.
What is OVRManager.profile.eyeHeight expected to return? For me it always returns 1.675 no matter what calibration height is used in the Oculus setup.
Thanks!!
-P
- pjenness (Rising Star): Hiya again.
OK, so investigating more, it looks like 1.675 was the latest "average set player height":
https://developer.oculus.com/documentation/game-engines/0.6/concepts/unity-migration/
Midway down the page, under "Upgrade procedure": "If you previously used an OVRPlayerController with Use Player Eye Height checked on its OVRCameraController, then you have two options. You may either (1) rely on the new default player eye-height (which has changed from 1.85 to 1.675); or (2) uncheck Use Profile Data on the converted OVRPlayerController and then manually set the height of the OVRCameraRig to 1.85 by setting its y-position. Note that if you decide to go with (1), then this height should be expected to change when profile customization is added with a later release." I'm not upgrading, but it shows the hard-coded number.
I just don't quite understand, then, how FloorLevel mode expects the headset to be at the right height, if its default settings are not actually using the player profile.
I stand exactly where I calibrated, and I had entered 1.64 as my eye height. But on game launch with a rig at 0,0,0, the camera anchors move to around 1.8-1.9, so they are not actually doing what you mention they should.
vrdaveb said:
Unfortunately, the calibrated eye height is not reported to the app. It is only used to compute the offset from the sensor to the floor. If you want the user to feel like the floor is at the correct level, please set OVRManager.instance.trackingOriginType = TrackingOrigin.FloorLevel. If you need to use eye level tracking (e.g. because the user is seated), you will need to come up with your own way of estimating (or asking for) the user's height.
I would expect OVRManager.profile.eyeHeight to just read the number from the calibration and apply it to the initial center camera height (plus or minus any offset detected by the tracker).
Thanks for the help!
-P
- vrdaveb (Oculus Staff):
> I'm just wondering, then, what Unity uses in the OVRCameraRig (when in floor-level mode) to set the camera height.
OVRCameraRig uses your head pose when the app starts. OVRPlayerController uses a hard-coded average height value because it doesn't have access to the calibrated height (and because the calibrated height may not be correct for the current user).
> What is OVRManager.profile.eyeHeight expected to return? For me it always returns 1.675 no matter what calibration height is used in the Oculus setup.
Yes, that's the expected behavior today.
> I just don't quite understand, then, how FloorLevel mode expects the headset to be at the right height
Floor level tracking means that the virtual floor (y=0) should be very close to the actual floor. If you use floor level tracking and put your OVRCameraRig at 0,0,0, then CenterEyeAnchor.position.y will match the player's actual eye height.
- pjenness (Rising Star): OK, I can understand this, and the reasoning.
vrdaveb said:
> I'm just wondering, then, what Unity uses in the OVRCameraRig (when in floor-level mode) to set the camera height.
OVRCameraRig uses your head pose when the app starts. OVRPlayerController uses a hard-coded average height value because it doesn't have access to the calibrated height (and because the calibrated height may not be correct for the current user).
Hopefully this will eventually be updated to be used with "useProfileData" enabled. If it is the calibrated player, then that is valid data; if it is another player, we could disable that flag and use the average defined here.
> I just don't quite understand, then, how FloorLevel mode expects the headset to be at the right height
Floor level tracking means that the virtual floor (y=0) should be very close to the actual floor. If you use floor level tracking and put your OVRCameraRig at 0,0,0, then CenterEyeAnchor.position.y will match the player's actual eye height.
I think this is the problem: this step is not working.
CenterEyeAnchor.position.y should be roughly 1.64 m, as that is what I measured my eye height to be, and what I went through all the calibration with.
Even if it is not reading that data from the profile, like you say, if the Rift system is calibrated, then the eye position should be detected at about ~1.64 (not 1.80-1.90 as it is doing; 1.9 is taller than my whole head).
Is there a way in Unity to visualise the tracker unit? I wonder if something is not right in Unity. If I can position a cube at the tracker position and then peek under the headset, it should be fairly lined up (as it is in the setup tool). If it is not lined up, then there is some issue with the Unity setup.
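One way to sketch that cube-at-the-tracker idea, assuming the SDK version in use exposes the sensor pose through OVRManager.tracker.GetPose() (older Unity integrations did; the exact signature varies by SDK version, so treat this as a hypothetical):

```csharp
using UnityEngine;

// Sketch: place a small cube at the reported tracker (sensor) pose so you
// can peek under the headset and check alignment, as described above.
// Assumes OVRManager.tracker.GetPose() exists in the SDK version in use.
public class TrackerVisualizer : MonoBehaviour
{
    private Transform marker;

    void Start()
    {
        marker = GameObject.CreatePrimitive(PrimitiveType.Cube).transform;
        marker.localScale = Vector3.one * 0.1f; // 10 cm cube
    }

    void Update()
    {
        // Move the marker to the tracker's pose in tracking space each frame.
        OVRPose pose = OVRManager.tracker.GetPose();
        marker.position = pose.position;
        marker.rotation = pose.orientation;
    }
}
```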
Thanks for your time @vrdaveb
-P - vrdavebOculus StaffTry making a scene with a ground plane at y = 0. Then add a plain Camera at 0,0,0. Add an OVRManager and set the tracking origin to floor level. When you run it, does the ground plane match the floor of your actual room? If not, you may need to re-run sensor setup. The way sensor setup determines the floor height is by subtracting your calibrated height from the height of your head when you press the select button during setup. If you were sitting down or slouching when that happened, it may have concluded that the floor was lower than it actually is.
- pjenness (Rising Star):
vrdaveb said:
Try making a scene with a ground plane at y = 0. Then add a plain Camera at 0,0,0. Add an OVRManager and set the tracking origin to floor level. When you run it, does the ground plane match the floor of your actual room? If not, you may need to re-run sensor setup. The way sensor setup determines the floor height is by subtracting your calibrated height from the height of your head when you press the select button during setup. If you were sitting down or slouching when that happened, it may have concluded that the floor was lower than it actually is.
OK, will do.
Can you also do the same, just to make sure it is working as expected for someone?
So you'd expect that if calibrated with 164 cm in the setup, in Unity with the rig at 0,0,0 the centre anchor would sit at approximately 1.64 (i.e., 1.64 off the floor height)?
I will do this shortly and report back.
Cheers
-P
- pjenness (Rising Star): OK, @vrdaveb
Two CV1s, two PCs; both of them I have re-calibrated carefully and set to 1.64.
I've tried with fresh new scenes, OVR controllers added.
I added some small spheres at Y=1.64 to indicate my eye height.
When I run the app, straight away I get the centre anchor at about 1.55. Oddly, this is now LOWER than what I was getting previously... but it happens on both CV1s. I am noticeably lower than my spheres.
If I swap to EyeLevel and manually set the rig to Y = 1.64, it looks pretty much perfect (at least, my eyes are level with objects).
However, I understand EyeLevel is not ideal to use. What is the downside of the "EyeLevel" setting as opposed to floor level?
If I could extract the 1.64 info from the player profile (it's truly strange that the current calibration is not exposed easily), that would be great.
Otherwise, in my game I could perhaps have the player stand, manually set their desired height, then sit and record the difference, and use "EyeLevel" with a toggle to switch.
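That in-app approach could be sketched as below. This is a hypothetical component (the field and method names are assumptions, not SDK API): ask the player to stand upright on the floor, then capture the center eye's height above the rig root as the eye-height estimate.

```csharp
using UnityEngine;

// Sketch: estimate the player's eye height in-app, since the profile value
// is not exposed. With floor-level tracking, the rig root sits at the floor,
// so the anchor's height above it approximates standing eye height.
public class EyeHeightCalibrator : MonoBehaviour
{
    public Transform rig;             // the OVRCameraRig root
    public Transform centerEyeAnchor; // the rig's CenterEyeAnchor
    public float eyeHeight { get; private set; }

    // Call this (e.g. from a UI button) while the player stands upright.
    public void Capture()
    {
        eyeHeight = centerEyeAnchor.position.y - rig.position.y;
        PlayerPrefs.SetFloat("eyeHeight", eyeHeight); // persist between sessions
    }
}
```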
Cheers
-P
- vrdaveb (Oculus Staff):
> Can you also do the same, just to make sure it is working as expected for someone?
Sure thing. It worked for me. Here is the scene I used.
> What is the downside of the "EyeLevel" setting as opposed to floor level?
If you use eye level, then the virtual floor (y=0) may not line up perfectly with the actual floor. What eye level does is set y=0 to your eyes' height when the camera spawns. Eye level may be the best option for your use case.
> If I could extract the 1.64 info from playerProfile
Partly for privacy reasons, we don't compute or store the player's eye height anywhere after sensor setup. You would have to do this in the app.