Forum Discussion
bcoyle
Adventurer · 9 years ago
Unity and Rift | player position in relation to camera
Hi,
I'm planning to build a 1:1 model of the space I have the Rift set up in, which includes a table. I also want the headset to match the physical space positionally. Is there a way to adjust the player controller so that the headset loads in the proper place relative to where the camera (sensor) is set up? I think I'm looking for something more like a 'player controller' for the camera, so I can position the camera in the Unity scene to match the real one and have the headset/player created relative to that.
- Try something like this:

```csharp
using UnityEngine;
using System.Collections;

public class AlignToTracker : MonoBehaviour
{
    // Desired world-space pose for the tracker (sensor).
    public OVRPose trackerPose = OVRPose.identity;

    void Awake()
    {
        OVRCameraRig rig = GameObject.FindObjectOfType<OVRCameraRig>();
        if (rig != null)
            rig.UpdatedAnchors += OnUpdatedAnchors;
    }

    void OnUpdatedAnchors(OVRCameraRig rig)
    {
        if (!enabled)
            return;

        // Compute the offset that maps the tracker's current pose to the
        // desired pose, and apply it to the tracking space each update.
        OVRPose pose = rig.trackerAnchor.ToOVRPose(true).Inverse();
        pose = trackerPose * pose;
        rig.trackingSpace.FromOVRPose(pose, true);
    }
}
```
12 Replies
- cybereality (Grand Champion)
First, this should be in the Developer section (I'll move it). You can visit the Developer section by clicking the drop-down menu on the top-right and choosing Developer, then click on the right panel to select Unity (or whichever section is most relevant).
To answer your question: you can get the position of the sensor using

```csharp
Vector3 p = OVRManager.tracker.GetPose().position;
```
This gives the sensor position relative to the headset position at the last recenter. So if you assume the recentering was done at a known physical position in your space, you can derive the physical locations of the sensor and the player.
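A rough sketch of that idea, assuming the OVR Utilities API shown above; the `knownRecenterPosition` field is an assumption, meant to be the world-space point matching the physical spot where the headset sat when recentering was last performed:

```csharp
using UnityEngine;

// Sketch only: estimate the sensor's world-space position, assuming
// recentering was performed at a known physical location.
public class SensorLocator : MonoBehaviour
{
    // Assumed: world-space point corresponding to the physical spot
    // where the headset was when Recenter was last called.
    public Vector3 knownRecenterPosition;

    void Update()
    {
        // The sensor pose is reported relative to the last recenter pose.
        Vector3 sensorOffset = OVRManager.tracker.GetPose().position;
        Vector3 sensorWorldPosition = knownRecenterPosition + sensorOffset;
        Debug.Log("Sensor world position: " + sensorWorldPosition);
    }
}
```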
Hope that helps.
- bcoyle (Adventurer)
To assume the recentering was done in a known position, the headset always needs to be placed in a known position for recentering, right?
- cybereality (Grand Champion)
Correct.
- delphinius81 (Rising Star)
During a loading screen, or at another point when the screen is black and you know the user will have the HMD on in roughly the right starting location, you can call recenter, get the relative distance, and calibrate your other camera with that information.
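That loading-screen recenter could be sketched like this; the one-second wait is an arbitrary assumption to let tracking settle, and `UnityEngine.VR.InputTracking` is the namespace from Unity 5.x-era VR support:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.VR;

// Sketch only: recenter while the screen is black, then read the sensor's
// offset from the new origin for calibration. Assumes the user is already
// wearing the headset in the expected starting spot.
public class LoadingRecenter : MonoBehaviour
{
    IEnumerator Start()
    {
        // Give tracking a moment to settle (duration is a guess).
        yield return new WaitForSeconds(1.0f);
        InputTracking.Recenter();

        // After recentering, the sensor pose is relative to the new origin.
        Vector3 sensorOffset = OVRManager.tracker.GetPose().position;
        Debug.Log("Sensor offset from recenter origin: " + sensorOffset);
    }
}
```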
- bcoyle (Adventurer)
This will be a seated experience. Can I tape off a spot on the table where the headset should sit before the software is launched? That would help get an approximate sync between the 3D and real environments. Would I then place the OVRPlayerController directly at the spot where I want the headset to match?
- cybereality (Grand Champion)
You'd want to run the recentering while wearing the headset, as this sets the natural position of the player and the center of the world.
If you are controlling the space (i.e. it's on your machine) you can just do this once by running the sensor calibration and you should be all set (Oculus app -> Settings -> Devices -> Sensor -> Reset Default View in VR).
If this is for a public game, then you can call the recenter function as part of the game setup/tutorial at the start of your game. For example, in Unity:

```csharp
InputTracking.Recenter();
```

Hope that helps.
- bcoyle (Adventurer)
Thanks for your patience, cybereality. I'll give it a shot. But I'm still not sure how the headset/camera recentering will align the virtual space to the physical one.
- delphinius81 (Rising Star)
Are you trying to create a room-scale application where physically moving around the room keeps a close-to-1:1 mapping between the real and virtual worlds? Or are you using the gamepad/keyboard to move around the room? If the former, you don't need the OVRPlayerController, just an OVRCameraRig.
That said, keeping the initial position of the headset consistent at start is one way to do it. You can reliably do your initial calibration from that position.
- bcoyle (Adventurer)
Since the Rift doesn't support room scale at the moment, I'm looking to lock it down to sitting at a desk, and possibly standing, to review geometry. I'll take a look at using the OVRCameraRig, 3D-scanning the sitting area with the Rift in a marked-off spot, and trying to place the prefab in the same location. Thanks!
- vrdaveb (Oculus Staff)
One reference point you have is the sensor pose. If you can get the user to put their sensor at a particular location in the real world, you can adjust OVRCameraRig.trackingSpace so that OVRCameraRig.trackerAnchor ends up at the corresponding world-space pose in the app. Let me know if you need code for this. Basically, you find the offset from the sensor's initial pose to its desired pose and apply that to the tracking-space GameObject.
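A minimal sketch of that offset approach, using only the OVRCameraRig members mentioned in this thread; the `desiredTrackerPose` field is an assumption, meant to be an in-scene transform marking where the real sensor sits in the 1:1 model:

```csharp
using UnityEngine;

// Sketch only: shift the tracking space so the sensor anchor lands at a
// chosen world-space pose. Anchor poses are only valid once the rig has
// updated its anchors, so in practice you might run this a frame or two
// after startup rather than in Start.
public class TrackerAligner : MonoBehaviour
{
    // Assumed: scene transform marking the sensor's place in the model.
    public Transform desiredTrackerPose;

    void Start()
    {
        OVRCameraRig rig = FindObjectOfType<OVRCameraRig>();
        if (rig == null || desiredTrackerPose == null)
            return;

        // Rotate the tracking space so the anchor's orientation matches.
        Quaternion rotationOffset =
            desiredTrackerPose.rotation *
            Quaternion.Inverse(rig.trackerAnchor.rotation);
        rig.trackingSpace.rotation =
            rotationOffset * rig.trackingSpace.rotation;

        // Then translate it so the anchor's position matches (computed
        // after the rotation, since rotating moved the anchor too).
        rig.trackingSpace.position +=
            desiredTrackerPose.position - rig.trackerAnchor.position;
    }
}
```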