Forum Discussion
coljo158 (Explorer)
10 years ago
Stereoscopic 360 rendering. OVR always starting in Mono?
Hi all,
I'm trying to set up spherical panoramas in Unity and have done so monoscopically. Now I want to try the stereoscopic version, which means I need to access each eye camera. I couldn't do that with Unity's native VR support (at least not that I could figure out easily), so I thought I'd try the new Oculus Utilities package and access them via the prefab. That has changed now too, I see (there is only one camera in the prefab, even at runtime). So I went into the OVRCameraRig setup, found the leftEyeCamera and rightEyeCamera, and discovered that their positions are exactly the same at runtime. The eye anchors are being set correctly, I believe, but by default it appears each eye camera is being assigned the "centerEyeCamera" (see image attached). I didn't think I would have to assign them to their respective eyeAnchors manually.
Can anyone help with this or at least encountered this problem?
Thanks!
6 Replies
- coljo158 (Explorer)
- vrdaveb (Oculus Staff)
> The eye anchors are being set correctly I believe, but by default it appears each eye camera is being assigned the "centerEyeCamera"
In a VR-enabled Unity app, all Cameras (those with no target RenderTexture) have their local position and rotation managed by VR tracking. From a script's point of view, the pose is set to follow the center eye. Unfortunately, this behavior is the same whether you use a single Camera to render stereo or two Cameras to render the left and right eyes separately. It would probably be more intuitive if, when you set Target Eye = Left or Right, the Camera's pose were shifted ~3 cm to the left or right to match the actual eye pose.
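If it helps to confirm this at runtime, here is a minimal diagnostic sketch. It assumes the Oculus Utilities OVRCameraRig with its public leftEyeAnchor / rightEyeAnchor / centerEyeAnchor transforms; assign the rig in the Inspector and watch the Console to see whether the anchors are actually offset while any Camera pose follows the center eye.
```csharp
using UnityEngine;

public class EyePoseLogger : MonoBehaviour
{
    public OVRCameraRig rig; // drag the scene's OVRCameraRig here

    void LateUpdate()
    {
        if (rig == null) return;
        Vector3 left = rig.leftEyeAnchor.localPosition;
        Vector3 right = rig.rightEyeAnchor.localPosition;
        // With stereo separation applied, the eye anchors should differ by
        // roughly the IPD (~0.06 m) along the local x axis.
        Debug.LogFormat("leftEyeAnchor: {0}  rightEyeAnchor: {1}  separation: {2:F3} m  centerEyeAnchor: {3}",
                        left, right, Vector3.Distance(left, right),
                        rig.centerEyeAnchor.localPosition);
    }
}
```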
> I'm trying to setup spherical panoramas
This doesn't sound like something you should be using an OVRCameraRig for. If you're rendering 360, you should probably use something like https://www.assetstore.unity3d.com/en/#!/content/38755, https://www.assetstore.unity3d.com/en/#!/content/53264, or https://www.assetstore.unity3d.com/en/#!/content/41297.
- coljo158 (Explorer)
The OVRCameraRig is simply for use in VR. I have a sphere modified to face inwards, with a 360 (monoscopic) image mapped onto it, and it works. I can see it when using Unity's native VR support with the Oculus, and I can even see it when I use the Oculus Utilities prefab.
Now I want to use a stereoscopic 360 image (formatted above/below). For that, we need two spheres, layered so that one renders to the left eye and the other to the right. It is at this point that I realized the Oculus cameras (left and right) seem to just be set to the center camera. What is the point of VR if we aren't even providing a real 3D image to the HMD? If the IPD is 0, it's monoscopic.
Thanks for those references, but they are for generating virtual 360 panoramas. I have a 360 panorama taken in the real world which I want to display in VR (and add virtual content to).
- vrdaveb (Oculus Staff)
In that case, I would recommend making a separate sphere for each eye, parenting it to the left or right eye anchor, and adding a second camera on the same GameObject as the one already in OVRCameraRig. Then make each of the cameras target one eye instead of both, and filter out the opposite sphere using Camera.cullingMask. Finally, write a script that resets the orientation of each sphere to Quaternion.identity each frame so that only the position is tracked.
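A minimal sketch of what that component could look like, assuming one sphere is parented under each of the rig's leftEyeAnchor / rightEyeAnchor and placed on its own layer ("LeftEyeSphere" / "RightEyeSphere" are placeholder layer names you create yourself), with this script attached to each sphere and the matching eye camera assigned in the Inspector:
```csharp
using UnityEngine;

public class EyeSphere : MonoBehaviour
{
    public Camera eyeCamera;                        // camera that should see this sphere
    public StereoTargetEyeMask targetEye;           // Left for the left sphere, Right for the right
    public string hiddenLayer = "RightEyeSphere";   // layer this camera must NOT render

    void Start()
    {
        // Make this camera render only its own eye, and hide the other eye's sphere.
        eyeCamera.stereoTargetEye = targetEye;
        eyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer(hiddenLayer));
    }

    void LateUpdate()
    {
        // The sphere inherits the eye anchor's tracked rotation because it is
        // parented to the anchor; zeroing its world rotation every frame keeps
        // the panorama fixed while the position still follows the eye.
        transform.rotation = Quaternion.identity;
    }
}
```
This keeps each eye looking at the correct half of the above/below panorama while the head rotation is handled by the spherical mapping itself rather than by rotating the spheres.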
- prashantabellad (Explorer)
@vrdaveb
Hi,
Using OVRCameraRig and OVRGazePointer, I am trying to track GameObjects as hotspots on a 360 video. However, only the GameObjects/hotspots that are within the OVRCameraRig's FOV are visible. How can I track GameObjects placed around the full 360 video when the OVRCamera FOV is only 180 degrees?
- vrdaveb (Oculus Staff)
It sounds like you are trying to find the points on the video sphere that the user can currently see. You could start by approximating the visible region as a 90 degree cone centered around the user's viewing direction. To be more exact, you could use Camera.ScreenPointToRay and raycast to find exactly where the corners of the field of view land on the sphere, but that takes more work.
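A rough sketch of the cone test described above (the centerEye reference and the per-hotspot call are assumptions about your scene; OVRCameraRig.centerEyeAnchor is one candidate to assign):
```csharp
using UnityEngine;

public class HotspotVisibility : MonoBehaviour
{
    public Transform centerEye;        // e.g. the rig's centerEyeAnchor
    public float coneHalfAngle = 45f;  // 90 degree cone in total

    // Returns true if the hotspot lies within the viewing cone.
    public bool IsVisible(Transform hotspot)
    {
        Vector3 toHotspot = (hotspot.position - centerEye.position).normalized;
        return Vector3.Angle(centerEye.forward, toHotspot) <= coneHalfAngle;
    }
}
```
For the more exact version, you could replace the angle test with Camera.ScreenPointToRay rays through the screen corners and Physics.Raycast against the video sphere's collider to find the visible patch.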