Forum Discussion
rje · Protege · 12 years ago
Any gotchas with a multiple OVRCameraController setup?
In my current game I need two rendering cameras: one for all the distant scenery (stars, nebulae, etc.) and one for the player's view from a spaceship. In a normal game I'd just use two cameras and be done with it, but for Oculus, should I be using two OVRCameraController objects? If so, are there any settings I need to be aware of or change to handle having multiple controllers active?
20 Replies
- drash (Heroic Explorer): Why not have your stars, nebulae, etc. on the same camera as your player's view? You can move the stars/nebulae a long distance away from your camera every update, making them appear stationary.
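drash's suggestion above can be sketched roughly like this (a minimal illustration; the `BackgroundFollower` class and its field names are mine, not from any SDK):

```csharp
using UnityEngine;

// Hypothetical sketch of drash's suggestion: keep the background root
// (stars, nebulae, etc.) locked to the player camera's position so the
// scenery appears infinitely far away.
public class BackgroundFollower : MonoBehaviour
{
    public Transform playerCamera; // assign the player's camera here

    void LateUpdate()
    {
        // Translate only; leave rotation alone so the background stays
        // visually "stationary" while the player moves through space.
        transform.position = playerCamera.position;
    }
}
```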
- SiggiG (Protege): At least the last time I tried, you cannot have multiple OVRCameraControllers at once; only the first one spawned gets the sensor tracking. That may have changed since then, so it's something you'll need to test.
However, you can always make the secondary cameras normal ones (without the OVR scripts) and control their rotation via scripts that poll the OVR ones (that's how it's set up in SpaceUnity, if you're using that package). Without the OVR scripts it might be tricky to get the IPD etc. right on the secondary cameras, and you'll have to do some rejigging to make sure the left background camera is slaved to the left main camera, and the right background camera to the right main camera.
- rje (Protege): So I set up a small test of this last night. My scene currently looks like:
1st camera: OVRCameraController that only draws my "DeepSpace" layer. This is the stars/nebulae/planets/etc. Basically all the static background stuff.
2nd camera: OVRCameraController that draws everything except the "DeepSpace" layer. This is the player camera that will be moving around. Right now it just renders fog and 'dust' that swirls around as the player flies through space.
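For reference, the layer/clear-flag split described above could be wired up from a script roughly like this (a hedged sketch using plain Unity `Camera` properties; "DeepSpace" is the layer name from this thread, everything else is an assumption):

```csharp
using UnityEngine;

// Sketch: configure the two-camera split described above using
// standard Unity Camera properties.
public class SpaceCameraSetup : MonoBehaviour
{
    public Camera backgroundCamera; // draws only the distant scenery
    public Camera playerCamera;     // draws everything else

    void Start()
    {
        int deepSpace = LayerMask.NameToLayer("DeepSpace");

        // Camera 1: clears the screen and draws only the DeepSpace layer.
        backgroundCamera.clearFlags = CameraClearFlags.SolidColor;
        backgroundCamera.cullingMask = 1 << deepSpace;
        backgroundCamera.depth = 0; // renders first

        // Camera 2: draws on top, clearing only the depth buffer so the
        // background remains visible behind the foreground objects.
        playerCamera.clearFlags = CameraClearFlags.Depth;
        playerCamera.cullingMask = ~(1 << deepSpace);
        playerCamera.depth = 1; // renders after the background camera
    }
}
```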
I wrote a small script to have camera 1 take the rotation matrix from camera 2, so as the player flies around it's always pointing the right way.
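The rotation-sync script described here is essentially a one-liner; a minimal sketch (how the transforms are wired up is my assumption, not rje's actual code):

```csharp
using UnityEngine;

// Sketch: make the background camera rig copy the player camera's
// orientation each frame while keeping its own fixed position.
public class CopyRotation : MonoBehaviour
{
    public Transform source; // the player camera rig

    void LateUpdate()
    {
        // Rotation only -- the background rig stays at the origin, so
        // distant objects never get closer as the player flies around.
        transform.rotation = source.rotation;
    }
}
```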
So far this seems to mostly work, but in looking at a screenshot I took I'm seeing a result that I'm not sure how to explain:
The most noticeable issue in the screenshot was the weird shadow/hole around the planet in the fog layer, which I've fixed (the 2nd camera's clear flags needed to be set to Depth Only).
However, it still seems odd that the 2nd camera draws over a bigger area than the 1st camera. It's almost as if the output of camera 1 is getting extra distortion applied.
Any ideas on what would cause that?
Thanks again,
Ryan
- rje (Protege): To close out this topic:
Multiple OVRCameraControllers rendering in passes will apply lens distortion multiple times. You can't simply remove or disable the lens correction component, because the built-in scripts rely on its existence. So the solution is to either:
1. Modify your copy of the SDK so that OVRCameraController and related classes only call lens correction code when the component exists on the object.
2. Create a custom class that does the required bits of the OVR classes (setting the correct distance for the cameras, etc) and just draws to the left and right halves, knowing that the lens correction will eventually be applied by a proper OVRCamera.
I tried #1 as a quick hack and it works, but you'll have to port your changes to future SDK versions. I'm working on #2 and would be happy to share it if anyone else would find it useful.
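For option 1, the kind of change involved is just a guard before the distortion pass; something like the following (illustrative only -- the actual OVR SDK code differs, and the `OVRLensCorrection` usage here is an assumption about the old Unity integration):

```csharp
// Illustrative sketch of the option-1 guard: only run lens correction
// when the component is actually present and enabled on this object.
OVRLensCorrection lensCorrection = GetComponent<OVRLensCorrection>();
if (lensCorrection != null && lensCorrection.enabled)
{
    // ... existing lens-correction rendering path ...
}
// else: pass the camera output through untouched; a later OVRCamera in
// the stack applies the distortion exactly once.
```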
Finally, if you want to see what kind of difference it makes, here are two videos I've created. In the first video (before my changes) you can see that the starfield background doesn't draw all the way to the edge of the player view, but it does in the second video. http://www.youtube.com/playlist?list=PL ... wbrMGXqus9
- cybereality (Grand Champion): I don't have a good answer to this right now, but it's something we can look to improve in the future.
- rjeProtegeI'm in the midst of finishing up a solution (it's a separate prefab and scripts so it doesn't require modifying the sdk at all). The only official change that would be helpful in a future SDK would be exposing values on OVRCameraController for the camera Depth, Culling Mask, and Clear Flags, and then have OVRCameraController set those values on its child Camera objects, much like it does with the clipping plane and background colors.
I'll post an update to this thread tonight or tomorrow with a link to the scripts and a unitypackage.
- petergiokaris (Protege): Thank you for the input, rje!
If adding support in OVRCameraController to set camera Depth, Culling Mask, and Clear Flags allows multiple cameras to work properly, then it will be added. Could you send an overview of how these fields would be set in order to allow for the multiple camera scenario? I am unsure what rendering the scene with 2 OVRCameraControllers will allow one to do over using just 1.
I could also add a field to turn off lens warping, in case one doesn't need to use it.
By the way, there is a field within OVRDevice.cs that can be used to get the scene to render all the way to the edge of the screen:
DistortionFitScale
Currently it is set to 0.7, which shrinks the rendered area (the DK1, being a 7" screen, does not require rendering all the way to the edges). If it is set to 1, rendering will go all the way to the edge of the screen.
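In code, that adjustment is a one-liner (assuming DistortionFitScale is exposed as a settable field on OVRDevice, as described above):

```csharp
// 0.7 is the default: the rendered area is shrunk, leaving a margin at
// the edges on DK1. Set to 1.0 to render to the very edge of the screen.
OVRDevice.DistortionFitScale = 1.0f;
```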
Let's continue this discussion, I want to better understand the need for 2 OVRCameraControllers rendering at the same time :)
Best,
Peter
- rje (Protege): Hi Peter,
I got bogged down working on my game -- I'm trying to get it done within the month of April so I don't have a lot of time. :)
Taking notes as I go, will try to post my code and notes here soon. The tl;dr on "why" I want multiple cameras: in my space sim I use one camera to render the distant objects: skybox, nebulae, planets, moons, rings, etc. Then I use a second camera to render the player view on top of that - cockpit, enemies, asteroids, etc. The first camera I mentioned is always fixed at the origin and rotates to match the player's look direction, and the second camera is mounted inside the cockpit and translates/rotates as necessary.
Edit:
I forgot to mention it, but the reason for exposing depth, clear flags, and culling mask: the distant stuff is all on a specific layer that the first camera draws after clearing the screen. The second camera is then set to Depth Only for its clear flags, and draws all of the layers except the distant-objects layer.
- petergiokaris (Protege): Ahh OK, makes sense. So having those three fields exposed, plus turning lens distortion on/off for a given OVRCameraController, might be the best overall solution. Keep in mind that camera depth is used for camera rendering order; CameraRight has a depth of 0, which is used to sample the Rift tracker, while CameraLeft simply uses this value. This synchronizes the rendering for left and right, so overriding their depth may be problematic.
The latest version of the SDK encapsulates the quaternion sampled by CameraRight within OVRCameraController (removing static variables from the codebase to make it more modular), so this may become a bit of an issue; I might have to make this quaternion static again to be shared by multiple cameras for your usage to work.
Have you tried attaching the ship transform to the 'Follow Orientation' field within OVRCameraController and only using one camera? That may also work for you; it's how I would recommend handling a cockpit that moves the camera with it: you can move the ship around in space with the orientation of the cameras locked in a forward position, while still allowing the player to look around the cockpit.
- rje (Protege): Yeah, I saw the code that checks for depth == 0. I made sure that the player camera controller uses depths 0 and 1 (for right and left, respectively). I'm fine with ensuring the left camera always has a depth of (right camera depth + 1).
The issue with only using one camera is that I don't want the distant objects to move with respect to the player; they're "distant" in the sense that the player can never reach them (that's why the distance camera only rotates). I suppose I could update all the distant objects each frame to maintain a consistent distance from the player, although I'd likely also need to update the shaders so the distant objects no longer write to the z-buffer.
I believe I'm already using 'Follow Orientation' on the cockpit view camera, but I'll have to double-check tonight.
Edit: If you haven't seen it yet, here's a video from last week of my project. New video coming in a day or two with ship combat. :D