Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
drash
Heroic Explorer
12 years ago

Camera Texture Scale carries across Unity scenes

I believe I've isolated a performance problem that dogged me for a few days. Turns out it's not a matter of performance, but rather unexpected behavior in the Oculus Unity integration (applies to both 0.4.1 and 0.4.2 as far as I can tell).

I'm finding that if you have a program that consists of multiple Unity scenes (and one loads the other via LoadLevelAsync, for example), it appears that only the Camera Texture Scale value of the OVRCameraController in the first scene is used for the entire program, ignoring whatever is set on the OVRCameraControllers in the other scene(s). FYI, I'm not using DontDestroyOnLoad anywhere.

So, I had Camera Texture Scale set to 1.0 in my loading scene, which then overrode the Camera Texture Scale of 1.5 in my main scene. That showed up as a big FPS boost, but with reduced visual quality that I didn't really notice until I compared side by side. Only later, when I set my loading scene's Camera Texture Scale to 1.5, did the FPS sink and the visual quality improve. So the upside is: wow, my program looks even better! The downside is, I have 100 fewer FPS than I thought I did.

On further reflection, I have only seen this happen in Unity 4.5.3f3 and 4.5.4; it may not have occurred in 4.5.2, which would explain the mysterious performance increase I saw simply by moving my project from 4.5.2 to 4.5.3f3.

3 Replies

  • Proton
    Honored Guest
    It's probably because of OVRCamera.cs:
    static public RenderTexture[] CameraTexture = new RenderTexture[2];


    Since the CameraTextures are static, when it gets into CreateRenderTexture, it runs into:

        if (CameraTexture[i] != null)
            return;


    So it never re-creates the render textures after a change. I wanted to change them at runtime for render size and anti-aliasing, so what I do is manually destroy them and then re-create them:

        // Release and destroy the old static render textures...
        if (OVRCamera.CameraTexture[0] != null) {
            OVRCamera.CameraTexture[0].Release();
            OVRCamera.CameraTexture[1].Release();
            DestroyImmediate(OVRCamera.CameraTexture[0]);
            DestroyImmediate(OVRCamera.CameraTexture[1]);
        }
        OVRCamera.CameraTexture[0] = null;
        OVRCamera.CameraTexture[1] = null;

        // ...then have each eye camera re-create its texture at the new scale
        foreach (OVRCamera c in instance.ovrCameras) {
            int eyeId = c.RightEye ? 1 : 0;
            c.CreateRenderTexture(eyeId, cameraTextureScale);
            c.camera.targetTexture = OVRCamera.CameraTexture[eyeId];
        }
        ClearBuffers();

        // Hand the new texture pointers back to the native plugin
        OVR_SetTexture(0, OVRCamera.CameraTexture[0].GetNativeTexturePtr(),
                       instance.ovrCameraControllers[0].ScaleRenderTarget);
        OVR_SetTexture(1, OVRCamera.CameraTexture[1].GetNativeTexturePtr(),
                       instance.ovrCameraControllers[0].ScaleRenderTarget);


    You need to clear them too (otherwise, on the Mac, the fresh render buffer shows garbage for a frame):

        static void ClearBuffers() {
            // Clear both eye textures to black so stale memory is never displayed
            RenderTexture lastRenderTexture = RenderTexture.active;
            RenderTexture.active = OVRCamera.CameraTexture[0];
            GL.Clear(true, true, Color.black);
            RenderTexture.active = OVRCamera.CameraTexture[1];
            GL.Clear(true, true, Color.black);
            RenderTexture.active = lastRenderTexture;
        }
  • drash
    Heroic Explorer
    "Proton" wrote:
    Since the CameraTextures are static

    That's a valuable lesson. I didn't realize static variables persisted from one Unity scene to the next. Thank you for that, plus the code snippets -- I was just looking into how to turn the camera texture scale into an option. I owe you one!
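
    [Editor's note] A minimal sketch of the lesson here, not from the original posts: a static field belongs to the class rather than to any component instance, so it survives LoadLevel / LoadLevelAsync even though every scene object is destroyed. `ScaleHolder` is a made-up component name, assuming the Unity 4.x MonoBehaviour API.

        using UnityEngine;

        public class ScaleHolder : MonoBehaviour
        {
            // Static: shared by the class itself, so it keeps its value
            // across scene loads even though every instance is destroyed.
            public static float StaticScale = 1.0f;

            // Instance: destroyed with its scene and reset in the next one.
            public float instanceScale = 1.0f;

            void Start()
            {
                // If a previous scene set StaticScale = 1.5f, this fresh
                // instance in the new scene still sees 1.5f here, while
                // instanceScale is back to its default of 1.0f.
                Debug.Log("static: " + StaticScale + ", instance: " + instanceScale);
            }
        }

    This is exactly why the static CameraTexture array in OVRCamera.cs kept the first scene's render textures alive for the whole program.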
  • This is super helpful. I've been wondering why camera texture scale didn't have a noticeable effect in certain scenes in our project.

    Thank you both for this info!