Forum Discussion

fungus
Honored Guest
12 years ago

Reducing latency by re-using eye textures

I've been tasked with showing a huge database on the Oculus Rift. It doesn't render at 60 fps on any machine I've tried, and the Rift will give half that frame rate (or worse) because everything has to be rendered twice.

I was wondering if it's possible to give the illusion of double the frame rate by rendering each eye with a bit of a border, then presenting that same rendering again (with appropriate remapping) in between the renders of the left and right eyes.

Like this:

1) Copy eye textures to Rift (as-is).
2) Render new Left eye (into a third texture)
3) Copy eye textures to Rift (with remapping due to movement since step (1)).
4) Render Right eye
5) GOTO 1

N.B. The left eye needs double-buffering.

My graphics card can dump an empty view to the Oculus display 1000 times a second, so steps (1) and (3) are only a tiny fraction of the overall rendering time. Doing it twice per update would hardly make a difference to the overall update rate.
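
To make the idea concrete, here's a rough pseudo-C++ sketch of the loop I have in mind. PresentToRift(), RenderScene() and RemapForHeadMotion() are just placeholders for whatever the SDK/driver actually exposes, not real API calls:

int buf = 0;                                  // which left-eye texture is currently on screen
while (running) {
    PresentToRift(leftTex[buf], rightTex);    // (1) present both eyes as-is
    RenderScene(LeftEye, leftTex[1 - buf]);   // (2) new left eye into the spare buffer
    PresentToRift(leftTex[buf], rightTex,
                  RemapForHeadMotion());      // (3) re-present, remapped for head movement since (1)
    RenderScene(RightEye, rightTex);          // (4) new right eye
    buf = 1 - buf;                            // (5) swap the left-eye buffers and repeat
}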

Is it likely to work...?

Has anybody tried this before??

5 Replies

  • stolk
    Honored Guest
    "fungus" wrote:

    ...
    Is it likely to work...?

    Has anybody tried this before??


    Yes, it would be a big help decoupling render fps from tracking+projection fps.

    I think Oculus calls this 'asynchronous timewarp'.
    It will come soon, in a future SDK version.

    This is in the release notes of the SDK:

    Scene Judder - The whole view jitters as you look around, producing a strobing
    back-and-forth effect. This effect is the result of skipping frames (or Vsync)
    on a low-persistence display, it will usually be noticeable on DK2 when frame rate
    falls below 75 FPS. This is often the result of insufficient GPU performance or
    attempting to render too complex of a scene. Optimizing the engine or scene content
    should help.

    We expect the situation to improve in this area as we introduce asynchronous
    timewarp and other optimizations over the next few months.
    If you experience this on DK2 with multiple monitors attached, please try
    disabling one monitor to see if the problem goes away.


    Bram
  • A typical rendering loop would look something like this:

    ovrHmd_BeginFrame(hmd, ++frameIndex);
    for (int i = 0; i < ovrEye_Count; ++i) {
        ovrEyeType eye = hmd->EyeRenderOrder[i];
        eyePoses[eye] = ovrHmd_GetEyePose(hmd, eye);
        // Render scene to texture
    }
    ovrHmd_EndFrame(hmd, eyePoses, textures);


    If you want to render alternating eyes per frame, you could do something like this:

    ovrHmd_BeginFrame(hmd, ++frameIndex);
    ovrEyeType eye = hmd->EyeRenderOrder[frameIndex % 2];
    eyePoses[eye] = ovrHmd_GetEyePose(hmd, eye);
    // Render scene to texture
    ovrHmd_EndFrame(hmd, eyePoses, textures);
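
    One caveat with this (just a sketch of what I mean, against the same 0.4-style API as above, not tested): eyePoses[] and textures[] need to persist across frames, so the eye you didn't re-render this frame is submitted with last frame's pose and texture rather than uninitialised data:

    ovrHmd_BeginFrame(hmd, ++frameIndex);
    // Only the eye being re-rendered gets a fresh pose; the other keeps last frame's.
    ovrEyeType eye = hmd->EyeRenderOrder[frameIndex % 2];
    eyePoses[eye] = ovrHmd_GetEyePose(hmd, eye);
    // Render scene to textures[eye]
    // Both eyes are still submitted; the stale eye gets timewarped from the pose
    // it was originally rendered with, so don't overwrite that pose.
    ovrHmd_EndFrame(hmd, eyePoses, textures);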
  • "elect" wrote:
    The difference wouldn't cause some sickness?


    It depends on the scene, and on the user. Because of time-warp it's not quite as bad as halving the effective framerate. In fact, if everything in the scene is motionless and sufficiently far from you to not show much or any parallax, it's quite effective. Where timewarp falls down is with objects in the scene that are in motion, and with nearer objects that should show some parallax as you move your head, revealing previously hidden areas behind them. These artifacts exist even in the current SDK in an application running at 75 Hz, but they're sufficiently small that they're not problematic for most people in most cases.

    Doing one eye render per frame would essentially double the impact of these artifacts (more really, since you're increasing both the spatial and temporal scale of the artifacts by a factor of two). It will probably increase the incidence of sim-sickness for a given sample of people. But this isn't the same as 'X causes sim-sickness'.
  • fungus
    Honored Guest
    "jherico" wrote:
    Where timewarp falls down is with objects in the scene that are in motion, and with nearer objects that should show some parallax as you move your head, revealing previously hidden areas behind them. These artifacts exist even in the current SDK in an application running at 75 Hz, but they're sufficiently small that they're not problematic for most people in most cases.


    Yep, the only problem will be with movements with a lot of parallax, which aren't the norm.

    Even so, I suspect that doubling the frame rate will look better than perfect rendering in those situations: your eyes move in tiny steps when things pan rapidly past them, so precise detail is lost even in real life.

    "stolk" wrote:

    I think Oculus calls this 'asynchronous timewarp'.


    I haven't read it in great detail, but John Carmack's "timewarp" seems to try to preserve parallax. That's a step up from what I'm going to attempt.

    If I manage to get it working I'll post a demo here. Maybe something where you can press shift to enable/disable it.