Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
jasonrohrer
Honored Guest
11 years ago

RoomTiny gets 37.5 FPS with SDK_RENDER

DK2 here with SDK 4.4.

Lots of demos work fine and smooth at 75 FPS, both Unity and native stuff.


Running in Direct Mode.

When I compile and run WorldDemo (basically a static Tuscany scene), motion is smooth, and the info panel (SPACE to show it) shows a solid 75 FPS.

When I compile and run RoomTiny, however, it's smooth for about 8 seconds, just until the cube makes a full revolution, and then it drops to what looks like 37.5 FPS with noticeable motion stutter. I've seen 37.5 pop up occasionally in other demos (even the Config Utility's Demo Desk), and restarting the app usually clears it up, but for RoomTiny it's like clockwork---it happens every time after exactly the same amount of time.

Looking at the code, I tried changing SDK_RENDER to 0... and that fixes the problem.

So... what is it about the SDK distortion rendering that is slowing things down after 8 seconds in this very simple demo?

Anyone else experiencing this?

9 Replies

  • If I'm not mistaken, this could be related to the issue where the busy waits were waiting too long to sync with the GPU and thus missing the frame update (which drops the rate to half refresh, or 37.5). I think we know what has to be fixed, but it's not a simple thing. If compiled games and your own demos are working, I would not worry about it.
  • Well... I was looking at RoomTiny as a basis for coding up my own engine. But I'm just getting started, so I want to make sure I get off on the right foot (and not 37.5 fps).

    Is best practice, for now, to avoid "ovrHmd_EndFrame" (SDK-based distortion rendering) and to invoke application-side distortion instead? RoomTiny shows both methods.
  • If you can disable time-warp, I believe that should work around the issue.

    I don't remember the exact code that does that, but it should be in the docs somewhere.
  • Okay, another datapoint here.

    This also happens on the Config Utility's Test Scene, but only on the Oculus VR title screen with the Start button.

    It happens after roughly 10 seconds, like clockwork (the same amount of time that triggers it in RoomTiny).

    HOWEVER, in the Test Scene, the problem clears up after pressing Start and viewing the desk.


    So, anyone know how to turn Timewarp off through the C api? Or, even better, how to fix this without turning timewarp off?
  • I haven't experienced it in my own engine, but to turn off timewarp you can do the following:

    unsigned distortionCaps = ovrDistortionCap_Chromatic | ovrDistortionCap_Vignette;
    distortionCaps |= ovrDistortionCap_Overdrive;
    distortionCaps &= ~ovrDistortionCap_TimeWarp;  // <-- clear the timewarp bit

    if (!ovrHmd_ConfigureRendering(pOculusVR.pHmd, &pOculusVR.d3d9cfg.Config,
                                   distortionCaps,
                                   eyeFov, pOculusVR.EyeRenderDesc))
    {
        // handle the configuration failure
        // ...
    }
  • Oh... well, that's easy enough!

    And yes, confirming that this fixes the problem on my end.

    RoomTiny demo:

    Timewarp on: smooth for exactly 10 seconds, then drops down to 37.5 fps with constant, consistent judder every frame.

    Timewarp off: smooth forever.


    Sadly, with Timewarp off, latency on turning your head is more noticeable.

    I've gotten the same results leaving timewarp on but toggling ovrDistortionCap_ProfileNoTimewarpSpinWaits.

    The interesting thing is that if I add a key to toggle this mid-run, I can wait until the 37.5 starts, disable SpinWaits to fix it, and then turn SpinWaits back on, and it stays smooth after that, even past the 10 second mark.


    It also remains unclear why the WorldDemo (tuscany) doesn't have the same problem, but the OculusVR title screen in the Config Util test scene does.

    Hope this helps you guys track this one down.
  • Anonymous
    Timewarp works by waiting until the last moment before VSync to query the latest head position and use it to warp the frame during the distortion pass. I don't know exactly what the thread doing this does to wake itself up (ahem, busy wait...) at the right time, but it's possible that it first measures how long the current hardware takes for the distortion pass and uses that to calculate when it should wake. On very light scenes, however, like the tiny room demo, it's possible that the hardware (CPU or graphics card) switches back to a low-power state due to low utilization, at very reproducible times, and thus consistently misses the VSync.

    Can you monitor all your frequencies during a tiny room run?
  • "jasonrohrer" wrote:
    Is best practice, for now, to avoid "ovrHmd_EndFrame" (SDK-based distortion rendering) and to invoke application-side distortion instead? RoomTiny shows both methods.


    In theory, if you're using timewarp and doing client-side distortion, you should end up with the same effect. I would suggest adding some logic to call ovrHmd_ResetFrameTiming() if the framerate suddenly drops without an apparent trigger caused by increased rendering load.
  • Investigating further: my Nvidia control panel was set to adaptive power management. Switching it to high-performance mode made this problem go away in TinyRoom.

    Also fixed the jump to 37.5 fps in Config Util's demo title screen (but only after exiting config util and restarting it---maybe applications remember the adaptive settings they start with).

    Also, ovrHmd_ResetFrameTiming( HMD, 0 ) worked perfectly to fix the stutter in TinyRoom before I turned off adaptive GPU.

    I also confirmed that increasing the rendering load in TinyRoom (by drawing way more stuff) does in fact cause it to NOT trigger adaptive GPU mode.

    You nailed it.

    Thank you for taking the time to respond with such insight!