Forum Discussion

MartinSchultz
Honored Guest
11 years ago

targetFrameRate not working

In my app I can switch between a camera that renders standard "old school," a camera for stereoscopic rendering, and one camera setup for the Rift (using the prefab from the 0.3.2 SDK). When I set Application.targetFrameRate = 60, the app renders at that speed with the standard or stereoscopic camera, but with the Rift camera setup it drops to exactly 30, as if targetFrameRate had been set to 30. The machine is able to render faster. So I wonder: is there anything in the Rift driver limiting it to 30? I haven't found anything via search. Or is this a driver bug?

I'm on a Mac, by the way, using SDK 0.3.2 preview; I haven't yet tested on a Windows machine. It's the latest Retina MacBook Pro. Quality settings are set to Fantastic.

Anyone else noticed this?

- Martin

2 Replies

  • Are you using vsync? How much is your app utilizing the GPU? You can check in the Unity profiler (Window > Profiler). When vsync is enabled, Unity ignores Application.targetFrameRate. And if rendering takes longer than one refresh interval, vsync will lock you to 30 fps instead of 60.

    When you enable OVRCameraController, it will turn on vsync, even if you've disabled it in your project quality settings. You can prevent this by commenting out the following line:

    // QualitySettings.vSyncCount = 1;
  • Thanks. Yeah, I tried with vsync disabled in the settings, but I didn't think to check whether it might be getting re-enabled. Thanks for pointing that out; I'll give it a try.
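
    The interaction described in the first reply can be sketched as a small Unity C# component (hypothetical script name; assumes Unity's standard QualitySettings and Application APIs, and that nothing like OVRCameraController re-enables vsync afterward):

    ```csharp
    using UnityEngine;

    // Hypothetical helper illustrating the vsync / targetFrameRate interaction.
    public class FrameRateCap : MonoBehaviour
    {
        void Start()
        {
            // Application.targetFrameRate is only honored when vsync is off;
            // with vSyncCount >= 1, Unity ignores it and syncs to the display.
            QualitySettings.vSyncCount = 0;
            Application.targetFrameRate = 60;
        }
    }
    ```

    Note that if another script (such as the Rift camera prefab mentioned above) later sets QualitySettings.vSyncCount back to 1, the target frame rate is ignored again, which matches the behavior reported in the original post.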