Fulby
9 years ago · Heroic Explorer
Automatically set CPU/GPU levels
I'm working in Unity and I've created a script which monitors the framerate and increases or decreases the CPU level, GPU level and render target scale to maintain 60fps. It works reasonably well, and better than hardcoding the levels, since I don't have all the phones to test with (and hardcoding wouldn't be future-proof anyway), but there are a couple of issues:
1. It can only tell the levels are too low when it sees a 30fps frame, so it will drop from the correct performance state to a lower one, see a slow frame, and jump back up. I throttle the up/down moves so this doesn't happen too often, but I'd like to be able to tell when there's little slack so it doesn't drop in the first place. Is there any way to determine this? I thought something like the amount of time spent in GfxWaitForPresent would be useful, but I don't know if it's possible to retrieve this.
2. It increases the CPU level before the GPU level, as I think the bottleneck in my game is draw calls. It would be much better if it could track CPU and GPU usage independently and adjust each level separately (plus render target scale for the GPU). Is there anything which would allow this?
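To illustrate what I'm aiming for with issue 1: if I could read the actual work time per frame (frame time minus the wait-for-present), the controller could only step down when it measures real headroom, instead of probing by dropping and waiting for a slow frame. A rough sketch of that logic (in Python for brevity; the class name, thresholds and cooldown values are all illustrative, and higher level here means higher clocks):

```python
class AdaptiveLevels:
    """Throttled level controller driven by measured per-frame work time.

    Raises the level immediately on a missed frame; lowers it only after
    sustained headroom, so it never drops out of a state that has no slack.
    """

    def __init__(self, min_level=0, max_level=3,
                 frame_budget=1.0 / 60.0,   # 60fps target
                 headroom_ratio=0.75,       # work below 75% of budget = slack
                 cooldown_frames=90):       # throttle level changes
        self.level = max_level
        self.min_level = min_level
        self.max_level = max_level
        self.frame_budget = frame_budget
        self.headroom_ratio = headroom_ratio
        self.cooldown_frames = cooldown_frames
        self._cooldown = 0      # frames to wait after any level change
        self._calm = 0          # consecutive frames with measured slack

    def on_frame(self, work_time):
        """work_time: seconds of actual CPU+GPU work this frame,
        excluding time blocked in the present/vsync wait."""
        if self._cooldown > 0:
            self._cooldown -= 1
            return self.level
        if work_time > self.frame_budget:
            # Missed the budget: step up to a faster state right away.
            if self.level < self.max_level:
                self.level += 1
                self._cooldown = self.cooldown_frames
            self._calm = 0
        elif work_time < self.frame_budget * self.headroom_ratio:
            # Sustained slack: only then is it safe to step down.
            self._calm += 1
            if self._calm >= self.cooldown_frames and self.level > self.min_level:
                self.level -= 1
                self._cooldown = self.cooldown_frames
                self._calm = 0
        else:
            # Near the budget: hold the current level, don't drop.
            self._calm = 0
        return self.level
```

The same shape would sit in a Unity `Update()` loop, with the level changes applied to the Oculus CPU/GPU level API and render target scale; the missing piece is exactly the `work_time` input asked about above.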