Automatically set CPU/GPU levels

Fulby
Heroic Explorer
I'm working in Unity and I've created a script which monitors the framerate and raises or lowers the CPU level, GPU level and render target scale to maintain 60fps. It works reasonably well, and better than hardcoding the levels since I don't have all the phones to test with (and that wouldn't be future-proof anyway), but there are a couple of issues:

It can only tell the levels are too low when it sees a 30fps frame, so it will drop from the correct performance state to a lower one, see a slow frame, and jump back up. I throttle the up/down moves so this doesn't happen too often, but I'd like to be able to tell when there's little slack so it doesn't drop in the first place. Is there any way to determine this? I thought something like the amount of time spent in GfxWaitForPresent would be useful, but I don't know if it's possible to retrieve it.

It increases the CPU level before the GPU level, as I think the bottleneck in my game is draw calls. It would be much better if it could track CPU and GPU usage independently and adjust each level separately (plus the render target scale for the GPU). Is there anything which would allow this?
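
For reference, here's roughly the shape of the current script - a trimmed sketch rather than the real thing. It assumes Oculus Utilities' OVRManager.cpuLevel / OVRManager.gpuLevel static properties (levels 0-3 on Gear VR) and Unity 5.x's VRSettings.renderScale:

using UnityEngine;
using UnityEngine.VR; // Unity 5.x; later versions moved this to UnityEngine.XR

public class AdaptiveLevels : MonoBehaviour
{
    const float TargetFrameTime = 1f / 60f;
    const float Cooldown = 2f; // throttle the up/down moves

    float lastChange;

    void Update()
    {
        if (Time.time - lastChange < Cooldown)
            return;

        float dt = Time.unscaledDeltaTime;
        float scale = VRSettings.renderScale;

        if (dt > TargetFrameTime * 1.5f)
        {
            // Saw a ~30fps frame: raise capacity, CPU first since the
            // bottleneck is draw calls, then GPU, then shrink the target.
            if (OVRManager.cpuLevel < 3)      OVRManager.cpuLevel++;
            else if (OVRManager.gpuLevel < 3) OVRManager.gpuLevel++;
            else scale = Mathf.Max(0.5f, scale - 0.1f);
            lastChange = Time.time;
        }
        else if (dt < TargetFrameTime * 1.05f)
        {
            // Comfortable: restore render scale first, then probe lower clock
            // levels. The level drop is the blind probe described above -
            // there's no headroom signal to check before dropping.
            if (scale < 1f)                   scale = Mathf.Min(1f, scale + 0.1f);
            else if (OVRManager.gpuLevel > 0) OVRManager.gpuLevel--;
            else if (OVRManager.cpuLevel > 0) OVRManager.cpuLevel--;
            lastChange = Time.time;
        }

        VRSettings.renderScale = scale;
    }
}
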
4 REPLIES

Mitnainartinari
Protege
I'm not sure if these are of any use, but they might be worth a look if you haven't seen them yet. There are two functions, vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat, in VrApi/Include/VrApi.h that provide some system metrics. VrApi_Types.h lists the following values you can query, which include some timing info:

typedef enum
{
    VRAPI_SYS_STATUS_DOCKED,                        // Device is docked.
    VRAPI_SYS_STATUS_MOUNTED,                       // Device is mounted.
    VRAPI_SYS_STATUS_THROTTLED,                     // Device is in powersave mode.
    VRAPI_SYS_STATUS_THROTTLED2,                    // Device is in extreme powersave mode.
    VRAPI_SYS_STATUS_THROTTLED_WARNING_LEVEL,       // Powersave mode warning required.

    VRAPI_SYS_STATUS_RENDER_LATENCY_MILLISECONDS,   // Average time between render tracking sample and scanout.
    VRAPI_SYS_STATUS_TIMEWARP_LATENCY_MILLISECONDS, // Average time between timewarp tracking sample and scanout.
    VRAPI_SYS_STATUS_SCANOUT_LATENCY_MILLISECONDS,  // Average time between Vsync and scanout.
    VRAPI_SYS_STATUS_APP_FRAMES_PER_SECOND,         // Number of frames per second delivered through vrapi_SubmitFrame.
    VRAPI_SYS_STATUS_SCREEN_TEARS_PER_SECOND,       // Number of screen tears per second (per eye).
    VRAPI_SYS_STATUS_EARLY_FRAMES_PER_SECOND,       // Number of frames per second delivered a whole display refresh early.
    VRAPI_SYS_STATUS_STALE_FRAMES_PER_SECOND,       // Number of frames per second delivered late.

    VRAPI_SYS_STATUS_FRONT_BUFFER_PROTECTED = 128,  // True if the front buffer is allocated in TrustZone memory.
    VRAPI_SYS_STATUS_FRONT_BUFFER_565,              // True if the front buffer is 16-bit 5:6:5.
    VRAPI_SYS_STATUS_FRONT_BUFFER_SRGB,             // True if the front buffer uses the sRGB color space.
} ovrSystemStatus;
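
If those aren't exposed through Unity's plugin, it might be possible to P/Invoke libvrapi.so from a C# script. Completely untested sketch: the enum values just mirror the header above (C enums number sequentially from 0, so check your SDK version), and the native helper that returns the ovrJava pointer is hypothetical - getting hold of that pointer from managed code is the hard part.

using System;
using System.Runtime.InteropServices;

public static class VrApiStatus
{
    // Mirrors part of ovrSystemStatus above; values follow the header.
    public enum SystemStatus
    {
        AppFramesPerSecond = 8,
        ScreenTearsPerSecond = 9,
        EarlyFramesPerSecond = 10,
        StaleFramesPerSecond = 11,
    }

    // libvrapi.so ships with the app; the first argument is a const ovrJava*.
    [DllImport("vrapi")]
    static extern float vrapi_GetSystemStatusFloat(IntPtr java, SystemStatus status);

    // Hypothetical native helper returning the ovrJava* the app was
    // initialised with - this doesn't exist and would need writing.
    [DllImport("myNativeHelper")]
    static extern IntPtr GetOvrJavaPtr();

    // e.g. poll stale frames per second to see how close to the edge we are.
    public static float StaleFramesPerSecond()
    {
        return vrapi_GetSystemStatusFloat(GetOvrJavaPtr(), SystemStatus.StaleFramesPerSecond);
    }
}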

Fulby
Heroic Explorer
Thanks @Mitnainartinari - I'm not sure if I can access these from Unity, but I'll have a look.

TrevJT
Protege
@Fulby 

If you are able to access those values through Unity, could you please share how?

I have the same issue: adjusting CPU level, GPU level, quality settings, and render settings based on real-time or average frame rate over a period of time.

My current workaround is based on how many enemies (in my case) are in the scene, plus a lot of trial and error.
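
For the averaging part, here's the sort of thing I mean - a rolling window over recent frame times, plain Unity API with nothing SDK-specific:

using UnityEngine;

public class FrameTimeWindow : MonoBehaviour
{
    const int WindowSize = 90; // ~1.5 seconds of samples at 60fps
    readonly float[] samples = new float[WindowSize];
    int next;
    int count;
    float sum;

    // Average frame time over the window, in seconds.
    public float Average { get { return count > 0 ? sum / count : 0f; } }

    void Update()
    {
        sum -= samples[next];                   // drop the oldest sample
        samples[next] = Time.unscaledDeltaTime; // record the newest
        sum += samples[next];
        next = (next + 1) % WindowSize;
        if (count < WindowSize) count++;
    }
}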

Fulby
Heroic Explorer
The closest I can see to that enum is OVRPlugin.Status, but it doesn't match all the enum values.

Here's a link to the script.
https://drive.google.com/file/d/0B0y0L4F0HA-JVGlRNDlBU2FwV0k/view?usp=sharing
I apologise for the code quality - it's a work in progress and I'm still experimenting with it. It should probably work on a rolling average.

I removed the rescaling of the render target as it caused a black frame. This may be avoidable by updating the value in the right callback (I used Update).
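
If anyone wants to experiment, my guess at the "right callback" is to defer the write until the frame has finished rendering. This is untested, and VRSettings.renderScale is the Unity 5.x name:

using System.Collections;
using UnityEngine;
using UnityEngine.VR; // Unity 5.x; later versions use UnityEngine.XR

public class DeferredRenderScale : MonoBehaviour
{
    // Queue the change instead of writing it mid-Update.
    public void Set(float scale)
    {
        StartCoroutine(ApplyAtEndOfFrame(scale));
    }

    IEnumerator ApplyAtEndOfFrame(float scale)
    {
        yield return new WaitForEndOfFrame(); // after rendering, before the next frame
        VRSettings.renderScale = scale;
    }
}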

Dropping the GPU level causes one frame of stutter in my game.