Forum Discussion
Anonymous
7 years ago
Frame drops with 56% performance headroom
I'm rendering each frame with a custom DX12 renderer, which takes around 5 ms in total for both eyes, so it should be quite comfortable to maintain 90 FPS.
Yet I get these annoying dropped frames every 200 to 400 frames: the rate dips to 87.2 FPS, sometimes 84 FPS, and the performance HUD sometimes attributes the drops to the app and sometimes to the compositor.
while(1)
{
    // LibOVR 1.x frame loop. ovr_WaitToBeginFrame/ovr_BeginFrame/ovr_EndFrame
    // take the session and a frame index (0 = let the runtime track it).
    OVR_CHECK( ovr_WaitToBeginFrame(Session, 0) );
    OVR_CHECK( ovr_GetTextureSwapChainCurrentIndex(gg.Oculus.Session, gg.Oculus.Swapchains[0], &gg.Oculus.SwapchainFrameIndex) );
    ovrEyeRenderDesc* EyeRenderDescs = gg.Oculus.EyeRenderDescs;
    ovrPosef HmdToEyePose[2] = { EyeRenderDescs[0].HmdToEyePose, EyeRenderDescs[1].HmdToEyePose };
    double DisplayMidpointInSeconds = ovr_GetPredictedDisplayTime(Session, 0);
    ovrInputState InputState;
    ovrTrackingState TrackState;
    TrackState = ovr_GetTrackingState(Session, DisplayMidpointInSeconds, ovrTrue);
    OVR_CHECK( ovr_GetInputState(Session, ovrControllerType_Touch, &InputState) );
    ovrPosef* EyeRenderPose = gg.Oculus.EyeRenderPose;
    ovr_GetEyePoses(Session, 0, ovrTrue, HmdToEyePose, EyeRenderPose, &gg.Oculus.SensorSampleTime);
    OVR_CHECK( ovr_BeginFrame(Session, 0) );
    APP_RenderSceneForBothEyes();
    forii(2) { OVR_CHECK( ovr_CommitTextureSwapChain(gg.Oculus.Session, gg.Oculus.Swapchains[ii]) ); }

    // Initialize our single full-screen Fov layer.
    ovrLayerEyeFov ld = {};
    ld.Header.Type = ovrLayerType_EyeFov;
    // ld.Header.Flags = ovrLayerFlag_HighQuality;
    forii(2)
    {
        ld.ColorTexture[ii] = gg.Oculus.Swapchains[ii];
        ld.Viewport[ii] = gg.Oculus.EyeRenderViewports[ii];
        ld.Fov[ii] = gg.Oculus.HMD.DefaultEyeFov[ii];
        ld.RenderPose[ii] = gg.Oculus.EyeRenderPose[ii];
        ld.SensorSampleTime = gg.Oculus.SensorSampleTime;
    }
    ovrLayerHeader* Layers = &ld.Header;
    APP_Present();
    OVR_CHECK( ovr_EndFrame(Session, 0, nullptr, &Layers, 1) );
}

There are no errors from the init code, nor any rendering errors. It seems like a PhaseSync issue; is there any way to adjust PhaseSync behavior programmatically?
I'm getting a more consistent 90 FPS in apps like Blocks, which had a perf headroom of only 10%.
GPU: RX480
OS: Win10 x64
SDK: 1.24
Thank you.
2 Replies
- Anonymous
Sorry, the problem seems to be on my end; I'm getting random GPU spikes for the same amount of work.
- Anonymous
onatto said:
Sorry, the problem seems to be on my end, I'm getting random GPU spikes for the same amount of work.
Did you ever find a solution? What causes the spikes despite plenty of headroom?