New docs for the upcoming SDK; comment with questions for clarification:
This SDK release includes a new TimeWarp parameter "MinimumVsyncs". This defaults to 1 for a 60 fps target. Setting it to 2 will cause WarpSwap to hold the application framerate to no more than 30 fps. The asynchronous TimeWarp thread will continue to render new frames with updated head tracking at 60 fps, but the application will only have an opportunity to generate 30 new stereo pairs of eye buffers a second. You can set higher values for experimental purposes, but the only sane values for shipping apps are 1 and 2.
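In native code, opting into the 30 fps cap might look roughly like this. This is a sketch only: the type and function names (ovrTimeWarpParms, InitTimeWarpParms, ovr_WarpSwap) just follow the terminology in this post and may not match the shipping headers exactly.

```cpp
// Sketch: hold the app to 30 fps while the TimeWarp thread keeps
// re-projecting with fresh head tracking at 60 fps. Names here follow
// the terminology above, not guaranteed to match the release SDK.
ovrTimeWarpParms parms = InitTimeWarpParms();   // start from SDK defaults
parms.MinimumVsyncs = 2;                        // 60 Hz display / 2 = 30 app fps
// ... attach this frame's eye buffer textures to parms ...
ovr_WarpSwap( ovr, &parms );                    // ovr: the per-app ovrMobile context
```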
You can experiment with these values by pressing right-trigger + dpad-right in VrScene.apk to cycle from 1 to 4 MinimumVsyncs.
There are two cases where you might consider explicitly setting this:
If your application can't hold 60 fps most of the time, it might be better to clamp at 30 fps all the time, rather than have the smoothness or behavior of the app change unpredictably for the user. In most cases, we believe that simplifying the experience to hold 60 fps is the correct decision, but there may be exceptions.
Rendering at 30 application fps will save a significant amount of power, and reduce the thermal load on the device. Some applications may be able to hit 60 fps, but run into thermal problems quickly, which can have catastrophic performance implications; it may be necessary to target 30 fps if you want to be able to play for more than a few minutes at a time. We are also experimenting with allowing the native apps to run at 30 fps when users select a "power saving mode" flag in the settings app.
Consequences of not rendering at 60 fps:
These apply whether you have explicitly set MinimumVsyncs, or your app is just going that slow by itself.
If the viewpoint is far away from all geometry, nothing is animating, and the rate of head rotation is low, there will be no visual difference. When any of these conditions are not present, there will be greater or lesser artifacts to balance.
If the head rotation rate is high, black at the edges of the screen will be visibly pulled in by a variable amount depending on how long it has been since an eye buffer was submitted. This still happens at 60 fps, but because the total time is small and constant from frame to frame, it is almost impossible to notice. At lower frame rates, you can see it snapping at the edges of the screen.
There are two mitigations for this:
Instead of using either "now" or the time when the frame will start being displayed as the point where the head tracking model is queried, use a time at the midpoint of all the frames the eye buffers will be shown on. This distributes the "unrendered area" across both sides of the screen, rather than piling it up on one side (see the sketch after these mitigations).
Coupled with that, increasing the field of view used for the eye buffers gives TimeWarp more cushion off the edges to pull from. I am currently adding 10 degrees to the fov when the framerate is below 60. If the resolution of the eye buffers is not increased, this effectively lowers the resolution in the center of the screen. There may be value in scaling the fov dynamically based on the head rotation rates, but you would still see an initial pop at the edges, and changing the fov continuously results in more visible edge artifacts when the head is mostly stable.
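To make the first mitigation concrete: at a 120 degree/second head turn, an eye buffer that is 33 ms stale leaves roughly 4 degrees at one edge with no rendered data, and the midpoint query spreads that across both edges instead. A rough sketch of both mitigations, with GetNextDisplayTime() and PredictHeadPose() as hypothetical stand-ins for whatever the SDK exposes for vsync timing and pose prediction:

```cpp
// Midpoint-prediction sketch, assuming a 60 Hz display.
const double vsyncPeriod   = 1.0 / 60.0;   // seconds per display refresh
const int    minimumVsyncs = 2;            // app generating eye buffers at 30 fps

double displayStart = GetNextDisplayTime();   // first vsync these buffers show on
// The buffers remain on screen for minimumVsyncs refreshes. Predicting the
// head pose at the midpoint of that span splits the unrendered edge between
// both sides of the screen instead of piling it all on one side.
double predictionTime = displayStart + 0.5 * minimumVsyncs * vsyncPeriod;
HeadPose pose = PredictHeadPose( predictionTime );

// Second mitigation: widen the eye-buffer fov so edge pull-in still has
// rendered pixels to sample. The 10 degree figure comes from the text above;
// the 90 degree baseline is an assumed value, not an SDK constant.
float eyeFovDegrees = 90.0f + ( minimumVsyncs > 1 ? 10.0f : 0.0f );
```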
TimeWarp currently makes no attempt to compensate for changes in position, only attitude. We don't have real position tracking in mobile yet, but we do use a head / neck model that provides some eye movement based on rotation, and games that allow the user to navigate around explicitly move the eye origin. These values will not change at all between eye updates, so at 30 eye fps, TimeWarp would be smoothly updating attitude each frame, but movement would only change every other frame.
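For illustration, a head/neck model of the kind described can be as simple as rotating a fixed neck-to-eye offset by the head orientation; the offsets below are illustrative guesses, not SDK values, and a full implementation would use the complete orientation quaternion rather than just yaw and pitch:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Derive the eye origin purely from head rotation about a neck pivot.
// The offset in head space is (0, up, -forward); rotate it by pitch
// about X, then yaw about Y (-Z forward convention).
Vec3 EyePositionFromRotation( float yawRad, float pitchRad )
{
    const float up      = 0.075f;  // eye above the neck pivot (meters, assumed)
    const float forward = 0.080f;  // eye forward of the pivot (meters, assumed)

    // Length of the offset along the view direction after pitching.
    const float horiz = up * std::sin( pitchRad ) - forward * std::cos( pitchRad );
    Vec3 eye;
    eye.x = std::sin( yawRad ) * horiz;
    eye.y = up * std::cos( pitchRad ) + forward * std::sin( pitchRad );
    eye.z = std::cos( yawRad ) * horiz;
    return eye;
}
```

Because this position only changes when the app submits new eye buffers, at 30 application fps it updates every other display frame, which is exactly the stutter described above.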
Walking straight ahead with nothing really close by works rather better than might be expected, but side stepping next to a wall makes it fairly obvious. Even just moving your head when very close to objects makes the effect visible.
There is no magic solution for this. We do not have the performance headroom on mobile to have TimeWarp do a depth buffer informed reprojection, and there are all new visual artifacts when doing that in any case. There is a simplified approach that we may adopt that treats the entire scene as a single depth, but work on it is not currently scheduled.
It is safe to say that if your application has a significant graphical element nearly stuck to the view, like an FPS weapon, it is not a candidate for 30 fps.
Turning your viewpoint with the joypad is among the most nauseating things you can do in VR, but some games still require it. When handled entirely by the app this winds up being like a position change, so a low framerate app would have smooth "rotation" when the user's head was moving, but chunky rotation when they use the joypad. To address this, TimeWarp has an "ExternalVelocity" matrix parameter that can allow joypad yaw to be smoothly extrapolated on every rendered frame. We do not yet have a Unity interface for this.
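For illustration, a joypad yaw rate could be packed into such a velocity matrix along these lines; ExternalVelocity is the parameter named above, but the rotation convention (axis, handedness, pre- vs post-multiply) is an assumption here:

```cpp
#include <cmath>

struct Mat4 { float m[4][4]; };

// Build a rotation about the Y (up) axis by yawRadiansPerSecond, i.e. the
// amount of joypad turn that accumulates in one second, so TimeWarp can
// scale it by the elapsed time and extrapolate smoothly on every
// re-projected frame.
Mat4 YawVelocityMatrix( float yawRadiansPerSecond )
{
    const float s = std::sin( yawRadiansPerSecond );
    const float c = std::cos( yawRadiansPerSecond );
    Mat4 r = { { {  c, 0, s, 0 },
                 {  0, 1, 0, 0 },
                 { -s, 0, c, 0 },
                 {  0, 0, 0, 1 } } };
    return r;
}

// Each frame: parms.ExternalVelocity = YawVelocityMatrix( joypadYawSpeed );
```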
In-world animation will be noticeably chunkier at lower frame rates, but in-place animation doesn't wind up being very distracting. What is more problematic is objects on trajectories: when you track them with your head, they will appear to stutter back and forth as they move.
For many apps, monoscopic rendering may still be a better experience than 30 fps rendering. The savings are not as large, but it is a clear tradeoff without as many variables.
If a decision to go below 60 fps is made, Unity apps may wind up being better off without the multithreaded renderer, which adds a frame of latency. 30 fps with the GPU pipeline and the multithreaded renderer adds up to a lot of latency, and while TimeWarp will remove all of it for attitude, position changes, including the head model, will feel very lagged.
Note that this is all bleeding edge, and some of this guidance is speculative.
I realized tonight that I may need to lock my app to 30 FPS as I'm still having trouble dialing in the right clock speeds to get a stable 60 FPS without overheating.
I gave 30 FPS a try expecting the worst and found that it wasn't so bad, just took a little getting used to. Definitely could feel a bit of latency, so I appreciate the tip to turn off multi-threaded rendering.
I'm considering the idea of switching from 1 to 2 MinimumVsyncs at runtime, perhaps once I've detected that performance has started to suffer. Do you see any issues with this? I haven't looked into whether multi-threaded rendering can be shut down at runtime to coincide with this, but I doubt it.
And finally, do you have any word on whether the Note 4 will have different thermal characteristics than the 1440p S5 that we are working with?
Thank you!
Okay so... how can we actually use this feature to force a constant 60 FPS in our build? We've been having some issues trying to get ours to integrate as VRTestApp, so we've just been building the apk straight from Unity, and it's running really nicely, usually around 100-200 FPS, with occasional dips down to 60. We'd like to be able to lock it to a constant 60 FPS. Is that something doable with this feature, and if so, what do we have to change and where? Thanks.