Forum Discussion
jherico
11 years ago · Adventurer
Asynchronous timewarp example
Edit: Video updated
A few days ago I was browsing the comments on an Ars Technica story on the Rift and I saw this exchange:
"zarakon" wrote:"sep332" wrote:
Hopefully they'll fix the software by then to handle framerate drops more gracefully.
Have you seen the Time Warp feature? https://www.youtube.com/watch?v=WvtEXMlQQtI Or did you mean something else?
This seems to be a common misconception concerning timewarp: that it's intended to counteract issues with low frame rates. Timewarp can't do this, at least on its own. Specifically, a rendered frame can't be timewarped to a new location based on the current headset orientation until ovrHmd_EndFrame() is called. If your rendering code is taking too much time to render, it's presumably not pausing in the middle to call ovrHmd_EndFrame() so that the Rift can display a timewarped version of the previous frame.
What's needed is asynchronous distortion in addition to timewarp. With asynchronous rendering you have one thread that does nothing but run the BeginFrame/EndFrame loop, always passing in the most recently rendered per-eye images and the corresponding poses with which they were rendered. On another thread, you run the actual render loop that's producing those images. This way it doesn't matter whether you're in the middle of a bunch of rendering calls when the next vsync comes around... the BeginFrame/EndFrame thread will always send the most recent valid render to the Rift, with the proper eye pose so that it can be timewarped based on the Rift's current orientation.
Working with OpenGL in multiple threads can be a bit tricky, so I've produced an example of this technique for my book and I wanted to share it with the community.
I've also created a video describing the technique and showing the code in action:
30 Replies
- TomSD (Honored Guest): Do you think the technique you're describing is the same as what was called "asynchronous timewarp" here and here? Relevant bits:
Cohen: And this is by no means a shift in focus away from PC. One of the great things about the mobile development process is that asynchronous timewarp, for instance, has been implemented on mobile, and that’s going to make its way to the PC SDK as well. So designing within constraints and making a highly-optimized VR experience is something that’s useful, even if you have the most high-end PC in the world. What we’ve learned from the mobile development process will pay dividends for developers everywhere.
Through some miracle (read: John Carmack), Oculus and Samsung have created a VR experience that feels even smoother than the DK2. Latency is incredibly low. I don’t have the greatest grasp of the technology (so hopefully Oculus will start bragging in detail soon), but my understanding is that Gear VR’s advantage comes from a thing called “asynchronous time warp”. This is a process by which the display is updated at 60 frames per second while adjusting the graphics based on head rotation, regardless of the performance of the actual game. In-game animations will still appear to run at the game’s rendering rate, so performance is still a priority, but there’s almost no latency when simply looking around, and a dropped frame won’t cause a nauseating lurch. It’s even possible to target 30fps for some games, letting the time warp keep the experience smooth while saving a ton of battery life. This feature makes a big, big difference.
In any case, something like this does seem like the true answer to enabling a variable framerate without judder, which unfortunately seems sorely needed. It would be great to make judder a thing of the past. Not that I'd expect a subpar framerate to result in as good an experience as the ideal framerate, but something less disturbing than the current judder effect would be nice.
- jherico (Adventurer):
"TomSD" wrote:
Do you think the technique you're describing is the same as what was called "asynchronous timewarp" here and here?
Yes, same technique. Oculus has stated in a number of places that they want the distortion to be asynchronous, but apparently haven't been able to get it stable or working to their satisfaction. Given the general state of their OpenGL support, I'm not very surprised.
- jherico (Adventurer): Bumping for updated video.
- CubicleNinjas (Protege): Awesome video! Super helpful in understanding what is coming down the line.
- nuclear (Explorer): Excellent point, and very well done video. Cheers.
- marksibly (Honored Guest): Nice work! Are you developing on Mac, or have you got timewarp going with OpenGL on PC?
- jherico (Adventurer):
"marksibly" wrote:
Nice work!
Are you developing on Mac, or have you got timewarp going with OpenGL on PC?
It's a PC, though the code should work fine on Mac as well. I haven't tested it, though.
- 2EyeGuy (Adventurer): I notice you are violating this rule:
"Oculus_Developer_Guide.pdf" wrote:
All of the following calls must be done on the render thread. This is the thread used by the application to create the main rendering device: ovrHmd_BeginFrame (or ovrHmd_BeginFrameTiming and ovrHmd_EndFrame), ovrHmd_GetEyePose, ovrHmd_GetEyeTimewarpMatrices.
That rule is a huge pain in the neck, since the obvious place to put Asynchronous Timewarp is in its own background thread. I shouldn't have to rewrite a whole emulator to make it run in a background thread instead. How much are we allowed to break that rule?
And what exactly counts as "create the main rendering device" in OpenGL?
- matus (Honored Guest): Is async timewarp really better? Consider a case where you are rendering at 75 Hz but out of phase: you finish rendering 1 ms after the last timewarp call, giving you a constant lag of 13.3 + 12.3 ms. If you do synced rendering, you get a lag of just 13.3 ms.
Now it will be rare for your rendering process to hit exactly 75 fps. What happens in all other situations? Your average lag for async TW will be 0.5*1000/fps. Synced TW will give you either 1000/75 ms (fps > 75) or 2.5*1000/75 - 1000/fps once you start dropping frames (37.5 < fps < 75). When is async TW better? Solving 0.5*1000/fps < 1000/75 and 0.5*1000/fps < 2.5*1000/75 - 1000/fps for fps, I get fps > 37.5 (always true when fps > 75) for the first case, and fps > 45 for the frame-dropping case. So async TW is not always better :ugeek:
What we really need is just-in-time rendering + TW. This will perform better than async TW when no frames are dropped.
- ZenSoturi (Honored Guest):
"2EyeGuy" wrote:
I notice you are violating this rule:
"Oculus_Developer_Guide.pdf" wrote:
All of the following calls must be done on the render thread. This is the thread used by the application to create the main rendering device: ovrHmd_BeginFrame (or ovrHmd_BeginFrameTiming and ovrHmd_EndFrame), ovrHmd_GetEyePose, ovrHmd_GetEyeTimewarpMatrices.
That rule is a huge pain in the neck, since the obvious place to put Asynchronous Timewarp is in its own background thread. I shouldn't have to rewrite a whole emulator to make it run in a background thread instead. How much are we allowed to break that rule?
And exactly what counts as "create the main rendering device" in OpenGL?
This is a driver/OpenGL limitation. You can't call OpenGL functions from multiple threads, at least not against the same rendering context. A rendering context can be active/current in only one thread at a time (usually the render thread). Of course it is possible to create multiple contexts, but that introduces its own problems (sharing resources, etc.).