Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
sp82
Honored Guest
11 years ago

async timewarp with directx->opengl interop

Hi all,
Just a day before the big conferences and announcements about new driver-space VR optimizations, I want to ask whether anyone has tried to implement async timewarp using this extension: https://www.opengl.org/registry/specs/N ... terop2.txt
I have only a little experience with OpenGL and I don't know anything about DirectX, but I think this extension makes it possible to implement async timewarp for DirectX: just render offscreen in DirectX, share the buffers with OpenGL, and do the warp and presentation in OpenGL in an async fashion.
What do you think?
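For reference, the call sequence with that interop extension would look roughly like this. This is a non-runnable sketch, not working code: it assumes an existing Direct3D device (`d3dDevice`), a D3D render-target texture (`d3dTexture`), and a current OpenGL context on Windows, and it omits all error handling. The variable names are placeholders.

```cpp
// Sketch of sharing a Direct3D render target with OpenGL
// via the NV_DX_interop entry points (Windows only).

// One-time setup: open the interop device and register the shared texture.
HANDLE interopDevice = wglDXOpenDeviceNV(d3dDevice);
GLuint glTex;
glGenTextures(1, &glTex);
HANDLE sharedTex = wglDXRegisterObjectNV(
    interopDevice, d3dTexture, glTex,
    GL_TEXTURE_2D, WGL_ACCESS_READ_ONLY_NV);

// Per frame, on the OpenGL warp/present thread:
wglDXLockObjectsNV(interopDevice, 1, &sharedTex);   // gain GL access
//   ... sample glTex in the timewarp shader, then present ...
wglDXUnlockObjectsNV(interopDevice, 1, &sharedTex); // hand access back to D3D

// Teardown.
wglDXUnregisterObjectNV(interopDevice, sharedTex);
wglDXCloseDeviceNV(interopDevice);
```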

6 Replies

  • There is an example of Direct3D / OpenGL interop using this extension here. However, I'm uncertain whether it would be of any benefit for async timewarp. If the rendering code is primarily OpenGL, it may or may not honor any Direct3D pre-emption directives; in fact, it would be pretty optimistic to expect it to.
  • sp82
    Honored Guest
    Hi Davis,
    I know your code! I have the MEAP of your book. Very nice work, thanks!
    I think your code is very useful for rendering OpenGL to the Oculus in direct mode (direct mode is only supported via DirectX, as we know), or in general for embedding OpenGL content into DirectX applications.

    Issue 2 in this spec: https://www.opengl.org/registry/specs/NV/DX_interop.txt
    does not forbid concurrency between OpenGL and DirectX, except when rendering to the same resource, so I think the two contexts are independent.
    With a little work I think it's possible to write a non-blocking synchronization using a triple-buffer architecture: DirectX alternately renders into two buffers, while OpenGL reads from the third, which holds the most recently completed frame, and renders to the front buffer (Extended Mode, not Direct Mode).
    If everything runs asynchronously, it should be possible to implement async timewarp, maybe even with a "racing the beam" capability.
    All of it in a zero-copy fashion!
    I don't know; I'm a systems integrator of boring enterprise systems, and I like solving problems like this, but I don't know DirectX, I don't want to learn it, and I don't have the skills to implement this stuff. Still, I badly want async timewarp with multi-layer composition for cockpit games.
    I spend a lot of time in Elite: Dangerous and iRacing, and ED in particular randomly skips a lot of frames, even with very good hardware and all the settings turned down.
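The triple-buffer synchronization described above can be sketched independently of either graphics API. Below is a minimal lock-free version under assumed names (everything here is hypothetical; the `int` payload stands in for a shared render-target handle): the producer always has a free slot to write into, and the consumer always reads the newest completed frame, with neither side ever blocking on the other.

```cpp
#include <array>
#include <atomic>

// Lock-free triple buffer: in the scenario from the post, publish() would be
// called by the Direct3D render thread after each frame completes, and
// latest() by the OpenGL warp thread once per v-sync. The three indices
// (back_, front_, and the one stored in mid_) are always a permutation of
// {0, 1, 2}, so the two threads never touch the same slot at the same time.
class TripleBuffer {
    static constexpr int kFresh = 4;   // flag bit: the slot in mid_ is unread
    std::array<int, 3> slots_{};       // stand-ins for shared frame buffers
    std::atomic<int> mid_{1};          // slot index, plus kFresh when unread
    int back_ = 0;                     // producer-owned slot
    int front_ = 2;                    // consumer-owned slot

public:
    // Producer: store a finished frame, publish it, reclaim a free slot.
    void publish(int frame) {
        slots_[back_] = frame;
        back_ = mid_.exchange(back_ | kFresh, std::memory_order_acq_rel) & 3;
    }

    // Consumer: take the newest frame if one arrived; otherwise keep the
    // one already held (no blocking in either case).
    int latest() {
        if (mid_.load(std::memory_order_acquire) & kFresh)
            front_ = mid_.exchange(front_, std::memory_order_acq_rel) & 3;
        return slots_[front_];
    }
};
```

If the producer outruns the consumer, intermediate frames are simply overwritten and never read, which is exactly the behavior wanted here: the warp thread should always operate on the most recent frame, not on a queue of stale ones.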
  • Yes, the two rendering contexts are independent, but they're still sharing a hardware resource: the GPU. The basic problem I've encountered trying to implement async timewarp, and which Antonov mentions in his blog post, is that there's no mechanism for scheduling or prioritizing work on the GPU.

    Whether you're dealing with a Direct3D/OpenGL interop combination or two OpenGL contexts, the basic problem remains. If the GPU is saturated by the rendering process and another CPU thread tells the GPU to perform the distortion on a previously rendered frame, that distortion won't complete in a predictable amount of time, causing you to miss your v-sync interval and resulting in judder. In fact, when I tried to apply this technique to ShadertoyVR, I found that any shader (essentially the entire rendering thread is just 'the shader') that couldn't already run at well over 75 fps caused the distortion thread to complete its work only after several, sometimes tens of, frames had gone by. In theory, async timewarp is supposed to protect you from judder. In practice, without GPU pre-emption or scheduling, it actually makes it ten times worse.
  • sp82
    Honored Guest
    All these cores and such bad pre-emption. It's a shame it isn't possible to dedicate a few of those cores to doing the timewarp.
    I read the Antonov article and the VR Direct conference transcription. I'm very disappointed that there isn't a concrete solution to this problem.
    I don't care about VR SLI or next-gen graphics libraries; I just want the damn async timewarp now.

    If dedicated hardware is what it takes, why not use it? So simple! Just put a second cheap GPU in an empty PCI-E slot, connect the Oculus, and check a flag in a control panel: "Dedicate this GPU to the racing-the-beam damn async timewarp."
    John talked about using a CPU core or the integrated GPU for this, but there's been no news about it since; maybe it's not good in terms of latency.
    John, please save me.
    Some time ago I wrote to John about doing the timewarp and distortion on the HMD, and he replied here:
    https://twitter.com/mad_max82/status/464113653466533888
    I hope he can at least do rotational async timewarp on the HMD in the future.
  • sp82
    Honored Guest
    "Sony did image reprojection (rotational warping) with dedicated hardware: https://www.youtube.com/watch?v=BaCxLZDcNBo" NOT TRUE

    EDIT: Sorry, I misunderstood. The PU do not warp but unwarp the image to show it on the TV. But still I think would be a good idea do the timewarping with dedicated hardware.