Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
rsjtaylor
Honored Guest
11 years ago

Async transfers and timewarp

I'm using client distortion rendering in OpenGL and doing some texture streaming at regular intervals, but at a lower rate than the framerate.

I'd like to just use buffer objects and let them do their asynchronous thing, but timewarp introduces an explicit pipeline stall each frame to make sure the render timings are correct (via glFlush() and glFinish() in the SDK renderer).
This means my texture uploads are forced to complete within a single frame, which in turn hobbles the frame rate.
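For context, the buffer-object streaming I have in mind is the usual pixel-buffer-object pattern, roughly like this (a sketch, assuming an RGBA8 texture and a current GL context; `pbo` and `tex` are illustrative names created elsewhere):

```c
#include <string.h>
#include <GL/gl.h>

extern GLuint pbo, tex;  /* created elsewhere; names are illustrative */

/* Asynchronous texture update through a pixel buffer object.
   Error handling omitted for brevity. */
void stream_update(const void *pixels, GLsizei w, GLsizei h)
{
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
    /* Orphan the old storage so the driver need not stall if the
       previous upload is still in flight. */
    glBufferData(GL_PIXEL_UNPACK_BUFFER, (GLsizeiptr)w * h * 4, NULL,
                 GL_STREAM_DRAW);
    void *dst = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
    if (dst) {
        memcpy(dst, pixels, (size_t)w * h * 4);
        glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
        glBindTexture(GL_TEXTURE_2D, tex);
        /* With a PBO bound, the data argument is an offset into the
           buffer, not a client pointer, so this call can return before
           the copy actually happens on the GPU. */
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_RGBA, GL_UNSIGNED_BYTE, (const void *)0);
    }
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}
```

The problem is that a glFinish() in the same frame forces this transfer to complete too, defeating the point.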

So, I see two possible solutions:
1) Use an implicit sync on the eye texture rendering - I think I could do this by calling glFlush() and then reading back a single pixel from the rendered texture into client memory. Hopefully this would ensure the render is queued and then block only until that render finishes, without also blocking on my buffer-object uploads, but I'm not 100% sure.
2) Use two contexts with shared objects, then do the texture streaming in one and the rendering in the other. This should ensure the explicit sync in one context doesn't affect the streaming in the other (I think).
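For the first option, the idea would be something like the following sketch (assuming the eye texture is attached to a framebuffer I'll call `eye_fbo`; the real object names belong to the SDK):

```c
#include <GL/gl.h>

/* Sketch of option 1: force completion of the eye render only,
   without a full glFinish() on the whole command stream.
   eye_fbo is an assumed name for the FBO the eye texture is
   attached to. */
void sync_on_eye_render(GLuint eye_fbo)
{
    GLubyte px[4];
    glBindFramebuffer(GL_READ_FRAMEBUFFER, eye_fbo);
    glFlush();  /* make sure the render commands are queued */
    /* Reading a single pixel back into client memory blocks until
       every command that writes this framebuffer has completed. */
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);
    glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
}
```

On GL 3.2+ a fence (glFenceSync() after the eye render, then glClientWaitSync()) would give the same "wait on just this point in the stream" behaviour without the readback, which might be cleaner if the SDK's timing code can be adapted to it.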

Has anyone tried either of these approaches? Would you suggest one over the other? The first seems simpler, and avoids the overhead of context switching and the platform-specific annoyance of sharing objects between contexts, but it also feels a little dirty.
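For completeness, the second option would look roughly like this on a worker thread. Context creation and sharing are platform-specific (wglShareLists on Windows, glXCreateContext with a share list on X11, CGL share groups on macOS), so the helpers here are hypothetical placeholders:

```c
#include <GL/gl.h>

/* Hypothetical platform helpers - not real API calls. */
extern void make_shared_context_current(void);
extern void wait_for_new_frame_data(const void **pixels,
                                    GLsizei *w, GLsizei *h);
extern void stream_update(const void *pixels, GLsizei w, GLsizei h);

/* Sketch of option 2: stream textures from a second context that
   shares objects with the render context, so the render thread's
   glFinish() doesn't serialize these uploads. */
void *upload_thread(void *arg)
{
    (void)arg;
    make_shared_context_current();       /* hypothetical helper */
    for (;;) {
        const void *pixels; GLsizei w, h;
        wait_for_new_frame_data(&pixels, &w, &h);  /* hypothetical */
        stream_update(pixels, w, h);     /* PBO upload as usual */
        /* Commands issued in this context must be flushed before the
           render context can safely sample the updated texture. */
        glFlush();
    }
    return NULL;
}
```

Note that object sharing alone doesn't order the two command streams; you'd still need the glFlush() here (or a fence shared between contexts) before the render thread samples the texture.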
No replies.