Forum Discussion

This forum is archived and read-only.
d_
Honored Guest
12 years ago

Interlacing for 150Hz on DK2-class hardware

It strikes me that if my game can pump out 1920x1080 at 75Hz on a realistic system, it shouldn't be impossible to run the same game at 960x1080 at 150Hz (I understand the performance math isn't linear, but I know my engine could get there). Could two independent panels, each driven at 75Hz, be presented to the system as a single 150Hz display, with each half refreshed half a cycle out of phase in hardware so as to fake an effective refresh rate of 150Hz? The same could in principle be done with one panel whose two halves are independently timed and addressed, though that would obviously require considerably specialized panel hardware.

One of the most obvious non-technical problems that comes to mind is the potential for VR sickness; I'd be curious whether there is any experimental data on this particular interlacing scheme and, if so, what it shows. Not being intimately familiar with the low-level workings of contemporary displays, I assume the Oculus engineers have considered and ruled out the independently-addressable-halves scheme for technical reasons beyond my comprehension. Still, a "double" refresh rate mode is something I'd definitely be interested in taking advantage of in any future Rift model with independent panels.
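The throughput intuition behind the question can be sanity-checked with a quick calculation. This sketch ignores blanking intervals and simply counts active pixels per second; the mode dimensions come straight from the post above.

```python
# Rough pixel-throughput comparison for the two modes discussed above.
# Blanking intervals are ignored; only active pixels are counted.

def pixel_rate(width, height, hz):
    """Active pixels pushed per second for a given video mode."""
    return width * height * hz

full_width_75 = pixel_rate(1920, 1080, 75)    # one full-width panel at 75 Hz
half_width_150 = pixel_rate(960, 1080, 150)   # one half-width panel at 150 Hz

print(full_width_75)    # 155520000 pixels/s
print(half_width_150)   # 155520000 pixels/s -- identical fill cost
```

The raw pixel rates match exactly, which is why the idea is tempting: doubling the refresh rate while halving the width keeps fill cost constant, even though (as the post concedes) per-frame CPU and driver overhead does not scale the same way.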

2 Replies

  • I don't think it's a good idea to show the eyes unsynced content (3D shutter glasses show that it can work, but I wouldn't prefer it).
    For your idea you would need to occupy two ports on your graphics card, each sending out data unsynced, and I doubt that will work. Alternatively you could send out Full HD at 150Hz and have the Rift ignore one half of each frame - but your HDMI/DVI link won't deliver that data rate (unless you consider HDMI 2.0).
    Your engine would also want to update the game logic and physics at 150Hz (at a minimum, update the camera) to present genuinely different content each refresh; that requires more than rendering the two eyes from a single 75Hz logic step.
    Part of the latency depends on the GPU update rate (queuing GPU commands, processing them, waiting for VSync, scan-out); this would cancel out much of the positive effect of your idea unless the GPU also runs at 150Hz (again: bandwidth).

    tl;dr: Sorry, I don't think this will work well.
    d_
    Honored Guest
    It's probably worth pointing out explicitly that I made a baseline assumption: 150Hz would be supported by the intermediate hardware and interfaces by the time something like this shipped.
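The bandwidth objection in the first reply can also be roughed out numerically. This is only a sketch: the ~20% blanking overhead is an assumed round figure (real timings such as CVT-RB differ), and 340 MHz is the maximum TMDS pixel clock of HDMI 1.3/1.4.

```python
# Rough link-bandwidth check for Full HD at 150 Hz, as raised in the
# first reply. The 20% blanking overhead is an assumed round figure;
# 340 MHz is the HDMI 1.3/1.4 maximum TMDS pixel clock.

ACTIVE_PIXELS_PER_SEC = 1920 * 1080 * 150   # active pixels per second
BLANKING_OVERHEAD = 1.2                     # assumed ~20% for blanking
HDMI14_MAX_CLOCK = 340_000_000              # pixels/s (340 MHz TMDS)

required_clock = ACTIVE_PIXELS_PER_SEC * BLANKING_OVERHEAD
print(required_clock / 1e6)                 # ~373 MHz

# With that assumed overhead the required pixel clock exceeds the
# HDMI 1.4 limit, supporting the reply's "unless you consider HDMI2" caveat.
print(required_clock > HDMI14_MAX_CLOCK)    # True
```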