Forum Discussion
spyro
12 years ago · Expert Protege
Low persistence with (nearly) 60 fps performance possible?
Hi there,
As we know, low persistence does not work at 60 fps. Since the screen flashes only once per frame and stays black until the next frame arrives, the flicker would be very annoying at 60 Hz. So we need 90-120 Hz to avoid this. If we simply showed the same frame twice, we would just replace motion blur with heavy ghosting (the old frame would be seen in two different places as we move our head).
So do we really need a higher frame rate (and thus much more processing power and bandwidth) just to solve this? Maybe there is another solution. Shutter glasses (as used for 3D TVs) refresh each side at only 60 Hz, in alternating order. As the user sees through both sides at once, the two frequencies 'add up' to 120 Hz for the visual system (which seems to be the important point here).
So the basic idea is that we could interleave the left and right perspectives of TWO CONSECUTIVE frames in a way that produces a "fake 120 fps" signal for the Rift:
First, the LEFT PERSPECTIVE OF FRAME 1 will be scanned out to the display (the right perspective of this frame will NOT BE RENDERED AT ALL):

After about 1 ms, the display turns black (by switching off either the backlight or the OLED pixels):

After another 7 ms, the RIGHT PERSPECTIVE OF FRAME 2 arrives based on new data (the left perspective of frame 2 will not be rendered either):

After another black phase of 7 ms, the process repeats.
Again, it's important to note that the left and right perspectives are not from the same frame but from different points in time, so the right perspective is NEWER than the left one (otherwise we would produce heavy ghosting as well).
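To make the loop concrete, here is a minimal C++-style sketch of what I have in mind (all types and functions are hypothetical stand-ins, not real SDK calls):

```cpp
// Minimal sketch of the alternating-eye loop. Everything here is a
// hypothetical stand-in (no real Oculus SDK calls); it only illustrates
// that each displayed field is rendered from the FRESHEST pose and world
// state, so the left and right views come from different points in time.
struct Pose  { float yaw = 0, pitch = 0, roll = 0; };
struct Image { /* one eye's rendered view */ };
enum class Eye { Left, Right };

Pose  pollHeadPose()                { return {}; } // newest tracker sample (stub)
void  updateSimulation(double dt)   { (void)dt;  } // advance world state (stub)
Image renderEye(Eye, const Pose&)   { return {}; } // render ONE perspective (stub)
void  presentAndFlash(const Image&) {}             // scan out, ~1 ms flash, then black (stub)

void runInterleavedLoop() {
    Eye eye = Eye::Left;
    for (;;) {                          // one iteration per 120 Hz vsync
        Pose pose = pollHeadPose();     // fresh head pose for THIS field
        updateSimulation(1.0 / 120.0);  // world state advances every field
        presentAndFlash(renderEye(eye, pose));
        eye = (eye == Eye::Left) ? Eye::Right : Eye::Left;  // swap eyes each field
    }
}
```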
Here is a simple diagram I made in a few minutes to show the principle:

Here is a short test video that can be watched on a Rift: https://dl.dropboxusercontent.com/u/24358204/oculus/interleaving/oculus_interleaved_60Hz.mkv
If you stop the video on any frame, you will notice that only one perspective is visible at a time. As the Rift DK1 only supports 60 Hz, you will of course notice heavy flickering. But with a 120 Hz panel (and 120 fps 'perspective-alternating rendering') the flicker should disappear.
The Rift uses lenses to unwarp the horizontally compressed images so that they fill nearly the whole FOV, which is basically the same effect as with shutter glasses. Each side on its own flickers at 60 Hz (with an 8 ms offset), but together they add up to 120 Hz because both sides are viewed 'on top of' each other, not side by side.
Note: The brightness of the picture will drop by 50% (because only half as many photons hit each eye in a given time). But that shouldn't be a problem in a closed box directly in front of your eyes, so I assume this can be calibrated.
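To put rough numbers on that (assuming a ~1 ms flash per refresh, as in the timing above): each eye is lit for about 1 ms out of every 16.7 ms (its own 60 Hz cycle) instead of 1 ms out of every 8.3 ms, so the per-eye duty cycle drops from roughly 12% to 6%, i.e. half the photons per unit time.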
The big advantage is that you only have to render the same number of pixels as at 60 fps. This saves a considerable amount of GPU time and memory bandwidth compared with "progressive" 120 fps rendering. As the CPU has to do all simulation and world processing 120 times per second no matter what, it's only a solution for games that are GPU- rather than CPU-limited (which should be the case at high resolutions and frame rates).
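As a back-of-the-envelope check (assuming the DK1's 640×800 pixels per eye): interleaved rendering draws 120 × 640 × 800 ≈ 61 Mpixels/s, the same as normal stereo at 60 fps (60 × 2 × 640 × 800 ≈ 61 Mpixels/s), while true stereo at 120 fps would need double that (≈ 123 Mpixels/s).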
What do you think? Could this actually work? Without a DK2 and a low-persistence display it's hard to tell, but maybe there is a fundamental flaw in this approach?
spyro
16 Replies
- jherico (Adventurer): The biggest problem I see (and really, in context, not that big) is that you're halving the time available to complete a render for display. This is obviously offset by only having one eye's worth of content to render, so maybe it doesn't matter. But hypothetically it could limit the freedom to implement rendering tricks that would otherwise improve performance.
For instance, suppose you decide that everything beyond a certain distance is not eligible for stereo separation. This would probably be a common use case for flight/space simulators. The cockpit and aircraft components can all be rendered with the eye separation distance, but anything past a few dozen meters isn't going to have significant parallax. So you could implement your engine to use a more expensive rendering process on everything in the background, knowing that it will take up fully half of your per-frame rendering budget. Then you could implement your cockpit rendering with a less expensive option, so that each eye only takes up a quarter of the budget. With your proposed approach, there's no longer enough time to fit the 'distant' rendering in anywhere.
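A rough sketch of what I mean (every function name here is hypothetical, not any real engine or SDK API):

```cpp
// Hypothetical sketch: the distant scene is rendered ONCE per frame with a
// widened projection, then each eye reuses its visible slice of that image
// and renders only the near cockpit with full stereo separation.
struct Pose    { float yaw = 0, pitch = 0, roll = 0; };
struct Texture { /* shared background image */ };
enum class Eye { Left, Right };

Texture renderBackgroundMono(const Pose&)        { return {}; } // expensive: ~half the frame budget
void    drawBackgroundSlice(const Texture&, Eye) {}             // cheap: blit the eye-visible portion
void    renderCockpit(Eye, const Pose&)          {}             // cheap: ~quarter of the budget per eye
void    presentEye(Eye)                          {}

void renderFrame(const Pose& pose) {
    Texture bg = renderBackgroundMono(pose);  // no parallax beyond a few dozen meters
    const Eye eyes[] = { Eye::Left, Eye::Right };
    for (Eye eye : eyes) {                    // near geometry still gets stereo separation
        drawBackgroundSlice(bg, eye);
        renderCockpit(eye, pose);
        presentEye(eye);
    }
}
```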
Of course, the hypothetical system I just described would have to include some fancy footwork to ensure that the projection matrix for the background was set up properly to capture enough background for both eyes, and then only the portion that's visible to each eye would actually be sent to that half of the display. It would also have to cope with finding a good way to merge content that was rendered with and without parallax (if you're sitting on the runway, for instance). But I still think it might be a likely course for some renderers. Maybe it would still be possible to do this in the setup you describe, by interleaving portions of the background rendering with the foreground rendering, but even if it's possible, it would be harder to implement.
- spyro (Expert Protege): @Aeschylus: What the...
- Popopinsel (Expert Protege): By the way, it's called low persistence, meaning a frame (or a pixel) only persists (or is lit) for a very short time.
So let's not hope it will also lead to a low perception! :D
- spyro (Expert Protege):
"Popopinsel" wrote:
By the way, it's called low persistence, meaning a frame (or a pixel) only persists (or is lit) for a very short time.
So let's not hope it will also lead to a low perception! :D
*facepalm*
Thank you, you're right, of course (corrected in post).
spyro
- MrAdun (Honored Guest): You're faking 120 Hz; each eye is still running at 60 fps individually. You will see flicker when there's motion and your eye starts tracking. The problem is that instead of being symmetrical, you'll have asymmetrical strobing, which could be very disorienting.
Plus, the game now has only 7 ms to update the game state instead of 16. You're asking it to run at 60 fps, but it only has time to "think" for 7 ms, because the game state can't change between the left and right eyes, otherwise they would see different points in time.
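To spell out the arithmetic (using the ~1 ms flash from the first post): at 120 Hz each field lasts 1000/120 ≈ 8.3 ms; subtract the flash/scan-out and roughly 7 ms remain, versus 1000/60 ≈ 16.7 ms at plain 60 fps.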
It was a good idea, but I don't think it will work :(
- jherico (Adventurer): The "best practices" guide just released by Oculus actually explicitly mentions the scheme I described above, where you might render a significant part of the scene only once and then use it in both eyes.
I wouldn't discount the approach, though; it might be a suitable mode for applications that can achieve it. But the problem with making it an option is that you need some mechanism for toggling and controlling it, as well as for synchronizing with your rendering software, which would be outside the scope of APIs like OpenGL and Direct3D.
- Halopend (Explorer): Nah. The point of low persistence is that your eyes only see something for a short bit WHILE IT'S CORRECT.
With your setup, either one eye is always "lagging" more than the other, or you have to render at 120 Hz, defeating the performance advantage (except that you don't have to render both eyes simultaneously, mind you, but it's not like you're getting 120 Hz for free).
It may be worth testing the second scenario... if only to see how well the brain fuses the two together, but I doubt it'll be worth it, as it's actually more demanding than 60 Hz simultaneous.
- spyro (Expert Protege):
"halopend" wrote:
Nah. The point of low persistence is that your eyes only see something for a short bit WHILE IT'S CORRECT.
With your setup either one eye is always "lagging" more than the other
Actually, that should not be the case, because the two perspectives are NOT from the same point in time, as I pointed out several times in my post. Every frame is based on the most current simulation state. It is actually 120 fps rendering at 120 Hz, but in each frame you render only one perspective, essentially 2D. Your brain would have to take the depth information from the last frame.
Of course, real 120 fps is better. That is not the point. There is no more visual information in the signal compared with 60 fps. But this could probably be a way to avoid the flickering when combined with low persistence.
Maybe this doesn't work, but then there must be another reason for that.
- jherico (Adventurer): According to the slides from the talk Michael Abrash gave at Steam Dev Days, the low-persistence display should be limited to about 3 ms of being on. Because of the short duration, you need a high refresh rate to avoid flicker, which they recommend to be at least 95 Hz. If you had to do that per eye, it would result in a total of 190 Hz, which seems... unlikely.
- pixel67 (Explorer): I think I saw this on one of these threads. Not quite on topic, but a not-too-distant cousin. :D
http://www.testufo.com/#test=blackframes