Forum Discussion
owenwp
12 years ago · Expert Protege
Perfect prediction, Rift improvement idea
I have been using the latency tester while doing some engine integration work. When I tried setting the head tracking prediction to the exact value that the tester was reporting, the experience was kinda magical. The world felt more stationary and permanent as I moved around.
I think there might be great value in incorporating the latency tester into the final Rift. You could put the photo sensor right in the lower corner of the display where it won't interfere with the image, so that every second or so it can take a latency test sample and dial in the right prediction as the framerate changes over time, automatically.
I found that as long as I didn't go over about 50 ms the benefit of rotational accuracy far outweighed the downside of jerky accelerations, and that seems to be easy to achieve even on mediocre hardware. I definitely won't give another demo without first dialing it in using the tester.
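The original post's idea of a built-in sensor that periodically re-measures latency and feeds it back into prediction can be sketched as a small feedback loop. This is purely illustrative, not an Oculus SDK API: the class name, the once-per-second sampling, the light smoothing, and the 50 ms ceiling (taken from the post above) are all assumptions.

```cpp
#include <algorithm>

// Hypothetical sketch: feed periodic latency-tester samples into the
// head-tracking prediction interval, clamped to the ~50 ms ceiling the
// post suggests. All names here are invented for illustration.
class PredictionTuner {
public:
    explicit PredictionTuner(float maxPredictionSec = 0.050f)
        : maxPrediction_(maxPredictionSec), prediction_(0.0f) {}

    // Called roughly once per second with a fresh latency-tester reading.
    void onLatencySample(float measuredLatencySec) {
        // Smooth a little so one noisy sample doesn't jerk the view.
        const float alpha = 0.25f;
        prediction_ = (1.0f - alpha) * prediction_ + alpha * measuredLatencySec;
        prediction_ = std::min(prediction_, maxPrediction_);
    }

    // Current interval to use for head-tracking prediction, in seconds.
    float predictionSeconds() const { return prediction_; }

private:
    float maxPrediction_;
    float prediction_;
};
```

The smoothing keeps a single bad sample from visibly shifting the world; the clamp enforces the "don't go over about 50 ms" observation from the post.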
18 Replies
- Felix12g (Honored Guest): Huh, on-the-fly prediction changes to fit the latency. Could be interesting. I really need to take the time to pull this out of the box and give it a shot.
If it's that good when the prediction exactly matches the latency, I wonder if there's a ballpark setup we could use in the meantime: not as accurate, but something we can calculate continuously? Possibly through software? - owenwp (Expert Protege): What you could do is take a single latency tester measurement in your game and subtract the measured time taken to render the frame. Then, while the game is running, add that number to your frame time for a pretty decent estimate.
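owenwp's estimate above (one tester reading, minus the render time measured at that moment, plus the current frame time) can be sketched like this. The struct and function names are invented for illustration; they are not part of any SDK.

```cpp
// Sketch of owenwp's suggested estimate (all names invented): take one
// latency-tester reading, subtract the render time measured at that moment
// to get the fixed pipeline overhead, then add the live frame time.
struct LatencyEstimator {
    float fixedOverheadSec; // tester reading minus render time at calibration

    static LatencyEstimator calibrate(float testerLatencySec,
                                      float renderTimeAtSampleSec) {
        return { testerLatencySec - renderTimeAtSampleSec };
    }

    // Estimated motion-to-photon latency for the current frame.
    float estimate(float currentFrameTimeSec) const {
        return fixedOverheadSec + currentFrameTimeSec;
    }
};
```

For example, a 45 ms tester reading taken while frames cost 16 ms gives a 29 ms fixed overhead; if the frame time later grows to 20 ms, the estimate becomes 49 ms. As the thread notes, vsync complicates this, since presentation quantizes to refresh boundaries.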
It would be more complicated than that with vsync, though. - drash (Heroic Explorer): That's a pretty keen observation, owenwp. I hadn't thought about what to do with the prediction value after the latency tester came out.
In the absence of a real-time latency tester built into the Rift, I'm wondering what the right approach is. Say a demo has latency that runs anywhere from 20-25ms. Have you noticed anything in your experimentation that would indicate that it's better to set prediction to 20ms or 25ms... or just split the difference and set it at 22.5ms as an average?
At the very least, when I have time I'm going to experiment with automatically changing the prediction based on whether or not Vsync is on (in Unity, that is), since the latency difference between Vsync on and off in Unity is dramatic, and maybe prediction should be set accordingly.
Really cool find! - cybereality (Grand Champion): That's a very interesting idea indeed.
- raidho36 (Explorer): This has already been discussed by people like Carmack; some have even implemented prediction down to the third derivative (the change of acceleration), which gets pretty good results. The bottom line, though, is that while it's a good software technique for working around hardware flaws, we should instead make better hardware, because the software technique has flaws of its own and they do backfire from time to time.
Also, a hint: the prediction delta factor has nothing to do with total latency; it has to do with frame time. - owenwp (Expert Protege): I found that it's better to be conservative. If you over-predict, it feels like the world moves before you turn your head, which is pretty surreal to say the least.
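The "prediction down to the third derivative" that raidho36 mentions is just a Taylor-series extrapolation of orientation. A minimal sketch for a single rotation angle, assuming angular velocity, acceleration, and jerk are already measured elsewhere (the function name is invented):

```cpp
// Illustrative third-derivative extrapolation of one rotation angle:
// theta(t + dt) ~= theta + w*dt + (1/2)*a*dt^2 + (1/6)*j*dt^3
// where w is angular velocity, a is angular acceleration, j is jerk.
float predictAngle(float angle, float velocity, float accel, float jerk,
                   float dt) {
    return angle
         + velocity * dt
         + 0.5f * accel * dt * dt
         + (1.0f / 6.0f) * jerk * dt * dt * dt;
}
```

The higher-order terms help during sustained head motion but amplify sensor noise, which is one reason over-prediction "backfires" as described above: an over-predicted pose visibly leads the user's actual head movement.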
- tomf (Explorer): Prediction time is the sum of several things. Let's say you're running a game locked to 60Hz Vsync, you flush the GPU each frame (to avoid buffering), and you always read the HMD sensor at the start of the next frame. Ideally you should read the HMD sensor as late in the frame as possible, but let's keep it simple for the moment. You need to think about the following things:
#1: it will be 16.7ms until the next flip/present call.
#2: then the FIRST scanline will start to be sent to the display. But the LAST scanline will take roughly another 16.2ms to arrive (according to the HDMI timings on the DK1). So you probably want to predict for the middle of the screen, which will be 8.1ms after the vsync.
#3: but that's when the pixels START to change. It takes them time to fully change to the new values. Pixel switch time depends on the "from" and "to" states, and panel temperature, and all sorts of things. For the DK1, I'd guess an average of about 17ms settle time, but it's very much open to interpretation, and even depends on what visuals you're displaying. But really we care about when it gets to more new frame than old frame, so half that value: 8.5ms
#4: persistence. Now that the pixel is what we want it to be, it will stay that way for an entire 16.7ms frame. So we actually want to predict to the middle of this period: another 8.4ms
Total: 16.7+8.1+8.5+8.4 = 41.7ms! Note that the latency tester will trigger on the first visible change of the pixel, so it includes only a little bit of #3, and none of #4. It's a shame #3 is so dependent on the images being displayed - it means getting really good precision is very difficult.
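The breakdown above can be sketched as a sum of per-stage delays. All figures come from tomf's post; the struct is invented for illustration and the settle/scanout numbers are his DK1-specific guesses, not general constants.

```cpp
// tomf's prediction-time breakdown for a 60 Hz vsync-locked pipeline,
// reading the sensor at the start of the frame (DK1-like numbers).
struct ScanoutModel {
    float frameTimeSec;   // #1: ~16.7 ms until the next flip/present
    float scanoutSec;     // #2: first-to-last scanline time (~16.2 ms on DK1)
    float pixelSettleSec; // #3: average full pixel settle time (~17 ms guess)

    // Predict to mid-screen, mid-settle, mid-persistence.
    float predictionTargetSec() const {
        return frameTimeSec            // #1: wait for the next present
             + scanoutSec * 0.5f       // #2: middle of the screen
             + pixelSettleSec * 0.5f   // #3: half the settle time
             + frameTimeSec * 0.5f;    // #4: middle of the persistence window
    }
};
```

With tomf's numbers (16.7 ms, 16.2 ms, 17 ms) this lands at roughly 41.7 ms, matching his total; note his point that a latency tester triggers on the first visible pixel change, so it measures less than this full prediction target.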
For more about the effect of things like #4, it's well worth reading Michael Abrash's blog, as he goes into considerable detail on these things. - geekmaster (Protege):
"tomf" wrote:
Prediction time is a bunch of things. ... For more about the effect of things like #4, it's well worth reading Michael Abrash's blog as he goes into considerable detail on these things.
Thanks for the great infodump! :D - stereo (Honored Guest): Thank you for those tips.
- RiftingFlotsam (Explorer): Another thing I've mentioned elsewhere in the past is now relevant here, since the open-sourcing of the latency tester revealed that it uses an RGB colour sensor.
If the consumer kit does include an integrated sensor for real-time latency feedback, the same sensor could be used to calibrate the colour of the screen, compensating for the uneven subpixel deterioration in OLEDs.
As many will know, blue OLED subpixels have significantly shorter effective lifespans than their red and green counterparts, and smart use of this sensor could keep the colour perfectly calibrated throughout the life of the device.
Ideally you would display a running average of the framebuffer in the sensor area, and design the latency flag signal so as to maintain representative pixel wear in this space.
I am very excited for the image quality and consistency that such a system could provide to both developers and consumers. No more wondering what your work will look like on whatever shitty/poorly configured panel the viewer is using.
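The colour-calibration idea above amounts to comparing what the sensor reads against what the framebuffer says that patch should show, then deriving per-channel gains. This is a speculative sketch of that comparison; the names, the gain cap, and the simple ratio model are all assumptions, not anything the latency tester actually implements.

```cpp
#include <algorithm>

// Hypothetical sketch of the wear-compensation idea: compare the RGB sensor
// reading against the expected framebuffer colour for the sensor patch, and
// derive per-channel gains to offset uneven subpixel ageing (e.g. faded blue).
struct ChannelGains { float r, g, b; };

ChannelGains calibrate(float expectedR, float expectedG, float expectedB,
                       float sensedR, float sensedG, float sensedB) {
    auto gain = [](float expected, float sensed) {
        // Avoid divide-by-zero on a dark patch; cap the boost at 2x.
        if (sensed <= 0.0f) return 1.0f;
        return std::min(expected / sensed, 2.0f);
    };
    return { gain(expectedR, sensedR),
             gain(expectedG, sensedG),
             gain(expectedB, sensedB) };
}
```

A worn blue channel that reads 0.4 where 0.5 was expected would get a 1.25x gain; as the post notes, the displayed patch would need to track a running average of the framebuffer so the sensor area wears representatively.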