Forum Discussion
bluenote · Explorer · 12 years ago
Small Rift Example in Scala on GitHub
I eventually found the time to clean up my example code. As a result, I have published a small Rift example project on GitHub. It's written in Scala and uses JOVR. I tried not to use a lot of fancy Scala stuff, but rather tried to make it look a bit like pseudo code (okay, there are still things to clean up on my todo list, but it should do). The actual Rift code is just a few hundred lines of code. So hopefully it is useful for programmers in other languages as well. BTW: You do not even have to have Scala installed to run it. If you just want to run the demo out of curiosity, I have written a small HowTo in the GitHub README.
Unfortunately, I still have one big issue: Judder. I'm currently experimenting with two approaches: Timewarp on + vsync on (running at a steady 75 fps, so it looks like the SDK syncs w.r.t. the correct display), and timewarp off + vsync off (via hmdCaps...). In the latter case, my demo runs at about 500 fps, so it takes about 2 ms to render a frame (which I can verify from the frameTiming.Delta value). But with both approaches I have a very strong judder.
At first I thought it was a problem with Linux (although the OculusWorldDemo runs smoothly), maybe due to issues with LWJGL buffer flipping. But then I tested the demo under Windows, and it is the same: both approaches have judder. My next test was to run the JOVR example demos (also under Windows), and to my surprise they also have the exact same judder. Has anyone else had this problem with the JOVR examples? Judging from @jherico's Youtube videos, I would say that they are running perfectly smooth...
And what really confuses me: How is it possible to have a strong judder at 500 fps in the first place? The sensor readings are just 2 ms old. Even without prediction this should give a somewhat (laggy but) smooth appearance. So my next guess was that GetEyePoses simply over- or under-predicts (that's also how the judder actually feels -- a constant shivering due to a mismatch between the prediction and the actual sensor reading). My idea was to replace GetEyePoses by querying the tracking state manually at specific time points. I tried to specify different time points, e.g., thisFrameTime or a point in the past (to disable any prediction), thisFrameTime + 2 ms, midScanoutTime... But no matter what prediction time point I use, the judder feels the same. I would even claim that the specified time point might be ignored, since without prediction there should be a feeling of a smooth lag.
So currently I simply cannot explain why this is happening. Any ideas what else could cause this? Am I doing something wrong?
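To make the over-/under-prediction hypothesis concrete, here is a minimal linear-extrapolation sketch in plain Scala (no JOVR calls; the yaw, rate, and prediction intervals are made-up illustrative values, and the real SDK predictor is of course more sophisticated):

```scala
object PredictionSketch {
  // Linear extrapolation: predict yaw at (t + dt) from the current yaw and
  // angular rate. This only mimics what a simple predictor might do; how the
  // SDK actually predicts is not known from this thread.
  def predictYaw(yaw: Double, yawRate: Double, dt: Double): Double =
    yaw + yawRate * dt

  def main(args: Array[String]): Unit = {
    val yaw     = 0.50 // rad, latest sensor reading (illustrative)
    val yawRate = 2.0  // rad/s, estimated angular velocity (illustrative)

    // Prediction for roughly mid-scanout (~5 ms) vs. a far 50 ms horizon:
    val near = predictYaw(yaw, yawRate, 0.005)
    val far  = predictYaw(yaw, yawRate, 0.050)

    // If the head decelerates within those 50 ms, `far` overshoots the true
    // pose; the per-frame correction back toward the real reading is what
    // would be perceived as constant shiver.
    println(f"near=$near%.3f far=$far%.3f")
  }
}
```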
2 Replies
- bluenote · Explorer: I did a bit more research regarding the judder. In case anyone is interested / willing to help:
One of my guesses in the previous post was that obtaining the tracking state manually would ignore the prediction time point. I can definitely say that this was wrong. I added a small tracking logger to the example, which simply writes the yaw (obtained from the tracking pose quaternion) into a CSV, but w.r.t. different prediction time points. With the yaw logger enabled, I recorded some smooth yaw movements to see what the prediction is really doing. The following figure shows the yaw for different time points: (1) frameTiming.ThisFrameSeconds - 1 sec (to ensure I really get the latest raw reading), (2) frameTiming.ThisFrameSeconds itself, (3) frameTiming.ThisFrameSeconds + 10 ms, (4) frameTiming.ThisFrameSeconds + 50 ms, and (5) frameTiming.ScanoutMidpointSeconds (which seems to be about +5 ms in the future). Some results:
Zoomed in to a 100ms interval:
This actually looks pretty good: The raw readings have a slight noise, but apparently the SDK applies some nice smoothing if one requests the current state (as expected for linear prediction). The predicted yaw values make perfect sense and they also follow a very smooth curve; no visible jitter in the predicted values. It is a bit more interesting at the point where the head movement changes direction:
Here the large 50 ms prediction is a bit shaky -- but again, this looks exactly as it should and imho cannot explain the judder.
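For reference, extracting yaw from the orientation quaternion can be sketched like this. The code is self-contained Scala; the formula assumes the Rift's convention of yaw being rotation about the vertical Y axis, and `yawOf`/`csvLine` are illustrative names, not the actual logger code from the example:

```scala
object YawLogger {
  // Yaw (rotation about the Y axis) from an orientation quaternion (w, x, y, z).
  // Derived from the rotation-matrix form: the rotated forward vector's
  // horizontal components give atan2(2(xz + wy), 1 - 2(x^2 + y^2)).
  def yawOf(w: Double, x: Double, y: Double, z: Double): Double =
    math.atan2(2.0 * (x * z + w * y), 1.0 - 2.0 * (x * x + y * y))

  // One CSV line per frame: timestamp followed by yaws for the different
  // prediction time points.
  def csvLine(t: Double, yaws: Seq[Double]): String =
    (t +: yaws).map(v => f"$v%.6f").mkString(",")

  def main(args: Array[String]): Unit = {
    // A pure 90-degree yaw rotation: q = (cos 45°, 0, sin 45°, 0)
    val h = math.Pi / 4
    println(yawOf(math.cos(h), 0, math.sin(h), 0)) // ~ pi/2
  }
}
```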
My next guess was that the judder may be caused by too long timeouts for garbage collection. However, this is also unlikely. To analyze this I subtracted the times of two consecutive frameTiming.ThisFrameSeconds to see the distribution of time deltas.
Result with a serial GC (-XX:+UseSerialGC):
Result with a parallel GC (-XX:+UseParallelGC):
The average delta is ~2.4 ms. In the first case the peaks are almost too small to cause judder (even combined with the next frame, the time rarely exceeds the ~13 ms duration of a full frame). In the latter case the GC spikes are several seconds apart. If GC were causing the judder, I should have a completely judder-free experience at least for a few seconds. But this is not the case; the judder is more like an "every frame" judder and not the usual "4 dropped frames per second" judder.
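The delta analysis above can be sketched as follows (self-contained Scala; the timestamps are simulated, and the 75 Hz "dropped frame" threshold is my reading of the numbers in the post):

```scala
object FrameDeltas {
  // Consecutive differences of frameTiming.ThisFrameSeconds samples,
  // used above to look for GC-induced spikes.
  def deltas(timesSec: Seq[Double]): Seq[Double] =
    timesSec.sliding(2).collect { case Seq(a, b) => b - a }.toSeq

  // Count frames whose delta exceeds one 75 Hz refresh (~13.3 ms).
  def droppedFrames(timesSec: Seq[Double], refreshSec: Double = 1.0 / 75.0): Int =
    deltas(timesSec).count(_ > refreshSec)

  def main(args: Array[String]): Unit = {
    // Simulated timestamps: steady 2.4 ms frames with one 20 ms GC spike.
    val times = Seq(0.0, 0.0024, 0.0048, 0.0248, 0.0272)
    println(deltas(times).map(d => f"${d * 1000}%.1f ms").mkString(", "))
    println(s"dropped: ${droppedFrames(times)}")
  }
}
```

An "every frame" judder would show up here as uniformly large deltas, whereas the measured distribution only has rare spikes, which supports ruling out GC.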
Any other ideas for the cause of the judder or other experiences with JOVR would be highly appreciated.
- cjwidd · Explorer: [deleted]