Forum Discussion
danknugz (Superstar)
11 years ago
VR performance disparity between devs
Some games/demos seem to implement head/positional tracking better than others; that is to say, some suffer from great amounts of judder while others don't, relative to how the non-VR version of the game performs. A good example is Live for Speed vs. Euro Truck Simulator 2. Both run fine in non-VR mode on my laptop, and Live for Speed runs great, with no judder or lag in the head tracking or positional tracking. ETS2, however, runs horribly in VR mode: the framerate drops by at least 50% and there is considerable judder.
I haven't started developing yet so I can't speak to this technically, but when I do start, this is going to be a priority for me. We have also seen that the Cyberspace demo works with flawless tracking for pretty much everyone, and that was made with Unity. So what exactly is causing this disparity? Is head tracking easier to implement in demos/games made with Unity, or do they just run better in VR? Is it an AMD/Nvidia thing, with issues in the runtime? Or is it solely up to the dev and how they implement timewarp/lookahead algorithms?
13 Replies
- DiCon (Honored Guest): By now I am convinced that 90% of all reported "tracking judder" problems are caused by bad performance. The demo I am currently working on made me realize that the experience instantly becomes really bad as soon as you drop slightly below 75fps.
The reason for this is vsync. At 75fps your PC has about 13.3ms to render each frame. If rendering takes just a little longer (say, the equivalent of 73fps), the frame misses the next vsync and has to wait for the one after. Because rendering will not start again until that happens (unless you use triple buffering), this occurs on every single frame. The application therefore presents on only every second vsync, which means a 50% fps drop, and that makes VR almost unbearable.
Then why is everybody complaining about tracking instead of low fps? Because you mostly notice 37.5fps when you turn your head. If only objects in your view, or even a coaster track, run at a low framerate, you will hardly notice it; that's why everybody is content with this on a regular screen. But if you move or rotate your head and the world does not respond immediately, you notice it at once.
I still think that vsync (and timewarp) is essential for good VR, but unfortunately it also creates an extreme situation where you may not miss the 75fps requirement by even a little: it is 75fps or 37.5fps, never in between. The typical user will not be able to understand the problem and react to it, at least not until they gain some experience and can actually tell performance-related judder from genuine misinterpretation of sensor data.
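The vsync arithmetic above can be sketched in a few lines. This is a toy model, assuming double buffering, where a frame is displayed at the first vsync boundary after it finishes rendering; the render times are illustrative:

```python
import math

REFRESH_HZ = 75.0
VSYNC_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~13.33 ms budget per frame

def displayed_fps(render_time_ms: float) -> float:
    """With vsync and double buffering, a frame is shown at the first
    vsync boundary after rendering finishes; missing one boundary means
    waiting a whole extra refresh interval."""
    intervals = math.ceil(render_time_ms / VSYNC_INTERVAL_MS)
    return REFRESH_HZ / intervals

print(displayed_fps(12.0))   # fits the budget -> 75.0
print(displayed_fps(13.7))   # just misses    -> 37.5
```

Note the cliff: a 5% overrun in render time halves the displayed framerate, exactly the 75-or-37.5 behaviour described above.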
A nice solution without any user requirements would be a render engine that can finish a frame early, but for most render pipelines that is really hard to do. Another option might be an adaptive detail level, so the engine can decide on its own to trade some eye candy for performance. - rupy (Honored Guest): This game runs at 200 FPS minimum. Try it before you praise vsync. Actually it never occurred to me to enable vsync; will do later and see if you can enable it for, say, 300 FPS!?
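The adaptive detail level DiCon suggests could look something like this hypothetical controller. The quality levels and the 0.8 headroom factor are made-up values for illustration, not anything from an actual engine or SDK:

```python
BUDGET_MS = 1000.0 / 75.0  # frame budget at 75 Hz

class AdaptiveQuality:
    """Hypothetical controller: shed detail when a frame misses the
    vsync budget, restore it cautiously when there is clear headroom."""

    def __init__(self, levels=("low", "medium", "high"), start=2):
        self.levels = levels
        self.index = start  # start at "high"

    def update(self, frame_time_ms: float) -> str:
        if frame_time_ms > BUDGET_MS and self.index > 0:
            self.index -= 1  # missed vsync: drop a quality level now
        elif frame_time_ms < 0.8 * BUDGET_MS and self.index < len(self.levels) - 1:
            self.index += 1  # comfortable headroom: raise quality again
        return self.levels[self.index]

q = AdaptiveQuality()
print(q.update(14.0))  # over budget -> "medium"
print(q.update(14.0))  # still over  -> "low"
print(q.update(9.0))   # headroom    -> "medium"
```

The point of the asymmetry (drop immediately, raise only with 20% headroom) is to avoid oscillating across the 13.3ms boundary, which would itself look like judder.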
- jherico (Adventurer):
"DiCon" wrote:
By now I am convinced that 90% of all reported "tracking judder" problems are caused by bad performance. The demo I am currently working on made me realize that the experience instantly becomes really bad as soon as you drop slightly below 75fps.
I get terrible judder on a simple OpenGL scene containing a skybox, a floor, and a single cube. The judder is tied to the disparity between the refresh rates of the primary monitor and the Rift. It is NOT tied to the FPS of the application. I can reliably make the judder go away if I do any one of the following:
- Force my primary monitor to run at 75 Hz
- Make the rift my primary monitor
- Make the rift run at 60 Hz, the same as the normal refresh of my primary monitor.
If it were a performance-related issue, the first two changes wouldn't have any impact. - danknugz (Superstar): DiCon, everything you say makes complete sense, and I agree that it is important to differentiate between the coding implementation of VR and simply not hitting 75fps. But what about the people you hear of running a 780 Ti who still get judder in certain games while other games work fine? In those cases I can't help but think the issue isn't failing to hit 75fps, but rather a driver issue with their particular chipset, or the developer's implementation of VR (I say this with naivete, as I haven't developed a demo yet).
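jherico's refresh-rate observation can be illustrated numerically: when frame presentation is paced at one display's rate but scanned out at another's, some scanouts necessarily re-show the previous frame. This is a toy count with ideal timing and integer rates, not a model of any actual driver:

```python
def duplicated_scanouts_per_second(source_hz: int, display_hz: int) -> int:
    """Count scanouts per second that re-show the previous frame when
    frames arrive at source_hz but the display refreshes at display_hz."""
    # Which source frame is current at each of the display's scanouts?
    frames = [k * source_hz // display_hz for k in range(display_hz)]
    return sum(1 for a, b in zip(frames, frames[1:]) if a == b)

print(duplicated_scanouts_per_second(60, 75))  # 60 Hz pacing on a 75 Hz Rift -> 15 repeats/s
print(duplicated_scanouts_per_second(75, 75))  # matched rates -> 0
```

A frame repeated 15 times per second is a very visible periodic hitch, which would explain why forcing both displays to the same refresh rate (any of jherico's three workarounds) makes the judder vanish.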
Edit: Jherico, I have also heard this piece of the puzzle from others, and forgot to mention it. It gets more complicated once you account for laptops whose built-in screens may or may not go above 60Hz, different monitors, how those monitors are connected (VGA/HDMI/DVI), etc. But this sounds like it could all be fixed with a driver/runtime update? Personally, I can get 75Hz on my laptop when running the Rift as the primary display, but only get smooth, judder-free tracking in LFS, Cyberspace, and HL2. The fact that it runs smoothly only in these games and not in others like ETS2 leads me to believe the issue is elsewhere (possibly in how the VR implementation is optimized, which leads me back to my original question, I guess).
I also have an AMD card, so you have to consider the AMD vs. Nvidia question, where AMD seems more susceptible to quirky behavior in VR. - jherico (Adventurer):
"danknugz" wrote:
Edit: Jherico, I also have heard this piece of the puzzle from others as well, and forgot to mention it. It gets more complicated as you account for those with laptops whose built-in screens may or may not go above 60Hz, different monitors, how those monitors are connected (VGA/HDMi/DVI), etc. But this sounds like it could all possibly be fixed with a driver/runtime update?
It's hard to say for certain without understanding the root cause. However, the fact that making the Rift the primary screen solves the issue (at least on my setup) makes me hopeful that there is a purely software solution. That fix, though, will almost certainly have to come from nVidia. - DiCon (Honored Guest): Oops, looks like I entered a minefield here :)
I hereby officially reduce my previous estimate from 90% to 50% ;)
"rupy" wrote:
This game runs at 200 FPS minimum. Try it before you praise Vsync. Actually it never hit me to enable Vsync, vill do later and see if you can enable it for say 300 FPS!?
The problem is not with games that achieve a high framerate. If you get more than 75fps and enable vsync, the framerate should drop to exactly 75fps and everything is fine. If it drops below 75fps, something is wrong with the game, as that would suggest the render time varies from frame to frame. If you have 200fps, that is nice; then it should not matter at all whether you enable vsync or not, at least it should not introduce anything experienced as judder (it may increase latency from ~5ms to ~13ms, though).
I am talking about games which run just below 75fps; there you get an instant 50% fps drop, which makes head tracking really unbearable.
"jherico" wrote:
"DiCon" wrote:
By now I am convinced that 90% of all reported "tracking judder" problems are caused by bad performance. The demo I am currently working on made me realize that the experience instantly becomes really bad as soon as you drop slightly below 75fps.
I get terrible judder on a simple OpenGL scene containing a skybox, a floor, and a single cube. The judder is tied to the disparity between the refresh rates of the primary monitor and the Rift. It is NOT tied to the FPS of the application. I can reliably make the judder go away if I do any one of the following:
- Force my primary monitor to run at 75 Hz
- Make the rift my primary monitor
- Make the rift run at 60 Hz, the same as the normal refresh of my primary monitor.
If it were a performance related issue, the first two changes wouldn't have any impact.
OK, I agree that this is bad. But am I right that this does not apply to Direct to Rift mode? (If it does, please tell me how you made it work in OpenGL; I am waiting desperately for a fix on this.)
I see a lot of posts complaining about head tracking in Direct to Rift mode as well, and I have to admit that at first I too thought it was the drivers, just as everybody was saying... until I noticed that all of my issues were performance-related.
But I agree that my initial claim that most problems are performance problems was a little... extreme... - jherico (Adventurer):
"DiCon" wrote:
ok, I agree, that this is bad. But am I right, that this does not apply to direct to rift mode? (if it does, please tell me how you made it work in OpenGL - Waiting desperately for a fix on this)
I am completely unable to get any OpenGL rendering working in Direct HMD mode, so I have no idea how the judder issues behave in that mode. - Fredz (Explorer):
"DiCon" wrote:
I am talking about games which run just below 75fps - here you get an instant 50% fps drop, which makes head tracking really unbearable. [...] ok, I agree, that this is bad. But am I right, that this does not apply to direct to rift mode?
I think you pretty much nailed it for Direct mode.
In the Unite 2014 talk "Getting the Most Out of the Oculus Rift", the developers explained that the current SDK needs v-sync for prediction and that it uses synchronous time warp, which leads to judder when v-sync is missed.
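The synchronous vs. asynchronous distinction can be captured in a toy model. This is only an illustration of the idea, not the SDK's actual implementation: with synchronous timewarp a missed frame re-shows an image rendered for a stale head pose, while an asynchronous compositor re-projects the last completed frame with the freshest pose at every vsync:

```python
def poses_shown(frame_done, latest_pose, asynchronous):
    """frame_done[i]: did the app finish a new frame in time for vsync i?
    latest_pose[i]: head pose sampled at vsync i (here, yaw in degrees).
    Returns the head pose each displayed image corresponds to."""
    shown = []
    last_rendered_pose = latest_pose[0]
    for done, pose in zip(frame_done, latest_pose):
        if done:
            last_rendered_pose = pose
        # Async timewarp re-projects with the current pose every vsync;
        # sync shows whatever pose the last completed frame was rendered for.
        shown.append(pose if asynchronous else last_rendered_pose)
    return shown

frame_done  = [True, False, True, False]  # app misses every other vsync
latest_pose = [0, 10, 20, 30]             # head turning steadily

print(poses_shown(frame_done, latest_pose, asynchronous=False))  # [0, 0, 20, 20]
print(poses_shown(frame_done, latest_pose, asynchronous=True))   # [0, 10, 20, 30]
```

In the synchronous case the displayed pose stalls whenever a frame is missed (the [0, 0, 20, 20] stutter), which is exactly the orientation judder described in this thread; the asynchronous case keeps orientation fresh even at half the app framerate.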
They hinted at asynchronous time warp in their last slides and said to stay tuned, which I guess should be able to get rid of the problem. I suspect the recent tweet from someone at Oculus about a performance breakthrough was about this as well. - Itsinthemind (Expert Protege): Thanks for the info, which would explain a few things. I am comparing my app (download: http://www.lightandmagic.co.uk ) with the Tuscany demo and there is no comparison. Tuscany runs smoothly at 75 fps; mine only runs reasonably smoothly in Extended mode at 60 fps, whereas in Direct mode, although hitting 75 fps, it judders and the frame rate fluctuates more.
I have reduced vertices as much as the project allows and applied occlusion culling etc. Of course Tuscany is a relatively simple scene, but I am wondering whether I have applied everything possible in Unity to make it run as efficiently as possible, i.e. have I overlooked anything that needs to be set in the project settings? Or is it simply a matter of waiting until Oculus delivers the much anticipated update? - Fredz (Explorer): I think you may have some expensive shaders somewhere, maybe related to shadows, which seem quite costly in Unity.
It would be nice if someone were able to pinpoint the culprit in these juddering demos. The author of SightLine: The Chair managed to remove judder in some scenes but not everywhere; no idea what he tried.