Jutter must die!

Scofthe7seas
Honored Guest
Ok, I know this has been brought up here and there, but I think isolating and fixing the jutter issue should be a huge priority. We need to figure out exactly what is causing it, and see how it can be worked out for EVERYBODY on all applications. First of all, we need to gather basic information about the issue.
In case you don't know, jutter is a weird leaping of the image when you look around in the Rift. I suppose it could also be called stutter.
Does everybody have jutter to one degree or another? Are there some people who have no idea what it is, and have never seen it? If so, what setup are you using?
For me, I have an Nvidia GTX 780 and an Intel i7-2600K at 3.5GHz, with 16GB of RAM. My main monitor is a TV running at 60Hz. I don't experience jutter in everything. Forced DX11 mode fixes many experiences for me (Dreadhalls, Cyberspace, Helix). Framerate doesn't explain the jutter when it is present, because the framerate shown is typically well above 75. There is mostly no jutter in applications that run direct-to-Rift.
Some demos have jutter no matter what the setup is. Sightline is one of them, though its jutter can be minimized by making the Rift the primary monitor (and on Nvidia, it doesn't hurt to set "run at highest available refresh rate" in the Nvidia control panel).
Minecrift gives me jutter under every circumstance. I even set the Rift as my ONLY monitor and it still happened.
I've heard some people say it is a problem with timewarp, but if it is, shouldn't it affect everybody? And if it is, does that mean it is inherent in the SDK? Both Quake Rift conversions have timewarp settings, and I have experienced NO jutter in either of those games.
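For anyone who hasn't dug into it, here is a rough conceptual sketch of what timewarp does. This is NOT the SDK's actual code; every name in it is a made-up stub so the sketch compiles, but it shows the ordering that matters:

#include <cstdio>

struct Pose  { float yaw = 0, pitch = 0, roll = 0; };
struct Image { int id = 0; };

// All four functions below are hypothetical placeholders, not SDK calls.
Pose  sampleHeadPose()                               { return Pose{}; }  // read the tracker
Image renderScene(const Pose&)                       { return Image{}; } // expensive scene render
Image reproject(Image img, const Pose&, const Pose&) { return img; }     // cheap 2D warp
void  presentToHmd(const Image&)                     { std::puts("frame presented"); }

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        Pose  renderPose = sampleHeadPose();        // pose read when rendering starts
        Image rendered   = renderScene(renderPose); // may eat most of the ~13.3ms budget at 75Hz

        Pose latestPose = sampleHeadPose();         // pose read again just before scanout
        presentToHmd(reproject(rendered, renderPose, latestPose));
        // If the present misses a vsync, the display repeats an image with a
        // stale pose; that discrete jump during head rotation reads as judder.
    }
}

The warp step is cheap, which is how timewarp hides render latency; but if the present still misses vsync, you get exactly the kind of jump this thread is about.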
I have no intention of seeming demanding. I just believe as a community we can narrow down the root of this issue by sharing information in one place, instead of it being mentioned here and there in various threads for various demos/applications. Together, we can put a stop to this menace!
62 REPLIES

Scofthe7seas
Honored Guest
"Wireline" wrote:
Reading Michael Abrash's blog, he says judder is unavoidable unless refresh rates get insanely high. A low frame rate means each pixel stays "on" longer, just as a low refresh rate does. Two different causes, same result.

Link to the Abrash blog piece: http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/

I think the industry is already working pretty hard on it though 😄


I snipped your quote to shorten my reply; I hope you don't mind. The problem, though, is that there wasn't any judder on the DK1. Also, the problem only happens if you move your head. If you stay still while action happens in front of your view, it is butter smooth. And it's not present in every application. I suppose it can be argued that some applications are more intensive than others, and therefore judder more, but I don't think that's the case.
Also:
Does anybody responding have judder issues? If so, what is your setup, and what demos/games are giving you the problem?
The problem varies enough that it can't be blamed on the way the brain works. The closest I can come to describing it is a kind of vertical V-sync tearing. That obviously points to a refresh rate problem, but it doesn't explain why it only appears when you look around.
I want to see if there are common factors linking the issue. I forgot to mention that I am also on Windows 7 64-bit. I did have an issue where my Rift wasn't being detected properly by the software until an update fixed it. I know many people didn't have this problem. Perhaps an Oculus rep can share a bit about what was changed to fix that?

Wireline
Explorer
"Scofthe7seas" wrote:
"Wireline" wrote:
Reading Michael Abrash's blog, he says judder is unavoidable unless refresh rates get insanely high.....low frame rate leads to lower refresh of each pixel, just as low refresh does, hence more time each pixel stays "on". Two different causes, same result

Link to the Abrash blog piece: http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/

I think the industry is already working pretty hard on it though 😄


I snipped your quote to shorten my reply length; I hope you don't mind. The problem there though, is that there wasn't any judder on the DK1. Also, the problem only happens if you move your head. If you stay still, and action happens in front of your view, it will go butter smooth. Also, it's not present in every application. I suppose it can be argued that some applications are more intensive than others-therefore more judder. But I don't think this is the case.
Also:
Does anybody responding have judder issues? If so, what is your setup, and what demos/games are giving you the problem?
The problem is varies enough that it can't be blamed on the way the brain works. The closest I could use to describe it is like a kind of vertical V-sync tearing. Which obviously points to a refresh rate problem, but doesn't explain why it only gives you problems when looking around.
I want to see if there are common linking factors involved with the issue. I forgot to mention that I am also using Windows 7 64bit. I did have an issue before a fix came out for the rift software, where my rift wasn't being detected in the software properly before an update was put out. I know many people didn't have this problem. Perhaps an Oculus rep can share a bit of what was fixed to stop that problem?


I could be wrong, but I think what you are experiencing is low frame rate, which looks like judder for the reasons mentioned above. If you are dropping below 75FPS, it's basically the same as dropping below 75Hz (low persistence). Some demos will look fine because you are hitting 75FPS.

As I say, I may be wrong, but if possible, find out what your frame rate is in the applications that give you "judder". I know that in Titans of Space, when I am seeing 75FPS on the indicator, it's pretty judder free, then it gets worse under 75. In DCS World I am nowhere near 75FPS and it's a judder fest. We really need a "Rift FPS counter" that slaps the number up in the middle of the screen 😄

If you want to get to the bottom of judder, then you need to eliminate variables. "My card should be able to do it" is not eliminating a variable; having a visible frame rate counter is.
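Something like this is all I mean; it's a generic C++ sketch, not tied to any SDK (tickFrame is a name I made up, and the loop in main just simulates a ~75Hz renderer; in a real demo you'd call tickFrame once per presented frame):

#include <chrono>
#include <cstdio>
#include <thread>

// Call tickFrame() once per presented frame. At 75Hz the budget is
// ~13.3ms per frame; a reported average above that means you are
// dropping below the DK2's refresh rate, whatever the demo claims.
void tickFrame() {
    using clock = std::chrono::steady_clock;
    static auto last = clock::now();
    static int frames = 0;
    static double accumMs = 0.0;

    auto now = clock::now();
    accumMs += std::chrono::duration<double, std::milli>(now - last).count();
    last = now;
    ++frames;

    if (accumMs >= 1000.0) { // report roughly once per second
        std::printf("%.1f FPS (avg %.2f ms/frame)\n",
                    frames * 1000.0 / accumMs, accumMs / frames);
        frames = 0;
        accumMs = 0.0;
    }
}

int main() { // stand-in loop simulating a ~75Hz render loop
    for (int i = 0; i < 300; ++i) {
        std::this_thread::sleep_for(std::chrono::milliseconds(13));
        tickFrame();
    }
}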

Scofthe7seas
Honored Guest
I get over 100 fps in Sightline, and both amusement park rides were fixed by switching to DX11 mode. Now, it can be argued that DX11 is only fixing them by pushing me over 75 fps, and without an FPS counter like you mentioned I can't prove otherwise. But I can't imagine a GTX 780 dropping below 75 fps in a basic Unity demo simply because the DX version is different.
Also, many people's judder issues are cleared up by making the Rift their main display. Switching displays shouldn't make any difference to framerate.
Then there's the worst culprit of them all, Minecrift. Again, I don't have an FPS counter there, but I have gotten well over 200 fps in regular Minecraft.
(I do not get any judder at all on Titans)

Wireline
Explorer
"Scofthe7seas" wrote:
I get over 100 fps on sightline, and both amusement park rides were fixed with switching to DX11 mode. Now, it can be argued that I am only getting over 75 fps on a basic unity demo by using dx11, because I don't have a FPS counter like you mentioned. But I can't imagine a GTX780 would get less than 75 fps on those applications simply because the DX version is different.
Also, many people's judder issues are cleared up by switching the rift to their main display. Switching displays shouldn't make any difference in framerate.
Then there's the worst culprit of them all, minecrift. Although, again, I don't have an FPS counter, I have gotten well over 200 fps on regular minecraft.
(I do not get any judder at all on Titans)


Hehe, I wish that were true about the 780 😄 There are many folks with absolute beast cards that see drops below 75 in certain games. They will also see peaks above it, but there has been real variation in performance. Having a spanky card doesn't seem to be a guarantee, which is why the ability to put an FPS counter up would be so useful.

Re: switching to main display, for some people this comes from having a 60Hz main monitor. If you do not set your Rift to primary, some quirk of Windows / hardware can cause it to run at 60Hz. If you have a 75Hz monitor you will not see this, and if you set the Rift to primary you will not see this. Of course, if your monitor is 75Hz and set to that in your Nvidia / AMD settings panel, then I am talking outta mah butt... but that happens a lot lol.
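If anyone wants to verify the quirk rather than guess, Windows will report the refresh rate each attached display is actually running at. A quick Win32 sketch of my own (untested against a Rift; uses the ANSI APIs and links against user32.lib):

#include <windows.h>
#include <cstdio>

// Lists each display attached to the desktop with its current mode and
// refresh rate. If the Rift's line reads 60Hz while your 60Hz monitor
// is primary, you've hit the quirk described above.
int main() {
    DISPLAY_DEVICEA dd = {};
    dd.cb = sizeof(dd);
    for (DWORD i = 0; EnumDisplayDevicesA(nullptr, i, &dd, 0); ++i) {
        if (!(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP))
            continue;
        DEVMODEA dm = {};
        dm.dmSize = sizeof(dm);
        if (EnumDisplaySettingsA(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm)) {
            std::printf("%s: %lux%lu @ %luHz%s\n",
                        dd.DeviceName, dm.dmPelsWidth, dm.dmPelsHeight,
                        dm.dmDisplayFrequency,
                        (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
                            ? " (primary)" : "");
        }
    }
    return 0;
}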

As an example, I can get a really nice smooth experience in Half-Life 2. I can get the FPS counter up in that, and it shows me I have 200+ fps. It's lovely 🙂

There may well be more to it of course, but if you can eliminate frame rate drops from the equation then it will help on your judder hunt 😄

Scofthe7seas
Honored Guest
As you said, without a frame counter it's difficult to prove the issue (although Sightline does actually have a frame counter).
But there just isn't any chance that those two rides have frame problems. They are fairly low-poly Unity demos. Then there is also Minecrift, as I mentioned. Impossible to be fps related, and the judder is HORRIBLE in that. I realize that a good card doesn't guarantee performance, but unless some of those Unity demos are coded in the most inefficient way possible, there's no way I'm getting below 75 fps, regardless of DX version. There aren't many FPS counters for Rift demos, but my card seems to do alright with most modern games.
Then there is the issue of the DK1 never having any problems with this (that I've heard of). People were trying every game under the sun on the DK1, consistently getting under 60 fps, but not reporting any problems with judder (I never even heard the term used regarding the DK1). Another noteworthy exception is a game called "World of Diving". It's HORRIBLY optimized; I get a super low framerate and judder, but while everybody who plays it reports poor framerates, many are not experiencing the judder at all. Again, I'm trying to get a list of all potential factors: is anybody reading this NOT having the problem?

Lagahan
Explorer
Some judder can be FPS related, but mine is certainly not. It's a 3-4 frame jump backwards every couple of degrees of head rotation that really kicks you in the nuts. It doesn't happen in anything other than Unity demos for me, though!
DK1 status: Delivered, DK2 status: Delivered Rigs: CPU: i5 2500k @4.8GHz GPU: R9 290 @1.1GHz RAM: 16GB OS: Windows 7 Pro x64 CPU: i7 4710MQ 3.3GHz GPU: 2x880M 8GB RAM: 8GB OS: Windows 7 Pro x64

Cgpnz
Honored Guest
I hope OVR tries to go back to the 0.3.x SDK. Why the runtime, is there some inter-process lagging going on? By "inter-process" I mean that it could allow the operating system to do its usual task-switching thing and interrupt a frame or two now and then, hence the screen flashing.

Is low-persistence frame banding interacting with low frame rate the issue? If so, why not offer the option to turn off low persistence? Someone said F1 or F5 does that (I have reverted to my RiftUp!, which shows the scenery at the same resolution but with no judder), but I doubt it; otherwise it would be a widely known temporary workaround.

So far, only low or moderate triangle-count demos run on beastly GPU cards give acceptable results. CV1 is looking impossible if this keeps up.

As Cybereality keeps saying, OVR are working on it, so grumbling won't help matters; we are just venting here. I just hope they are open to serious development reversals, say going back to the 0.3.x SDK, or offering a way to turn off low persistence in the meantime.

jherico
Adventurer
"Cgpnz" wrote:
I hope OVR tries to go back to the 0.3.x SDK. Why the runtime, is there some inter-process lagging going on? [...]


It's not caused by the runtime. It's caused by either low framerates or v-sync issues in virtually all cases. V-sync problems should be fixed by direct HMD mode, possibly by the new nVidia VR Direct stuff, or by using a Direct3D wrapper for OpenGL. Low framerates are up to the applications to fix.
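One way to tell those two causes apart without guessing is to bucket the time between buffer swaps: steady ~16.7ms intervals point to a 60Hz v-sync lock, while mostly ~13.3ms intervals with occasional doubled ones point to the app dropping frames. A generic sketch, with logPresent being a hypothetical hook you'd call right after each swap (the loop in main just simulates frames):

#include <chrono>
#include <cstdio>
#include <thread>

// Call logPresent() right after every buffer swap. A pile-up in the
// 60Hz-like bucket suggests the HMD is vsync-locked to 60Hz (the
// monitor quirk); mostly 75Hz-like intervals with occasional doubled
// ones suggest the application itself is dropping frames.
void logPresent() {
    using clock = std::chrono::steady_clock;
    static auto last = clock::now();
    static int near75 = 0, near60 = 0, late = 0, total = 0;

    auto now = clock::now();
    double ms = std::chrono::duration<double, std::milli>(now - last).count();
    last = now;

    if      (ms > 12.0  && ms < 15.0) ++near75; // ~13.3ms: healthy 75Hz
    else if (ms >= 15.0 && ms < 18.0) ++near60; // ~16.7ms: 60Hz vsync lock
    else if (ms >= 18.0)              ++late;   // doubled/late: dropped frame

    if (++total % 300 == 0)
        std::printf("75Hz-like: %d  60Hz-like: %d  dropped: %d\n",
                    near75, near60, late);
}

int main() { // stand-in loop: mostly 13ms frames with an occasional drop
    for (int i = 1; i <= 600; ++i) {
        std::this_thread::sleep_for(std::chrono::milliseconds(i % 50 ? 13 : 27));
        logPresent();
    }
}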
Brad Davis - Developer for High Fidelity Co-author of Oculus Rift in Action

Cgpnz
Honored Guest
"jherico" wrote:

It's not caused by the runtime. It's caused by either low framerates or v-sync issues in virtually all cases. V-sync problems should be fixed by direct HMD mode or possibly by the new nVidia VR direct stuff, or by using a Direct3D wrapper for OpenGL. Low framerates are up to the applications to fix.



I question this diagnosis.

My proof is a Tuscany.xml engine run with lots of triangles (I have a Perl SDK here that can generate geometry). It is very smooth on the RiftUp! at the same resolution, and judders to hell on the DK2.

What's the difference? Same video signal, same frame rates, same V-sync setting, whatever it was.

Cgpnz
Honored Guest
If the DK2 and CV1 runtimes are unworkable at low framerates, then perhaps the consumer OVR prospects are in trouble: to run CV1 with the expected triple-A gaming environments, you will need quad-SLI GTX 980s.
Please...