Forum Discussion
nickoneill
11 years ago · Honored Guest
Getting acceleration at better than 60hz?
Working on some demos with the rift again now that we have raw acceleration data (thanks 0.2.5!) but I'm getting capped with orientation at 60hz. Is there a way to uncap that if I build a custom version of LibOVR? I've looked around a bit in the project but C++ isn't my strongest language.
At only 60hz a lot of interesting acceleration data is dropped, it's making for lots of false negative results and there doesn't seem to be much I can do about it when I look at the raw data. I've got a workaround that I can use for the moment (keep an eye on the user interface section in the next couple of days...) but I'd really prefer to have lots more acceleration data!
8 Replies
- jherico (Adventurer)
"nickoneill" wrote:
Working on some demos with the rift again now that we have raw acceleration data (thanks 0.2.5!)
Raw orientation data has always been available by attaching a listener to the SensorDevice object. That's how the SensorFusion type connects to it.
"nickoneill" wrote:
but I'm getting capped with orientation at 60hz.
I think you're confusing frame rates with sensor data handling. If your frame rate is locked to vsync you can't hit any higher than 60hz on the Rift. You never want to fetch the sensor fusion data more than once per frame, because you don't want to get different sensor fusion results for each eye, or for different objects within the same eye. The sensor device, however, is sending messages at 1000hz, and they're being processed by the SensorFusion object at that rate, in a background thread.
"nickoneill" wrote:
Is there a way to uncap that if I build a custom version of LibOVR? I've looked around a bit in the project but C++ isn't my strongest language.
The SensorFusion class has a method: void SetDelegateMessageHandler(MessageHandler* handler);
If you provide a delegate to your SensorFusion, it will receive all the acceleration messages that the SensorFusion object receives, which by default arrive at 1000hz.
"nickoneill" wrote:
At only 60hz a lot of interesting acceleration data is dropped, it's making for lots of false negative results and there doesn't seem to be much I can do about it when I look at the raw data.
Attach a delegate to the SensorFusion object, or if you're not using sensor fusion, directly to the SensorDevice. You will receive messages at 1000hz. However, bear in mind that the messages are processed on a different thread than your main application thread, so access to shared objects needs to be thread-safe.
- nickoneill (Honored Guest)
Thanks! This info is handy.
Just to be clear: are you saying that using SensorFusion will never send data at more than 60hz with GetAcceleration? This makes sense, as a main Oculus guideline is "run your games at 60fps", and it matches what I've seen with the SDK.
- jherico (Adventurer)
"nickoneill" wrote:
Just to be clear: are you saying that using SensorFusion will never send data at more than 60hz with GetAcceleration?
SensorFusion doesn't send anything. If you add a delegate message handler, it will pass along the messages that are received from the SensorDevice object, which will get messages at ~1khz. SensorFusion will return you the latest information every time you call one of the getter methods. If you call SensorFusion.getAcceleration() multiple times in rapid succession you might or might not get slightly different values, depending on how much time passes between each call. You can almost certainly force them to be different by including a delay of 2 or 3 milliseconds between the calls. The 60hz limitation is just because the code only loops once per frame on the rendering thread.
What exactly is it you're trying to do with the acceleration data?
- nickoneill (Honored Guest)
Yes, sorry, I didn't mean to suggest that SensorFusion sends data - I'm requesting it with getAcceleration.
I'm not actually rendering the data directly, I'm updating a websockets bridge to deliver data elsewhere. I didn't make the original project (https://github.com/Instrument/oculus-bridge) but you can see my pull request to add acceleration data in the same way that orientation data is passed through.
You can configure this particular app to call update() as frequently as desired, though it's 60hz by default. I question whether I can get a 1khz data rate from these SensorFusion calls, because I can explicitly set the "frame rate" to 120, log the update() calls, and see how often they occur. Without any calls to getAcceleration/getOrientation, update() is called 120 times per second. As soon as either call is added in, it drops back down to ~60. From my perspective, it looks like those calls to sensor fusion are blocking on a frame rate that I can't find in the Oculus SDK.
Again, I'm not stellar with C++ so if I'm missing another layer that limits the getAcceleration/getOrientation calls, please let me know.
I'm still working on getting the MessageHandler set up in an attempt to work around those calls; I'll let you know the result of that when I'm done.
- benpurdy (Honored Guest)
Hey there, thanks for the pull request - just wanted to post here to say I've merged your code into the Oculus Bridge repo. The Mac and Windows builds in the repository are now up to date and will send accelerometer data to the JavaScript library.
Example code and docs are updated as well.
-Ben
- nickoneill (Honored Guest)
"benpurdy" wrote:
Hey there, thanks for the pull request, just wanted to post here to say I've merged your code into the Oculus Bridge repo.
Thanks Ben! Combining the update data into one message makes more sense too.
I made a short video about the project I was working on using the bridge - it's a demo menu system that you navigate with the accelerometer (by tapping). I posted more details in the user interfaces section.
- jherico (Adventurer)
A friend of mine sent me the link to the TechCrunch article on your work. I didn't initially make the connection until I came back here and saw the link to the same video on the other thread.
I've been considering doing similar work along the same lines, but instead of using a knock on the Rift, using actual head motions, such as a quick nod, a chin jut, or a quick shake. Not sure how well it will work with the Rift, since its accuracy depends on how tightly it's strapped on, but it should be recognizable in the accelerometer logs.
If you'd said what you were aiming for, I could have told you right off that you basically need to create an additional delegate message handler for the sensor device which records the accelerometer data (or a derivative of it) into a circular buffer, and then have your main thread inspect that buffer looking for patterns of change or chirps you can identify as the taps.
- nickoneill (Honored Guest)
"jherico" wrote:
A friend of mine sent me the link to the techcrunch article on your work. I didn't initially make the connection until I came back here and saw the link to the same video on the other thread.
Sorry, that's not me! I'm http://nickoneill.name but not http://nickoneill.com :/
I appreciate the tips though. One thing I've learned so far is that processing raw accelerometer data is hard, and knocking seemed like the easiest way to get started on recognizing accelerometer data. Doing gestures sounds like a much, much harder problem, but I am happy to check it out if you're going to make it :)
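To make jherico's circular-buffer suggestion concrete, here is a minimal, SDK-free sketch: a ring buffer of acceleration magnitudes with a crude spike check. All names and thresholds are illustrative assumptions, not SDK code, and the locking needed between the sensor thread (writer) and main thread (reader) is omitted for brevity.

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Fixed-size ring buffer of acceleration magnitudes: the sensor-thread
// delegate pushes samples, and the main thread scans the window for short
// spikes ("chirps") that look like a knock on the headset.
class TapDetector {
public:
    static constexpr std::size_t kSize = 64;

    // Push one sample (would be called from the sensor delegate at ~1000hz).
    void Push(float ax, float ay, float az) {
        buffer_[head_ % kSize] = std::sqrt(ax * ax + ay * ay + az * az);
        ++head_;
    }

    // Scan the buffered window for a sample that jumps well above the
    // window's mean - a crude spike detector (called from the main thread).
    bool SawTap(float spikeFactor = 3.0f) const {
        std::size_t n = head_ < kSize ? head_ : kSize;
        if (n < 8) return false; // not enough data for a baseline yet
        float mean = 0;
        for (std::size_t i = 0; i < n; ++i) mean += buffer_[i];
        mean /= n;
        for (std::size_t i = 0; i < n; ++i)
            if (buffer_[i] > mean * spikeFactor) return true;
        return false;
    }

private:
    std::array<float, kSize> buffer_{};
    std::size_t head_ = 0;
};
```

At rest the magnitude hovers around gravity (~9.8), so a sharp knock shows up as a sample several times the window mean; a real implementation would likely look at jerk (the derivative of acceleration) and guard the buffer with a mutex or a lock-free queue.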