Forum Discussion
notch · Honored Guest · 12 years ago
Oculus Rift game controller input
Hi! I'm trying to add Oculus Rift support to LWJGL so it'll be accessible for 0x10c and Minecraft, and I keep running into annoying LWJGL native compilation errors. But when I start up LWJGL with the Rift connected, it lists it as a game controller with no inputs, and that made me think...
It would be incredibly cool for the Rift to expose six axes as game controller input (all rotations and movements). Then it would automatically work in all game libraries that support controllers, without requiring special code for it. Is anything like that planned?
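The six-axis idea maps naturally onto a small piece of glue code. A minimal sketch follows; the axis ordering, the normalization by pi, and the one-metre position range are all assumptions for illustration, not anything the SDK or LWJGL defines:

```java
// Hypothetical mapping of Rift head orientation and position onto six
// normalized controller axes, as the question suggests. Axis order and
// ranges are assumptions, not part of any real Oculus or LWJGL API.
public class RiftAsController {
    // Clamp a value into the controller's [-1, 1] axis range.
    static float clamp(float v) {
        return Math.max(-1f, Math.min(1f, v));
    }

    /**
     * Map yaw/pitch/roll (radians) and head position (metres) to six
     * axes. Rotations are normalized by pi; positions assume roughly
     * one metre of head travel.
     */
    static float[] toAxes(float yaw, float pitch, float roll,
                          float x, float y, float z) {
        return new float[] {
            clamp(yaw   / (float) Math.PI),
            clamp(pitch / (float) Math.PI),
            clamp(roll  / (float) Math.PI),
            clamp(x), clamp(y), clamp(z)
        };
    }

    public static void main(String[] args) {
        float[] axes = toAxes((float) Math.PI / 2, 0f, 0f, 0.25f, 0f, 0f);
        System.out.println(axes[0] + " " + axes[3]); // 0.5 0.25
    }
}
```

A game library that already polls controller axes would then see head tracking as just another joystick, which is exactly the appeal of the idea.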
10 Replies
- cybereality (Grand Champion): Hey Notch! Glad to have you on the forum.
We're doing the sensor fusion in software rather than on a microcontroller, which means this isn't supported with the developer kit. This may change for the consumer version, and it's something we're considering.
Hope that helps,
- Andres

- Isopher (Honored Guest): So I have been playing with this a bit, and though I am waiting till my kit actually arrives to do anything substantial, my thought on the integration is: rather than make it a separate controller, have the Rift mimic mouse movements. I was thinking of building a filter for the actual mouse to make the game ignore the vertical motion input, to help alleviate some of the disorientation.
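The mouse-emulation idea can be sketched in plain Java: convert each new yaw reading into a horizontal mouse delta and deliberately drop the vertical axis. The sensitivity constant and the wrap-around handling are assumptions for illustration, not code from any SDK:

```java
// Hypothetical yaw-to-mouse mapping: turn head rotation into
// horizontal mouse motion and ignore pitch entirely to reduce
// disorientation. PIXELS_PER_RADIAN is an assumed sensitivity.
public class RiftMouseEmu {
    // Assumed sensitivity: pixels of mouse travel per radian of yaw.
    static final float PIXELS_PER_RADIAN = 800f;

    private float lastYaw;

    RiftMouseEmu(float initialYaw) { this.lastYaw = initialYaw; }

    /**
     * Given the latest yaw reading (radians), return the horizontal
     * mouse delta in pixels. Pitch is deliberately not mapped.
     */
    int mouseDeltaX(float yaw) {
        float delta = yaw - lastYaw;
        // Wrap across the -pi/+pi seam so a small head turn near the
        // boundary never produces a huge mouse jump.
        if (delta > Math.PI)  delta -= 2 * (float) Math.PI;
        if (delta < -Math.PI) delta += 2 * (float) Math.PI;
        lastYaw = yaw;
        return Math.round(delta * PIXELS_PER_RADIAN);
    }
}
```

Whether this feels playable depends on how much latency the whole chain adds, which is exactly the open question in the reply.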
However, until I can test this with the actual Rift, I have no idea how this will affect latency and playability in a real game. Thoughts?

- craigotron (Honored Guest): I apologize for what is clearly off-topic and possibly a social faux pas, but I couldn't resist the opportunity to speak with someone of Notable Influence.
With your permission, notch, may I direct you to a couple of forum threads with regards to working collaboratively on VR projects?
Craigotron
Player 7048
craigotronprime@gmail.com

- steeve (Explorer): What is possible, though, is to write a fake controller using the SDK.
- Mrklaw (Explorer):
"steeve" wrote:
What is possible is to write a fake controller using the SDK though
Kind of like Xpadder or MotioninJoy? That'd be good and useful to many apps, I think.

- CaliberMengsk (Explorer): I agree that using it as a controller device rather than hard-coding would make it a better interface for development. That type of system would also make it easier for people developing in Unity Free.
Right now I'm working on a roundabout way of getting input to Unity Free using C# and pipes. I have to build a DLL for C# to get the data, though, and unfortunately I know very little C++, so writing the wrapper myself could take a while. The plus side, though, is that Oculus would work with Unity Free.
Anyway, to get back on point, this would also be usable by any program that can use pipes, like C#. It works somewhat like a serial connection between programs. I just don't personally know how much lag this would add for the Oculus.
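The consuming side of this pipe idea might look like the sketch below. The one-sample-per-line "yaw,pitch,roll" format is an invented convention for illustration, and a StringReader stands in for the real pipe stream:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

// Sketch of the consumer end of the pipe idea: a helper process
// writes one "yaw,pitch,roll" line per sample, and the game side
// parses it. The line format is an assumption.
public class SensorPipeReader {
    /** Parse one comma-separated sample into {yaw, pitch, roll}. */
    static float[] parseSample(String line) {
        String[] parts = line.trim().split(",");
        return new float[] {
            Float.parseFloat(parts[0]),
            Float.parseFloat(parts[1]),
            Float.parseFloat(parts[2])
        };
    }

    public static void main(String[] args) throws IOException {
        // In real use this reader would wrap the named pipe's input
        // stream; here a StringReader simulates one sample.
        BufferedReader pipe = new BufferedReader(
            new StringReader("0.10,-0.25,0.00\n"));
        float[] sample = parseSample(pipe.readLine());
        System.out.println(sample[0]); // 0.1
    }
}
```

Text over a pipe adds parsing overhead per sample; a binary framing would be cheaper, but this shows the shape of the approach.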
O-o... I have no clue if I'm on topic anymore XD, but that's my two cents.

- rupy (Honored Guest): Great to hear you're working on an open driver!
My bet would be to use a USB lib to connect and get the raw data directly.
That way we can work on a lower level to get latency as good as possible for different games.
But we need the protocol from Oculus! @cybereality Can you help? Or should we just look at the SDK source?
I'm not going to receive the kit before this summer so unfortunately I can't help yet!
Can you start on the stereoscopic rendering? That HAS to be in LWJGL; the sensor data can be completely standalone, with an interface to different "fusion" modules that contain all the eventual predictive code, etc.
But you want as little crap between the USB data and the game programmer as possible; otherwise I think latency will kill it.
Edit: Or we could build a JNI fusion module that processes the data... I'd like to have the raw data in Java, but maybe it will be too slow... Only Oculus can answer that for now.
@cybereality This should be top priority: not only is Minecraft in Java, but the majority of software developers in the world use Java, and Moore's law will make it the game programming language of the future, I'm sure! (Think Ouya.)
@notch Did you get my mail?

- 38leinaD (Honored Guest):
"rupy" wrote:
Can you start on the stereoscopic rendering? That HAS to be in LWJGL,
Are you talking about making this part of LWJGL core? I personally think this should be application code. It is really only a few lines of code and a fragment shader...

- rupy (Honored Guest): OK, I didn't know it was that easy. Even with the fisheye warp and different IPDs and cups? If not integrated, then at least like the sensor code: a source demo bundled with LWJGL...
- 38leinaD (Honored Guest): I posted the code for the stereo rendering with the barrel distortion shader in this post: viewtopic.php?f=20&t=88&start=10
The different lenses and IPDs require only adjusting the parameters.
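For readers without access to the linked thread: the barrel-distortion warp is a radial polynomial scale applied to each fragment's coordinates. The sketch below shows the math in plain Java rather than GLSL; the coefficient values are the commonly quoted DK1 defaults and should be treated as assumptions:

```java
// The barrel-distortion warp in plain Java. Each point is scaled
// radially by a polynomial in r^2; in the real renderer this runs
// per-fragment in a shader. The K values are assumed DK1 defaults.
public class BarrelDistortion {
    // Distortion coefficients: scale = k0 + k1*r^2 + k2*r^4 + k3*r^6.
    static final float[] K = {1.0f, 0.22f, 0.24f, 0.0f};

    /** Warp a point (x, y) given relative to the lens centre. */
    static float[] warp(float x, float y) {
        float r2 = x * x + y * y;
        // Horner evaluation of the radial polynomial.
        float scale = K[0] + r2 * (K[1] + r2 * (K[2] + r2 * K[3]));
        return new float[] { x * scale, y * scale };
    }
}
```

Changing lenses or IPD then really is just a matter of swapping the coefficients and the lens-centre offset, which supports the point that this can live in application code.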