Forum Discussion
Anonymous
11 years ago · Unity + OR + Xbox 360 Controller
Hi everyone.
I'm reading the Best Practices Guide, and in Appendix I it suggests using a "familiar controller" to let the player, well... play :D
Since I already have the Wireless Receiver for Windows to use the Xbox 360 Controller with some games (such as Lunar Flight), I want to understand how difficult it is to implement this in Unity. I mean, what do I have to do to be able to move the character with the controller in my project? Is the prefab in the OVR package already "controller-friendly"?
In my specific case, since I am making a small flying game, I was thinking of using the left thumbstick for forward/backward movement and left/right rotation, the two triggers to change altitude (yeah, it's helicopter-like flight), and the A button to interact.
Cheers guys :D
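The scheme described above could be sketched along these lines in a Unity C# script. This is a minimal sketch, not anything the OVR package ships with: the axis/button names ("Vertical", "Horizontal", "LeftTrigger", "RightTrigger", "Interact") and the speed values are placeholders you would have to define yourself in Project Settings / Input, with the triggers bound per the Xbox 360 axis map.

```csharp
using UnityEngine;

public class HeliController : MonoBehaviour
{
    public float moveSpeed  = 10f;  // m/s forward/backward
    public float turnSpeed  = 90f;  // degrees/s of yaw
    public float climbSpeed = 4f;   // m/s up/down

    void Update()
    {
        // Left thumbstick: forward/backward plus left/right rotation.
        float forward = Input.GetAxis("Vertical");
        float yaw     = Input.GetAxis("Horizontal");

        // Triggers: altitude -- right trigger climbs, left trigger descends.
        float climb = Input.GetAxis("RightTrigger") - Input.GetAxis("LeftTrigger");

        transform.Rotate(0f, yaw * turnSpeed * Time.deltaTime, 0f);
        transform.Translate(Vector3.forward * forward * moveSpeed * Time.deltaTime);
        transform.Translate(Vector3.up * climb * climbSpeed * Time.deltaTime, Space.World);

        // "Interact" would be bound to joystick button 0 (A on a 360 pad on Windows).
        if (Input.GetButtonDown("Interact"))
            Debug.Log("Interact!");
    }
}
```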
13 Replies
Replies have been turned off for this discussion
- SteveTack (Honored Guest): It's not difficult if you want to simply code for Xbox controllers, though if you're going for general distribution, ideally you want to support any gamepad. You *could* use the built-in Unity setup dialog to allow remapping, but that seems a bit clunky to me.
Input from a gamepad isn't really related to Oculus Rift, except for the fact that it's a more natural way to control a game when you can't actually see your input devices.
To do it the out-of-the-box Unity way, you'd set up your various inputs. For instance, the left stick up and down would be "Joy1 Axis 2+" and "Joy1 Axis 2-". You can look up the various values for an Xbox gamepad online.
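Once such an Input Manager entry exists, reading it is a single call. A minimal sketch, assuming an entry named "LeftStickVertical" bound to joystick axis Y; the name is arbitrary, the binding in the Input Manager does the work:

```csharp
using UnityEngine;

public class StickDebug : MonoBehaviour
{
    void Update()
    {
        // Returns a value in -1..1 for whatever axis "LeftStickVertical" is bound to.
        float v = Input.GetAxis("LeftStickVertical");
        if (Mathf.Abs(v) > 0.01f)
            Debug.Log("Left stick Y: " + v);
    }
}
```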
A better way IMO is to use an asset called cInput. It allows you access to any button and axis so that you could create an in-game remapping UI.
Even better than that would be to create a wrapper around cInput so that you can create predefined profiles for common gamepads like Xbox, PS3, PS4, Logitech, OUYA, etc., as well as custom mappings to allow just about anything. With that approach, you can give your controls generic names like "lower face button" or "right trigger" so that the code that uses input doesn't have to worry about the specifics.
- Anonymous: Got it, thanks for the hint.
Actually, yes, it is quite simple in the end; you just have to work in Project Settings / Input using the map of the Xbox controller (this one: http://wiki.unity3d.com/index.php?title=Xbox360Controller).
(I'm writing this so if anyone needs it, well, it's here ^^)
- mrgreen72 (Superstar): Any idea why Oculus didn't use Unity's Input class and went with their own OVRGamepadController?
- drash (Heroic Explorer):
"mrgreen72" wrote:
Any idea why Oculus didn't use Unity's Input class and went with their own OVRGamepadController?
I don't know the answer, but I'm guessing it was to shield the average developer from having to worry about things like the differences in mapping between Windows and Mac? And maybe even for other controllers down the line. I still haven't tried a controller on my Mac Mini so I don't know.
I think the first couple of iterations of the Oculus Unity integration didn't have an OVRGamepadController and instead relied on XInput plugins -- they probably ran into or foresaw issues with that.
- mrgreen72 (Superstar):
"drash" wrote:
"mrgreen72" wrote:
Any idea why Oculus didn't use Unity's Input class and went with their own OVRGamepadController?
I don't know the answer, but I'm guessing it was to shield the average developer from having to worry about things like the differences in mapping between Windows and Mac?
Oh crap, you're right. The Unity/Xbox 360 controller mapping is vastly different between PC and Mac. I didn't even realize that. I guess people keep multiple InputManager.asset files and swap them at build time? It's a bit of a PITA.
http://wiki.unity3d.com/index.php?title=Xbox360Controller
Indeed, if the OVRGamepadController works the same on different platforms, that's a big plus. You lose the GetButtonUp and GetButtonDown functionality and have to write it yourself, though, right?
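That missing edge detection is easy to rebuild by caching last frame's state. A sketch (the class and the idea of polling a held/not-held flag once per frame are mine, not part of any Oculus or Unity API):

```csharp
public class ButtonEdge
{
    private bool prevHeld;

    public bool Down { get; private set; } // became pressed this frame
    public bool Up   { get; private set; } // became released this frame

    // Call exactly once per frame with the button's current held state,
    // however your gamepad layer reports it.
    public void Update(bool held)
    {
        Down = held && !prevHeld;
        Up   = !held && prevHeld;
        prevHeld = held;
    }
}
```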
I'm new to Unity by the way. If that wasn't already clear enough... :lol:
- sh0v0r (Protege):
"mrgreen72" wrote:
Any idea why Oculus didn't use Unity's Input class and went with their own OVRGamepadController?
I have a fair bit of experience with this.
The Oculus plugin has a Gamepad class that uses XInput, which makes it compatible with any X360 controller.
The Unity Input Manager isn't exposed, so it's not possible to rebind it at runtime. The only way to do it is to create an aliasing scheme, which is what cInput uses. It's a bit clunky, but it works. Unfortunately, Unity doesn't support a very wide range of DirectInput devices, so I had a DInput plugin added to Lunar Flight and wrote my own configuration UI and file format.
My recommendation is to just use the Oculus Gamepad Controller while you're prototyping; it is very easy to use and will ensure everything works, assuming you make the X360 controller a requirement. If you want to remap it, you will have to write your own system to do that.
- sh0v0r (Protege):
"drash" wrote:
"mrgreen72" wrote:
Any idea why Oculus didn't use Unity's Input class and went with their own OVRGamepadController?
I don't know the answer, but I'm guessing it was to shield the average developer from having to worry about things like the differences in mapping between Windows and Mac? And maybe even for other controllers down the line. I still haven't tried a controller on my Mac Mini so I don't know.
I think the first couple of iterations of the Oculus Unity integration didn't have an OVRGamepadController and it instead relied on XInput plugins -- they probably ran into or foresaw issues with that.
From what I understand it is still XInput, and it doesn't work on Mac. I'm not 100% sure, but I think I read another post somewhere about this. The Mac doesn't have an official driver for the controller; there is an unofficial one called Tattie Bogle, but I don't believe it is supported. I use it with the Mac build of Lunar Flight and have some ugly code to set up the default mapping on Mac.
- Anonymous: I'm back ;)
Yeah, Unity is not as controller-friendly as I thought. Actually, I have a lot of problems with the analog sticks, because they always have a sort of "recoil" when you move in one direction (example: I move toward X+ by pushing the stick up, and as soon as I take my finger off, the character moves a bit toward X-). I think I'll stick to the keyboard for now.
And maybe I'll try to use XInput later, but I'm running out of time to create the beta version of the game.
- mrgreen72 (Superstar): It seems you simply didn't set a large enough dead zone. The ideal dead zone will vary from controller to controller depending on the beatings they've received, but a value of 0.3 should work well for all of them.
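A radial dead zone with rescaling avoids both the drift described above and a sudden jump at the threshold. A minimal sketch; the 0.3 value comes from the suggestion above, everything else is an assumption:

```csharp
using UnityEngine;

public static class DeadZone
{
    public static Vector2 Apply(Vector2 stick, float threshold)
    {
        float mag = stick.magnitude;
        if (mag < threshold)
            return Vector2.zero; // ignore noise/recoil inside the dead zone

        // Rescale the remaining range to 0..1 so movement ramps up smoothly
        // from zero instead of jumping to `threshold` strength at the edge.
        float scaled = Mathf.Min((mag - threshold) / (1f - threshold), 1f);
        return stick.normalized * scaled;
    }
}
```

Usage might look like `Vector2 move = DeadZone.Apply(new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical")), 0.3f);` so both axes go through one circular dead zone instead of two independent square ones.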
- Anonymous: I tried adjusting the "dead zone", but it didn't work well.
Still, this morning I tried with another "generic" USB-cable controller and the problem didn't show up. I'll try again with the new version of the project tomorrow; maybe something just wasn't working properly in the project where I had the problem.
I'll keep you updated ;)