Forum Discussion
mdk
13 years ago · Honored Guest
Razer Hydra and Unity
I've ordered the Rift devkit and Razer Hydra and plan to do some experimental stuff in Unity 3D. I'm not an experienced programmer, but I've managed to learn Unity well enough. I have managed to do stuff like NPC pathfinding, line-of-sight checks, and basic gun controls for a shooter. The pathfinding I did is obviously redundant now since Unity has its own navmesh pathfinding system.
With the Razer Hydra and Rift I would obviously want to try some first-person gameplay. I have access to Unity 3.5. The problem is that while there is Rift support for Unity 3.5, the Razer Hydra Unity plugin from the Unity Asset Store requires Unity 4. I will of course be getting the 4-month trial license for Unity 4, but that is only a short-term solution.
I would like to know how hard it would be to get the Razer Hydra working with Unity 3.5. I would also like to know how difficult it is to make basic controllable hands in Unity 4 with the Sixense Unity plugin. By basic controllable hands I mean the control scheme shown in The Gallery: Six Elements or the improved Tuscany demo. I understand that the Unity Rift integration is at the point where you can just drag and drop a Rift-supported first-person controller into a scene and it works automatically. Does the Sixense plugin have similar functionality?
I'm also curious how difficult it is to have the hand objects collide with geometry. I'm thinking of a control scheme where you have two pairs of hands: one pair of "physical hands" that collide with geometry, and another pair of transparent ghost hands that show the actual position of the Hydra controllers. When the hands are not colliding with anything, the physical and ghost hands will obviously be in the same position and the ghost hands would not be rendered. This could be used in gameplay mechanics. For example, in a sword-fighting game, having the physical and ghost hands out of sync would make the player lose stamina faster and perhaps decrease hand strength. So if an enemy blocks an attack, the player has to react and stop his sword at the impact point. This would make the gameplay similar to what actors do: the player would need to pretend that he is actually carrying a heavy sword, for instance.
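The physical/ghost hand idea above could be sketched in Unity roughly like this. This is a minimal, hypothetical sketch, not from any plugin: the script and field names (PhysicalHandFollower, trackedHandTarget, etc.) are made up, and the tracked transform is assumed to be driven elsewhere by the Hydra. The physical hand is a non-kinematic Rigidbody chasing the tracked position, so scene colliders stop it naturally; the ghost is only rendered once the two diverge.

```csharp
using UnityEngine;

// Hypothetical sketch of the "physical hand vs. ghost hand" scheme.
// Attach to the physical hand object (needs a Rigidbody + Collider).
[RequireComponent(typeof(Rigidbody))]
public class PhysicalHandFollower : MonoBehaviour
{
    public Transform trackedHandTarget;  // raw Hydra-driven transform (assumed)
    public Renderer ghostRenderer;       // transparent ghost hand mesh
    public float followGain = 10f;       // how aggressively to chase the target
    public float showGhostDistance = 0.02f;

    private Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;           // hands float freely
    }

    void FixedUpdate()
    {
        // Steer toward the tracked position via velocity; collisions with
        // level geometry will hold the physical hand back on their own.
        Vector3 toTarget = trackedHandTarget.position - rb.position;
        rb.velocity = toTarget * followGain;
    }

    void Update()
    {
        // Only draw the ghost when the hands have diverged (e.g. blocked).
        float drift = Vector3.Distance(transform.position, trackedHandTarget.position);
        ghostRenderer.enabled = drift > showGhostDistance;
    }
}
```

The `drift` value is also exactly the quantity the sword-fighting idea needs: a stamina-drain rate could simply be made proportional to it.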
Here is a shooting game test I did in the past. This might give some indication of what I might be able to accomplish with the Rift and Sixense plugins in Unity.
https://dl.dropboxusercontent.com/u/277 ... ipeli4.zip
12 Replies
- cybereality (Grand Champion): Well, I think it's all possible; it's just a matter of how skilled you are and how much time you have on your hands.
Also, I got a kick out of playing the game you posted. Do you plan to port that to the Rift? - mdk (Honored Guest):
"cybereality" wrote:
Also, I got a kick out of playing the game you posted. Do you plan to port that to the Rift?
I'm going to use that game as a base for my Rift testing, but it will be quite different. I have already converted it to a first-person perspective and changed the enemy AI to use Unity's navmesh pathfinding instead of the one I made myself. I might test the original bird's-eye view at some point to see how well it works in VR. However, it will still take a month or two for me to get the Rift, depending on how fast Oculus will ship these things. Right now I can just prepare things in the game that don't need the Rift or Hydra. - jerrydeanrsmith (Honored Guest):
"mdk" wrote:
"cybereality" wrote:
Also, I got a kick out of playing the game you posted. Do you plan to port that to the Rift?
I'm going to use that game as a base for my Rift testing, but it will be quite different. I have already converted it to a first-person perspective and changed the enemy AI to use Unity's navmesh pathfinding instead of the one I made myself. I might test the original bird's-eye view at some point to see how well it works in VR. However, it will still take a month or two for me to get the Rift, depending on how fast Oculus will ship these things. Right now I can just prepare things in the game that don't need the Rift or Hydra.
I like the bird's-eye view. Would you be able to control the bird with the Hydra? The joysticks mapped to forward, back, left, and right, and the position of the hand controlling the altitude?
I am focusing on getting buildings, statues, lamps, benches, etc. integrated, because they are independent of the Hydra and the Oculus. I mention this because I still have not received my Oculus.
The Hydra forums have Sixense plugins that can be imported into Unity. The hand tracking works, but I'm finding it difficult to update the position of the hand relative to the main camera. I made some edits to the scripts and the hand moves with the camera, but not in the way I want. Still working on this. I have contacted the Razer team and they are going to post some useful information on how they accomplished the hand implementation for the Tuscany gameplay demo. - mdk (Honored Guest): My Hydra finally shipped. Now I'm faced with a problem: the Rift plugin works with both Unity 3.5 and 4, but the Sixense plugin requires Unity 4. So should I get the 4-month version of Unity 4 now, or wait a month or two until I get the Rift devkit?
- dbuck (Honored Guest): You can use the Sixense plugin in 3.5.7; you just need to redo or change the hand controller, as that is built using Mecanim (4+ only). It's pretty easy to work with, though, if you discard the hand animations and just use the SixenseObjectController script (I don't have the project on this computer to find the exact name, but it's something along those lines).
- MarkV (Explorer): Just trying to condense options for Hydra + Unity Free vs. Hydra + Unity Pro. I've got some time before my Rift shows up and am still on the fence about Unity Pro, so I'm considering my options...
Hydra support in Unity Free = VRPN + UIVA + roll your own Hydra support inside of UIVA
Hydra support in Unity Pro = DLL from Sixense available in the Unity Asset Store. Requires Unity 4.
Add in the Rift requirements...
Did I miss anything?
I believe sockets (or other forms of IPC) are usable in Unity Free. So the Sixense SDK running a server + Unity Free with a homegrown implementation would be another option? - mdk (Honored Guest):
"dbuck" wrote:
You can use the Sixense plugin in 3.5.7; you just need to redo or change the hand controller, as that is built using Mecanim (4+ only). It's pretty easy to work with, though, if you discard the hand animations and just use the SixenseObjectController script (I don't have the project on this computer to find the exact name, but it's something along those lines).
How do I get the plugin for the older version of Unity? The Asset Store doesn't allow me to download it unless I have version 4.0.1. - xanderdavis (Honored Guest): Any word on whether the Razer Hydra will work with Unity on Mac? I develop on Mac, and for the Rift I'm Boot Camping into Windows just to create WIP builds. I'm looking forward to being able to work solely on Mac, and I really hope the Hydra will support Mac soon.
- dsky9 (Honored Guest): We've been doing some fairly extensive development with the Hydras in anticipation of the forthcoming Sixense STEM, as well as a bevy of other 6DoF controllers (Perception Neuron, Sony Move, PrioVR, etc.). The Hydra input harness is somewhat convoluted and exists outside of, and parallel to, the standard Unity Input Manager.
Thus, we map primary axes and buttons as symbolic representations in the Input Manager (e.g. P1-Horizontal, P1-Vertical, P1-Jump...), which handles basic keyboard, mouse, and standard joysticks (Xbox, PlayStation). Then, inside our input handler code, we write custom routines to detect the Hydras, read their values, and substitute them into the aforementioned symbolic values.
Our best recommendation is to install the Sixense plug-in from the Unity Asset Store and thoroughly examine SixenseInputTest.cs. The basic expected vars are:
• SixenseInput.Controllers.Position (Vector3 XYZ)
• SixenseInput.Controllers.Rotation (Vector4 Quaternion)
• SixenseInput.Controllers.JoystickX (analog float -1.0 to 1.0)
• SixenseInput.Controllers.JoystickY (analog float -1.0 to 1.0)
• SixenseInput.Controllers.Trigger (analog float 0.0 to 1.0)
The buttons are a bit more obfuscated; they're something like:
• SixenseInput.Controllers.GetButton(button)
where "button" is one of several values (ONE, TWO, THREE, FOUR, START, BUMPER, JOYSTICK) representing which "switch" is being closed on that cycle.
This sample script has a bevy of (non-optimized) methods for reading the controllers' output in real time, from which you can (in code) map all buttons, joysticks, and 6DoF XYZ/YPR data to your app. Hopefully the STEM API will be far more integrated into the standard Unity Input Manager framework, and thus work seamlessly in parallel with standard controllers, without the need for custom code.
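The substitution approach described above might look roughly like the following sketch. This is hedged and hypothetical: the class/field names here (HydraInputBridge, etc.) are made up, the P1-* axes are assumed to exist in the project's Input Manager, and the SixenseInput calls follow the API as recalled in this thread, so exact names may differ between plugin versions.

```csharp
using UnityEngine;

// Hypothetical sketch: read the Hydra through the Sixense plugin when a
// controller is present, otherwise fall back to the symbolic Input
// Manager axes (P1-Horizontal, P1-Vertical, P1-Jump are assumed names).
public class HydraInputBridge : MonoBehaviour
{
    public float horizontal;
    public float vertical;
    public bool jump;

    void Update()
    {
        // GetController/SixenseHands/SixenseButtons follow the plugin as
        // described above; check SixenseInputTest.cs for the real names.
        SixenseInput.Controller left = SixenseInput.GetController(SixenseHands.LEFT);
        if (left != null && left.Enabled)
        {
            horizontal = left.JoystickX;                        // analog, -1.0 to 1.0
            vertical   = left.JoystickY;                        // analog, -1.0 to 1.0
            jump       = left.GetButton(SixenseButtons.BUMPER); // digital
        }
        else
        {
            // No Hydra detected: fall back to standard controllers.
            horizontal = Input.GetAxis("P1-Horizontal");
            vertical   = Input.GetAxis("P1-Vertical");
            jump       = Input.GetButton("P1-Jump");
        }
    }
}
```

Gameplay code then reads only `horizontal`, `vertical`, and `jump`, and never needs to know whether a Hydra or a standard controller produced them.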
G - dsky9 (Honored Guest): I've expounded upon that forum post and put the detailed code variables in a blog post; enjoy:
http://blog.dsky.co/2015/05/16/razer-hy ... ol-syntax/