Remap Touch buttons/trigger to "Fire1"
Does anyone know an easy way to convert the Touch trigger to "Fire1" (left mouse button)? By default in the SDK, Button.One on the Touch controller launches my projectile through an old script that uses "Fire1" (left mouse button):

if (Input.GetButtonDown("Fire1")) {

Is there an easy way to remap from "Button.One" = "Fire1 / LMB" to "PrimaryIndexTrigger" = "Fire1 / LMB"? Could this be done in the OVRInput.cs script without changing the "Fire1" script?

Best regards from the Dactyl Nightmare dev, Fredrik
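One way to get this behaviour without editing OVRInput.cs is a small wrapper that checks both inputs, so the legacy "Fire1" script only needs a one-line change. This is a sketch, not an Oculus-endorsed remap: `FireInput` is a hypothetical helper name, and it assumes the Oculus Utilities (OVRInput) are already in the project.

```csharp
// Sketch: bridge the legacy "Fire1" input and the Touch index trigger.
// FireInput is a hypothetical helper name; assumes Oculus Utilities (OVRInput).
using UnityEngine;

public static class FireInput
{
    // True on the frame the index trigger is pressed, or when the
    // legacy "Fire1" binding (left mouse button) fires.
    public static bool GetFireDown()
    {
        return Input.GetButtonDown("Fire1")
            || OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);
    }
}
```

The projectile script then only changes its condition to `if (FireInput.GetFireDown()) {`, and keeps working with the mouse in the Editor as well.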
OVRInput.GetDown never returns true in Unity 2018.3.5? Oculus 1.35

So I'm attempting to integrate the Oculus SDK into my project so I can release my game on the Oculus store. I started a new project to get my bearings and got as far as placing the OVRCameraRig into my scene and detecting the Touch inputs. I can detect OVRInput.Get just fine, but OVRInput.GetDown/GetUp never return true. I've googled and searched these forums as well as the Unity forums, and come across other people having the same issue (as far back as 2017) with none having a solution. I've tried the workaround some have posted of hitting the home button and then returning to Unity, and it has no effect for me. I've tried with both Oculus Integration 1.28 and 1.35. Does anyone have any idea how I can get this working?
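One possible cause, sketched here as a workaround rather than a confirmed fix: OVRInput's edge-detection state is normally refreshed once per frame by an active OVRManager, so GetDown/GetUp can stay false if no OVRManager instance is running in the scene. Calling OVRInput.Update() manually is a workaround some projects use; `OVRInputUpdater` is a hypothetical helper name.

```csharp
// Sketch: manually refresh OVRInput's per-frame state when no active
// OVRManager is present. OVRInputUpdater is a hypothetical helper name.
using UnityEngine;

public class OVRInputUpdater : MonoBehaviour
{
    void Update()
    {
        OVRInput.Update(); // refreshes the edge state that GetDown/GetUp rely on

        if (OVRInput.GetDown(OVRInput.Button.One))
            Debug.Log("A button went down this frame");
    }

    void FixedUpdate()
    {
        OVRInput.FixedUpdate();
    }
}
```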
Laser Pointer instead of Gaze Pointer

Hi all, in Unity I tried to use a laser pointer driven by the Oculus Touch controller instead of the gaze pointer present in the samples. The problem is that I don't know how to trigger events in UI components. Is there an example somewhere to get inspiration from? Thanks
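For inspiration, a minimal sketch of pointing the sample framework's UI input at a controller instead of the head. It assumes the OVRInputModule from the Oculus sample framework (which exposes a rayTransform field) on the EventSystem, and an OVRRaycaster on the target canvas; `ControllerUIPointer` is a hypothetical name.

```csharp
// Sketch: drive Unity UI raycasts from a Touch controller instead of gaze.
// Assumes the sample framework's OVRInputModule (rayTransform field) and an
// OVRRaycaster on the canvas; ControllerUIPointer is a hypothetical name.
using UnityEngine;
using UnityEngine.EventSystems; // the samples declare OVRInputModule here

public class ControllerUIPointer : MonoBehaviour
{
    public OVRInputModule inputModule;  // the input module on the EventSystem
    public Transform rightHandAnchor;   // RightHandAnchor from the OVRCameraRig
    public LineRenderer laser;          // optional visual for the beam

    void Start()
    {
        // UI raycasts now originate from the controller, not the head.
        inputModule.rayTransform = rightHandAnchor;
    }

    void Update()
    {
        if (laser != null)
        {
            laser.SetPosition(0, rightHandAnchor.position);
            laser.SetPosition(1, rightHandAnchor.position
                                 + rightHandAnchor.forward * 5f);
        }
    }
}
```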
How to check if an object is grabbed by a specific controller?

Below is pseudo-code demonstrating my desired functionality. The code needs to be placed on the grabbed object. The comparison is the part I need clarification on:

if (_OVRGrabbable.grabbedBy == "LTouch") { // Do something }

EDIT: Solution:

public OVRGrabber LeftGrabber; // Drag grabber to inspector field

if (_OVRGrabbable.grabbedBy == LeftGrabber) { // Do something }
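A slightly fuller sketch of the posted solution, comparing the grabber reference assigned in the Inspector rather than a string. It assumes the Oculus Integration's OVRGrabbable/OVRGrabber components; `GrabSideCheck` is a hypothetical name.

```csharp
// Sketch expanding the posted solution: compare against the grabber reference
// assigned in the Inspector. Assumes Oculus Integration's OVRGrabbable and
// OVRGrabber; GrabSideCheck is a hypothetical name.
using UnityEngine;

public class GrabSideCheck : MonoBehaviour
{
    public OVRGrabber leftGrabber; // drag the LTouch hand's OVRGrabber here

    private OVRGrabbable _grabbable;

    void Awake()
    {
        _grabbable = GetComponent<OVRGrabbable>();
    }

    void Update()
    {
        if (_grabbable.isGrabbed && _grabbable.grabbedBy == leftGrabber)
        {
            // Held by the left controller: do something.
        }
    }
}
```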
Stop touch controllers sleeping?

Hello all. In my Unity application, a lot of the people who demo the app are quite inexperienced with VR and most tech (older generation). A lot of them struggle to get used to the Touch controllers, and often don't rest their fingers on any buttons while using them. This causes a controller to become inactive in Unity after about two seconds of not touching any buttons while holding it relatively still. If the left controller deactivates itself, the right becomes the primary controller by default, and some functionality gets mixed up: the right trigger, for instance, will do what the left trigger was assigned to do. This is frustrating if they are using the right index trigger to do something while the left is still, and the left deactivates and suddenly that same right trigger performs the wrong action. I looked around in the OVR player controller scripts and haven't found any sleep timer which dictates how long until the controllers sleep. I also haven't seen any settings in the Oculus Home app which I can toggle, unless I missed something obvious. I want to avoid physically modifying the controller to make it not sleep; is there any other workaround someone can suggest?

EDIT: I solved the problem, and it was actually really simple: use OVRInput.RawButton instead of OVRInput.Button, to prevent the mapping from switching around when one controller deactivates.
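A sketch of the fix described in the edit above: OVRInput.RawButton is bound to a physical controller, so the mapping cannot swap to the remaining controller when one goes inactive, unlike the handedness-relative OVRInput.Button values.

```csharp
// Sketch of the RawButton fix: raw buttons stay tied to a physical
// controller even when the other controller sleeps.
// HandActions is a hypothetical name.
using UnityEngine;

public class HandActions : MonoBehaviour
{
    void Update()
    {
        // Always the physical right index trigger, even if the left
        // controller has gone inactive:
        if (OVRInput.GetDown(OVRInput.RawButton.RIndexTrigger))
        {
            // right-hand action
        }

        if (OVRInput.GetDown(OVRInput.RawButton.LIndexTrigger))
        {
            // left-hand action
        }
    }
}
```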
Oculus Touch controller registered as mouse input by Unity

Hi everyone, I've been working on a project in Unity where one person uses the Oculus HMD + Touch controllers to interact with the virtual environment, while another person monitors the first on a PC monitor. The latter uses a self-made interface to enable/disable certain components for the person in VR; this interface is made with Unity's own UI elements (Buttons, InputFields, Sliders, etc.). For some reason, one button of the Oculus Touch controllers, the B button on the right controller, simultaneously functions as mouse input. To give a better description: when the mouse on the PC is hovering over a UI component (e.g. an InputField) and the user in VR presses the B button on the right Touch controller, the UI component is activated as if clicked by the mouse. My question: has anyone encountered this before? How can I disable this?

Some further information on my predicament:

- I am using Unity 2017.2.0f3.
- I am using the OpenVR/SteamVR for Unity plug-in, as the project needs to be compatible with both the HTC Vive and the Oculus Rift. Oculus is supported by this plug-in: it automatically recognizes the Oculus Rift and Touch controllers when they are connected while the application is running.
- Input from the Oculus Touch controllers is read by the aforementioned OpenVR plug-in; in particular, scripts such as "SteamVR_TrackedController" read whatever input they receive and interpret it as an HTC Vive controller. This works perfectly fine despite the Touch controllers being interpreted as Vive controllers: for example, if the thumbstick is touched, OpenVR interprets it as the Vive controller's pad being touched, and if it is clicked, registers a pad press.

Any help would be very welcome! Thanks in advance!
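A hedged guess at a workaround, untested against this exact setup: Unity's StandaloneInputModule feeds joystick buttons into desktop UI submit handling via legacy Input Manager axes, so pointing its submit/cancel bindings at unused axes can stop controller buttons from activating the on-screen UI. The axis names below are assumptions and would need (empty) entries in the Input Manager; `DisableJoystickUIClicks` is a hypothetical name.

```csharp
// Hedged guess: rebind StandaloneInputModule's submit/cancel buttons to
// unused Input Manager axes so joystick buttons stop triggering desktop UI.
// "UnusedSubmit"/"UnusedCancel" are assumed dummy axis names that must
// exist (with no bindings) in the Input Manager.
using UnityEngine;
using UnityEngine.EventSystems;

public class DisableJoystickUIClicks : MonoBehaviour
{
    void Start()
    {
        var module = FindObjectOfType<StandaloneInputModule>();
        if (module != null)
        {
            module.submitButton = "UnusedSubmit";
            module.cancelButton = "UnusedCancel";
        }
    }
}
```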
Multiple OVRCameraRigs

I have a scene with 3 objects, each of which contains a bunch of stuff and its own camera. Originally I just had 3 cameras with no OVR scripts on them, and when you turn far enough I disable one of the objects and enable another, so turning past a certain point switches scenes. This was all working fine until I wanted to add Touch controls. I added OVRManager to an object that is always on, and each camera got an OVRCameraRig. If they each have their own TrackingSpace, or if I move the TrackingSpace on switch, the forward direction of the controllers is off by 120 degrees (the amount I rotated). I want to either totally reset the tracking when I switch, or move over my TrackingSpace and reset the forward direction to line it up with the new camera. I've gotten pretty close by moving the tracking space and then disabling/enabling OVRManager with Reset Tracker On Load checked. The controllers work and are in roughly the right place, but they move along with my head when they shouldn't. Is there a better way to do this that still lets me keep the separate cameras? The cameras all have different scripts/settings, so I don't want to use a single one if I can avoid it.
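One possible alternative to toggling OVRManager, sketched under the assumption that the Oculus Utilities expose OVRManager.display.RecenterPose(): recenter tracking explicitly when activating a rig, so head and controller poses are re-origined against the new camera's forward direction. `RigSwitcher` is a hypothetical name.

```csharp
// Sketch: recenter tracking explicitly when switching rigs, instead of
// disabling/enabling OVRManager. Assumes Oculus Utilities expose
// OVRManager.display.RecenterPose(); RigSwitcher is a hypothetical name.
using UnityEngine;

public class RigSwitcher : MonoBehaviour
{
    public GameObject[] rigs; // the three objects, each with its own OVRCameraRig

    public void Activate(int index)
    {
        for (int i = 0; i < rigs.Length; i++)
            rigs[i].SetActive(i == index);

        // Re-origin head and controller tracking against the newly
        // activated rig's forward direction.
        if (OVRManager.display != null)
            OVRManager.display.RecenterPose();
    }
}
```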
[SOLVED] Click events fired twice on Gear VR

Hello, I am having a strange problem since the latest Oculus/Gear VR update (not sure which one it was). When I click on a button in the Editor it works as expected, and the same when running the app in dev mode on the phone. But when I put the phone into the Gear VR, clicks/taps are executed twice. I tried debugging via Wi-Fi, and adb printed debug text from the same function twice (in different frames, it looks like). I am using OVRCameraRig, GazePointerRing (from the sample framework), and OVRInputModule. The buttons are standard ones from Unity's UI. I have two apps with this problem: one made in Unity 5.4.2f2 (Oculus Utilities v1.9.0, OVRPlugin v1.8.0) and another in Unity 5.5.0f3 (Oculus Utilities v1.10.0, OVRPlugin v1.10.0). I tried running an older version of the app made in Unity 5.4.2f2 and it has the same problem; I know it worked before.

Here is additional information on the Gear VR/Oculus software installed on the phone:

Samsung Galaxy S6, Android 6.0.1
Gear VR Service 2.6.41
Gear VR SetupWizard 2.3.20
Gear VR Shell 2.3.02
Gear VR System 1.1.05
Oculus 2.26.7
Oculus System Activities 1.11.0-45918658
Oculus System Driver 1.0.4.2-45411680

You can see the OVRInputModule settings in the image below. The "Invert Swipe Y Axis" setting was added by me, but it doesn't affect clicks. Any help appreciated!
Check if grabbed

I want to make a game where you use joysticks to move a mech, and when the joysticks are released they should return to their primary positions. I have searched for hours for how to check whether an object gets grabbed, but failed. If anyone could help, that would be great!
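A sketch of one way to do the snap-back, assuming the Oculus Integration's OVRGrabbable component is on the joystick object: poll its isGrabbed flag each frame and ease the transform back to its rest pose whenever nothing is holding it. `JoystickReturn` is a hypothetical name.

```csharp
// Sketch, assuming Oculus Integration's OVRGrabbable on the joystick:
// ease back to the rest pose whenever nothing is holding it.
// JoystickReturn is a hypothetical name.
using UnityEngine;

public class JoystickReturn : MonoBehaviour
{
    public float returnSpeed = 5f;

    private OVRGrabbable _grabbable;
    private Quaternion _restRotation;

    void Start()
    {
        _grabbable = GetComponent<OVRGrabbable>();
        _restRotation = transform.localRotation; // the joystick's primary position
    }

    void Update()
    {
        if (!_grabbable.isGrabbed)
        {
            transform.localRotation = Quaternion.Slerp(
                transform.localRotation, _restRotation,
                Time.deltaTime * returnSpeed);
        }
    }
}
```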
I am sorry to be critical here, but I am very disappointed in the teleportation example provided in the Unity Sample Framework. It is a very outdated example of teleportation and does not represent current best practices seen in games such as Arizona Sunshine and Robo Recall. I know I could work through the code and come up with my own, but a search of the internet brings up several examples, tutorials, and even code libraries for modern teleportation scripts for the Vive/SteamVR, while I can find nothing for the Rift or Touch. Some of those tutorials and code libraries will even work with Oculus, but you have to add the SteamVR plugin to your project, and I don't want to add even more overhead. I would much rather see Unity supporting their developer base and providing code examples for the latest VR standards. I suspect I won't get a reply on this post, and that is OK; I just needed to vent some frustration.

A Loyal Oculus Developer