Remap Touch buttons/trigger to "Fire1"
Anyone know an easy way to convert the Touch trigger to "Fire1" (left mouse button)? By default in the SDK, Button.One on the Touch controller launches my projectile through an old script that uses "Fire1" (left mouse button): if (Input.GetButtonDown("Fire1")) { ... }. Is there an easy way to remap "Button.One" = "Fire1 / LMB" to "PrimaryIndexTrigger" = "Fire1 / LMB"? Could this be done in the OVRInput.cs script without changing the "Fire1" script? / Best regards from the Dactyl Nightmare dev, Fredrik

Oculus Touch, custom hands jitter/stutter even with example projects
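For the remap question above: rather than editing OVRInput.cs, one option is a small wrapper that treats both the legacy "Fire1" binding and the Touch index trigger as the same fire input. This is a minimal sketch assuming Unity with the Oculus Integration's OVRInput; the FireInput class name is made up for illustration.

```csharp
using UnityEngine;

// Hypothetical wrapper: call FireInput.GetFire1Down() in place of
// Input.GetButtonDown("Fire1") in the projectile script.
public static class FireInput
{
    public static bool GetFire1Down()
    {
        // True on the frame either the legacy "Fire1" binding
        // or the Touch index trigger is pressed.
        return Input.GetButtonDown("Fire1")
            || OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);
    }
}
```

This leaves the old script's logic intact; only the one input call changes.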
Greetings Oculus forum, I'm running into an issue where any of the setups with custom hands jitter or stutter really badly. The Sample Framework projects show this behavior for me too: CustomHands, DistanceGrab, and a few others. Using the LocalAvatar prefab with the default hands seems to work OK, but I am not sure what the issue is. I am running the newest Oculus Integration and Unity 2018.3.3f1. I tried loading a blank project without touching any settings and I see the same issue. Anyone have some thoughts on this? Thanks

Pointing with CustomHand while near Physical UI
Hello. :) I am trying to create a physical UI and want my hand to go into the pointing pose automatically where a rigidbody is added. I see that when the "Flex" parameter on the animator is 1.0f and the others are untouched, the hand can interact with physical elements. For the user's convenience, I want the hand to flex into a pointing pose automatically when it is near a physical UI, instead of the user having to press the grip button. In the Star Wars Vader Immortal series I really liked how the fingers gradually squeezed together as the hand moved closer to a button. I am using Unity 2018.4.8f1 together with Oculus 1.38.4 and VRTK. I have a pinpad with buttons that use the directional joint drive from VRTK. I am kind of new to Unity and the animator, since I am doing a project for my master's thesis, but what I want to do is create a zone near the pinpad and a script that automatically puts the hand into this "state". The pictures below show what I want to happen when the hand is near the physical UI.

How to get the information from Oculus touch analogue in C#
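For the pointing-pose question: one approach is a trigger collider around the pinpad that gradually drives the hand animator's "Flex" parameter toward 1 while the hand is inside, giving the Vader Immortal-style gradual squeeze. A sketch, assuming the hand object carries an Animator with a float "Flex" parameter as described above; the PointZone name and blend speed are illustrative.

```csharp
using UnityEngine;

// Hypothetical zone: attach to an object with a trigger collider placed
// around the pinpad. Hands inside the zone are eased toward the pointing
// pose by raising the animator's "Flex" parameter.
public class PointZone : MonoBehaviour
{
    [SerializeField] float blendSpeed = 4f; // illustrative value

    void OnTriggerStay(Collider other)
    {
        Animator hand = other.GetComponentInParent<Animator>();
        if (hand == null) return;
        // Ease Flex toward 1 so the fingers squeeze gradually.
        float flex = hand.GetFloat("Flex");
        hand.SetFloat("Flex", Mathf.MoveTowards(flex, 1f, blendSpeed * Time.deltaTime));
    }

    void OnTriggerExit(Collider other)
    {
        Animator hand = other.GetComponentInParent<Animator>();
        if (hand != null) hand.SetFloat("Flex", 0f); // release the pose on exit
    }
}
```

Scaling the target Flex by distance to the pinpad instead of a fixed 1 would reproduce the distance-based squeeze even more closely.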
I am trying to figure out how to get the information from the analogue stick so I can drive footstep sounds by the amount the player is moving. I don't want the steps to be inconsistent, so I want the sounds to slow down or speed up depending on how far the player pushes the stick. I have no idea how to write the code for this. Can anyone point me in the right direction?

How can I switch the laser pointer (UIHelper) from the right to the left controller?
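For the footstep question: OVRInput.Get with Axis2D.PrimaryThumbstick returns the stick deflection as a Vector2, and its magnitude (0 to 1) can scale the interval between step sounds. A sketch, assuming an AudioSource with a footstep clip assigned in the Inspector; the interval values and dead zone are illustrative.

```csharp
using UnityEngine;

// Sketch: step cadence scales with thumbstick deflection, so footsteps
// speed up as the player pushes the stick further.
public class FootstepPacer : MonoBehaviour
{
    [SerializeField] AudioSource footstepSource;  // footstep clip assigned in Inspector
    [SerializeField] float maxInterval = 0.9f;    // seconds between steps at a slow walk
    [SerializeField] float minInterval = 0.35f;   // seconds between steps at full tilt
    float nextStepTime;

    void Update()
    {
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        float amount = Mathf.Clamp01(stick.magnitude); // 0 = idle, 1 = full deflection
        if (amount < 0.1f) return;                     // dead zone: no steps while idle

        if (Time.time >= nextStepTime)
        {
            footstepSource.PlayOneShot(footstepSource.clip);
            // Faster movement -> shorter interval between steps.
            nextStepTime = Time.time + Mathf.Lerp(maxInterval, minInterval, amount);
        }
    }
}
```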
When I add the UIHelper prefab from the Oculus Integration package, it is assigned directly to the right controller. How can I switch it to the left controller? Do I need to change HandedInputSelector.cs for that?

Oculus Integration 1.39 Only one controller active at a time
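For the UIHelper question: instead of editing HandedInputSelector.cs, one option is a small component that points the UI ray at the left-hand anchor. A sketch assuming the scene contains an OVRCameraRig and the OVRInputModule that UIHelper sets up; the field names below match the Oculus Integration versions I have seen, so verify them against your package.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: force UI raycasts to originate from the left controller.
public class LeftHandLaser : MonoBehaviour
{
    [SerializeField] OVRCameraRig rig;           // the scene's camera rig
    [SerializeField] OVRInputModule inputModule; // set up by UIHelper

    void Start()
    {
        // Raycast from the left-hand anchor instead of the right.
        inputModule.rayTransform = rig.leftHandAnchor;
    }
}
```

If HandedInputSelector is also active, it may reassign the ray each frame, so it would need to be disabled or adjusted for this to stick.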
Since the update, my biggest problem is that while I could previously manipulate the two controllers simultaneously, now only one of them is tracked at a time. Tracking switches between the controllers when I press any button, and the other is reset to Vector3.zero local position and Quaternion.identity. I get the controller position values from OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch) and OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch). These two lines used to return the right position values, but now only one of them returns a tracked value at a time while the other seems "disabled".

Rubberbanding Locomotion Unity
So I'm quite new to VR development and working with the Oculus. In my Unity project I'm using Oculus Integration from the Asset Store, and I'm trying to use the Locomotion prefab it contains. It works, but I've run into some problems I can't seem to solve. The first is that the player seems to rubberband from time to time: when I try to teleport to a certain destination, it moves back to a different location. Another problem is that the teleport doesn't always work. If I push the thumbstick forward the laser shows up, but when I let go it doesn't teleport like it usually does. I hope someone can help me!

What coordinate system does Oculus Touch use and where is the origin point of Oculus Touch?
I use an Oculus Rift and Touch. I'm curious about the coordinate system and origin point of Oculus Touch. Also, where can I find information about the coordinate system and origin? For reference, I use Unity.

DistanceGrabber hands lag behind a moving OVRPlayerController
Hello, I am having a problem syncing my hands with a player that moves around the game via an OVRPlayerController. I have placed both hands as children of the tracking space, and when I am not moving via the player controller both hands work great. However, when I push the left thumbstick forward and start moving, my right hand "freezes" and no longer moves with the controller; it just stays where it was relative to the player. The left hand works fine and can move around while the player is moving. The hands are identical other than their left/right models and references to the L/R controllers, so I am very confused why they behave differently. Thank you

How do you detect Touch button and Trigger presses?
I'm a beginner, not a great scripter; it's just a hobby. But I got Touch controllers and Unity, and I want to detect the user pressing buttons or triggers for a little game I'm making. How do I go about it? I already figured out how to get the Oculus Touch controllers into my game, but I don't know anything about detecting buttons and triggers. In the past I figured out how to detect button presses on the Xbox controller. Is it done in a similar way?
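For the button-detection question: OVRInput is polled in Update, much like Xbox controller input. A minimal sketch, assuming the Oculus Integration package is imported; the button choices below follow OVRInput's mapping (Button.One is A or X depending on the controller, and the triggers are also exposed as analog axes).

```csharp
using UnityEngine;

// Sketch of polling Touch buttons and triggers each frame.
public class TouchInputExample : MonoBehaviour
{
    void Update()
    {
        // True on the frame the A button (right controller) is pressed.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
            Debug.Log("A pressed");

        // Triggers are analog: 0 (released) to 1 (fully squeezed).
        float squeeze = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger,
                                     OVRInput.Controller.RTouch);
        if (squeeze > 0.5f)
            Debug.Log("Right index trigger past halfway");

        // A trigger can also be treated as a digital button.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger,
                             OVRInput.Controller.LTouch))
            Debug.Log("Left index trigger pressed");
    }
}
```

OVRInput.Get reports the held state, GetDown the press frame, and GetUp the release frame, mirroring Unity's Input.GetButton family.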