How to snap an object with OVRGrabbable?
Hello! I've managed to successfully set up grabbing with Grabber, but now I would like to go a step further and only allow grabbing at a specific point. In my case, I am trying to pick up a sword only by the handle, and make sure it always faces the correct orientation. I've looked through the documentation for Grabber/Grabbable but I can't seem to find what I need. Thank you for the help!

How can I switch the laser pointer (UIHelper) from the right to the left controller?
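For the sword grab-point question above, a minimal sketch of one approach (untested): recent versions of the Oculus Integration's OVRGrabbable expose Snap Position, Snap Orientation, and a Snap Offset transform, plus a list of grab-point colliders. Subclassing it lets you restrict grabbing to a handle collider and snap the grip pose. Field names (m_snapPosition, m_snapOrientation, m_snapOffset, m_grabPoints) may differ in your SDK version, and SwordGrabbable, handleCollider, and gripOffset are names made up for this example.

```csharp
using UnityEngine;

// Sketch: only the handle collider is grabbable, and the sword snaps
// to a fixed grip pose when picked up.
public class SwordGrabbable : OVRGrabbable
{
    [SerializeField] Collider handleCollider;   // collider placed around the handle only
    [SerializeField] Transform gripOffset;      // empty child positioned/rotated at the grip

    protected override void Start()
    {
        m_snapPosition = true;           // snap the sword to the hand position...
        m_snapOrientation = true;        // ...and orientation when grabbed
        m_snapOffset = gripOffset;       // offset so the handle sits in the palm
        m_grabPoints = new Collider[] { handleCollider }; // only the handle is grabbable
        base.Start();
    }
}
```

The same settings can usually be ticked directly in the inspector on OVRGrabbable without any code, if your version exposes them there.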
When adding the UIHelper prefab from the Oculus Integration, it is assigned to the right controller by default. How can I switch it to the left controller? Do I need to change HandedInputSelector.cs?

Quest 2 controller 3D models? (Not the ones in .fbx)
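Regarding the UIHelper question above, a rough sketch (untested) of forcing the ray to the left hand: the OVRInputModule on the EventSystem has a public rayTransform, which HandedInputSelector normally points at the dominant hand anchor. Pointing it at the left hand anchor yourself, and disabling HandedInputSelector so it doesn't switch back, should do it. ForceLeftHandPointer is a name invented here.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to the UIHelper; assumes the OVRCameraRig and OVRInputModule
// from the Oculus Integration. Remove or disable HandedInputSelector
// so it doesn't reassign the ray to the right hand.
public class ForceLeftHandPointer : MonoBehaviour
{
    [SerializeField] OVRCameraRig cameraRig;
    [SerializeField] OVRInputModule inputModule; // on the EventSystem object

    void Start()
    {
        // Point the UI raycast at the left hand anchor.
        inputModule.rayTransform = cameraRig.leftHandAnchor;
        // Re-parent the laser under the left hand so it follows the controller.
        transform.SetParent(cameraRig.leftHandAnchor, false);
    }
}
```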
I want to make a 3D-printed Beat Saber handle, but I can't get a usable 3D model of the Quest 2 controller. The ones provided in the controller art are odd: the two controllers (L/R) are fused together and I can't separate them. People say I need Fusion 360 to open and split the .fbx files, but I don't want to spend that much money just to open an .fbx file and separate the parts. (What the heck, the free personal Fusion 360 doesn't support .fbx files.) Any ideas?

Remap Touch buttons/trigger to "Fire1"
Does anyone know an easy way to map the Touch trigger to "Fire1" (left mouse button)? By default in the SDK, Button.One on the Touch controller launches my projectile via an old script that uses "Fire1" (left mouse button): if (Input.GetButtonDown("Fire1")) { Is there an easy way to remap from Button.One = "Fire1" / LMB to PrimaryIndexTrigger = "Fire1" / LMB? Could this be done in OVRInput.cs without changing the "Fire1" script? / Best regards from the Dactyl Nightmare dev, Fredrik

Axis mappings in UE4 do not work using Oculus Rift
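For the "Fire1" remap above, a sketch of the simplest route (untested): rather than editing OVRInput.cs, have the firing script accept either input. OVRInput exposes the analog index trigger as a virtual button with a built-in press threshold, so no Input Manager axis setup is needed. ProjectileLauncher and LaunchProjectile are placeholder names standing in for the existing script.

```csharp
using UnityEngine;

// Fires on either the legacy "Fire1" mapping (LMB) or the Touch index trigger.
public class ProjectileLauncher : MonoBehaviour
{
    void Update()
    {
        bool firePressed =
            Input.GetButtonDown("Fire1") ||                        // LMB / old mapping
            OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger); // Touch trigger
        if (firePressed)
        {
            LaunchProjectile();
        }
    }

    void LaunchProjectile()
    {
        // existing projectile-launch code goes here
    }
}
```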
Recently I started to learn UE4 and the Oculus Rift. I am using UE 4.25 and Oculus VR 1.44.0. My project is attached. I am having trouble with axis mappings. Here are my settings: When I press W or S everything is OK. However, none of the Oculus Touch inputs work. Wearing the Oculus Rift HMD, I can see my hands moving, and the Touch controllers work in other games, so I think the problem must be on the UE side. I googled but got nothing. I would be very grateful if somebody could please help me.

Oculus Touch: custom hands jitter/stutter even with example projects
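For the UE4 axis-mapping problem above, one thing worth checking (a hedged suggestion, not a confirmed fix): bindings can also be written by hand into Config/DefaultInput.ini, and UE4's generic motion-controller keys cover Oculus Touch thumbsticks. The axis names "MoveForward"/"MoveRight" below are assumptions taken from the usual template project; substitute whatever your project uses.

```ini
; Config/DefaultInput.ini - map the left Touch thumbstick onto existing axes.
[/Script/Engine.InputSettings]
+AxisMappings=(AxisName="MoveForward",Scale=1.0,Key=MotionController_Left_Thumbstick_Y)
+AxisMappings=(AxisName="MoveRight",Scale=1.0,Key=MotionController_Left_Thumbstick_X)
```

Also make sure nothing is consuming the input first (e.g. a Pawn or widget with "Block Input" or a higher input priority), since keyboard working while motion-controller keys don't can point there too.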
Greetings Oculus forum, I'm running into an issue where any setup with custom hands jitters or stutters really badly. The sample framework projects show the same behavior for me: CustomHands, DistanceGrab, and a few others. Using the LocalAvatar prefab with the default hands seems to work OK, but I am not sure what the issue is. I am running the newest Oculus Integration and Unity 2018.3.3f1. I tried loading a blank project without touching any settings and I see the same issue. Anyone have some thoughts on this? Thanks

Pointing with CustomHand while near physical UI
Hello. :) I am trying to create a physical UI and want my hand to go into the pointing pose automatically where a rigidbody is added. I see that when the "Flex" parameter on the animator is 1.0f and the others are untouched, the hand can interact with physical elements. For the user's convenience, I want the hand to flex into a pointing pose automatically when it is near a physical UI, instead of requiring the grip button. In the Star Wars Vader Immortal series I really liked that the grip fingers gradually squeezed together as the hand moved closer to a button. I am using Unity 2018.4.8f1 together with Oculus 1.38.4 and VRTK, and I have a pinpad with buttons that use the directional joint drive from VRTK. I am fairly new to Unity and the Animator, since I am doing this for my master's thesis, but what I want is a zone near the pinpad and a script that automatically puts the hand into this "state". The pictures below show what I want to happen when the hand is near the physical UI.

Oculus Touch quick start guide - plus free skinned hand models
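For the auto-pointing question above, a rough sketch (untested) of the trigger-zone idea: a volume around the pinpad blends the custom hand's animator toward the point pose, and blends back out when the hand leaves. This assumes the sample hand animator has a point-pose layer named "Point Layer" and that the hand object is tagged "Hand"; both are assumptions to adjust for your setup. The gradual blend mimics the Vader Immortal-style squeeze rather than snapping.

```csharp
using UnityEngine;

// Attach to a trigger collider placed around the pinpad.
public class AutoPointZone : MonoBehaviour
{
    [SerializeField] Animator handAnimator; // animator on the custom hand
    [SerializeField] float blendSpeed = 8f; // pose blend rate per second

    int pointLayer;
    float target; // 1 = pointing, 0 = free hand

    void Start()
    {
        pointLayer = handAnimator.GetLayerIndex("Point Layer");
    }

    void OnTriggerEnter(Collider other) { if (other.CompareTag("Hand")) target = 1f; }
    void OnTriggerExit(Collider other)  { if (other.CompareTag("Hand")) target = 0f; }

    void Update()
    {
        // Gradually blend the point-pose layer in or out.
        float w = handAnimator.GetLayerWeight(pointLayer);
        w = Mathf.MoveTowards(w, target, blendSpeed * Time.deltaTime);
        handAnimator.SetLayerWeight(pointLayer, w);
    }
}
```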
Hey, so you just got Touch? Fantastic. I thought I'd make a quick list of points to help you get going with Unity integration. This might be obvious stuff to others, but with the number of additional devs receiving Touch kits and Oculus continuing to hand them out, it might benefit some. And Cyber (or anyone else), please correct me if I'm wrong about the following:

1. First up, grab the latest Oculus Unity Utilities and Touch integration. This stuff is still evolving, so to avoid errors and snags, make sure you're up to date. See the ReadMe file in the OvrTouch folder for these points and more info.

2. In your project, once the Touch sample package is imported, bring in the Touch prefabs and connect up the right and left hand anchors. You can find the prefabs in the OvrTouch/Content/Hands/ folder. Select the prefabs one at a time and drag the appropriate HandAnchor transform to the Touch Anchor field on the prefab.

3. To prevent odd movement glitches and update issues, it's recommended that you parent the hand prefabs into the PlayerController hierarchy, preferably under the camera coordinate space. Under the TrackingSpace node seems to work well for me.

4. Set Edit -> Project Settings -> Time -> Fixed Timestep = 0.01111111 to reduce judder and hand-tracking latency.

5. Don't forget to try the sample scenes in the OvrTouch/Content folder: OvrTouchDemo and TapeMeasure in particular for a taste of the haptics capability.

6. Now to the fun stuff. I think Oculus prefers that we not use the blue sample hands in projects we distribute or show to others. I skinned up a pair of replacement hands you're welcome to use if you like. They're a bit higher resolution than the samples (and let's face it, they're a pair of white male hands, which may not suit you or your project!), but you can at least get going with them if you want to move away from the blue ones.
They look like this: [image in original post]. Get them from my dropbox here: https://www.dropbox.com/s/v71kr0p5ooaxvhz/HumanHands.zip?dl=0

In the zip file you should find the following items:

r_hand_skeletal.fbx <-- replace the identical item in your OvrTouch/Content/Hands/Models/ folder with this
l_hand_skeletal.fbx <-- replace the identical item in your OvrTouch/Content/Hands/Models/ folder with this
humanHand_3D_COLOR.jpg <-- put this wherever you prefer to keep textures
humanHand_3D_NRM.jpg <-- put this wherever you prefer to keep textures
humanHandSkin.mat <-- put this wherever you prefer to keep materials

These hands come from a Blender model created by SuperDasil (http://www.blendswap.com/blends/view/81285), licensed under Creative Commons. I have remodeled them a bit to match the Oculus Touch sample joint proportions and capped the ends to be a little more pleasing to the eye. Should updated Oculus Touch samples come along with significant changes, I can update these to match.

7. Finally, you'll want to grab things in your own project, huh? Items you wish to interact with need to be rigidbody physics objects, which means you'll want to set up the usual components required for that. This may also mean you need to set up colliders etc. so your objects don't fall through the floor when you put them down (or throw them for fun). Simply add the supplied Grabbable.cs script (OvrTouch/Script/Hands/) to your object and you should be good to go.

That should get you up and running. If anyone else has useful info to share, please do, the more the merrier. -julian

How to get input from the Oculus Touch analog stick in C#
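As a companion to steps 4 and 7 of the quick start above, a small sketch (untested): the fixed timestep can also be set from code, and a grabbable prop just needs a Rigidbody, a Collider, and the sample Grabbable component. GrabbableSetup is a name invented for this example; Grabbable is the script the guide points to in OvrTouch/Script/Hands/.

```csharp
using UnityEngine;

// Attach to any prop you want to be grabbable/throwable.
public class GrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        // Step 4: ~90 Hz physics to reduce judder and hand-tracking latency.
        Time.fixedDeltaTime = 0.01111111f;

        // Step 7: ensure the object has the components grabbing needs.
        if (GetComponent<Collider>() == null) gameObject.AddComponent<BoxCollider>();
        if (GetComponent<Rigidbody>() == null) gameObject.AddComponent<Rigidbody>();
        if (GetComponent<Grabbable>() == null) gameObject.AddComponent<Grabbable>();
    }
}
```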
I am trying to figure out how to read the analog stick so I can drive footstep sounds by the amount the player is moving. I don't want the steps to be inconsistent, so I want the sounds to slow down or speed up depending on how far the player pushes the stick. I have no idea how to write the code for this. Can anyone point me in the right direction? [Solved]
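One way to approach this (a sketch, untested): OVRInput.Get with Axis2D.PrimaryThumbstick returns the stick as a Vector2, and its magnitude (0 at rest, 1 at full deflection) can scale the interval between footstep sounds. FootstepAudio and baseStepInterval are names invented for this example.

```csharp
using UnityEngine;

// Plays footstep sounds at a rate proportional to thumbstick deflection.
public class FootstepAudio : MonoBehaviour
{
    [SerializeField] AudioSource footstepSource;  // one-shot footstep clip
    [SerializeField] float baseStepInterval = 0.8f; // seconds between steps at full tilt

    float nextStepTime;

    void Update()
    {
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        float speed = stick.magnitude; // 0 = idle, 1 = full deflection

        if (speed > 0.1f && Time.time >= nextStepTime)
        {
            footstepSource.Play();
            // Faster movement -> shorter interval between steps.
            nextStepTime = Time.time + baseStepInterval / speed;
        }
    }
}
```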