Hand tracking right index pinch counts as button press?
So I have a VRTK (3.3) teleporter pointer set up to work with the left controller's Button One press. When I set up hand tracking using the OVRHandPrefab and set the OVR Manager to "Hands Only", for some crazy reason doing a pinch with my right hand activates the teleporter. I cannot figure out why!! The only scripts related to hand tracking I have are on the prefab. And really, I want to know HOW it's simulating the button press / activating the teleporter, so I can use pinches to simulate other button presses! Anyone have a clue how/why this is happening?

[Unity] Instantiate prefab while selecting an object
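Regarding the pinch question: as far as I know, when hands are the active input, OVRInput can report index pinches as controller button presses, which would explain VRTK reacting, though I haven't verified exactly which button it maps to. If you want to drive your own actions from pinches explicitly instead of relying on that implicit mapping, the Oculus Integration exposes pinch state on OVRHand. A minimal sketch (component and field names here are illustrative, not from the original post):

```csharp
using UnityEngine;

// Illustrative sketch: read pinch state from OVRHand (Oculus Integration)
// to drive a custom "button press" instead of the implicit OVRInput mapping.
public class PinchButton : MonoBehaviour
{
    public OVRHand hand; // assign the OVRHandPrefab's OVRHand component here
    private bool _wasPinching;

    void Update()
    {
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Fire once on the frame the pinch starts, like GetDown for a button.
        if (isPinching && !_wasPinching)
        {
            Debug.Log("Index pinch started - simulate your button press here");
        }
        _wasPinching = isPinching;
    }
}
```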
Hi, how can I instantiate a grabbable prefab when I select a grabbable object? For example, I would like to instantiate a sphere on my hand after I grab a cube, without moving the cube. The instantiated sphere will be a grabbable object of course, so I can interact with the world or simply throw it 😄

Problem: Position constraints of an object with HandTracking (using "OneGrabTranslateTransformer")
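One possible shape for this, as a sketch: a small component whose public method you wire to whatever "grab begin" event your interaction SDK exposes (the component, field names, and the event wiring are all assumptions, not a specific Oculus/VRTK API):

```csharp
using UnityEngine;

// Illustrative sketch: spawn a grabbable prefab at the other hand when this
// object is grabbed, without moving the grabbed object itself. Hook
// SpawnInHand() up to your SDK's grab-begin event (e.g. via a UnityEvent).
public class SpawnOnGrab : MonoBehaviour
{
    public GameObject spherePrefab;   // must itself carry the grabbable components
    public Transform otherHandAnchor; // e.g. the opposite hand's anchor transform

    public void SpawnInHand()
    {
        // The cube being grabbed is untouched; only a new sphere is created.
        Instantiate(spherePrefab, otherHandAnchor.position, otherHandAnchor.rotation);
    }
}
```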
(This message is a little long because I try to explain things as well as possible. Don't panic, the problem comes down to the question ^^) I also want to say that if you have another solution to this problem, that is, another method than mine, I'm interested too. The most important thing is that I can implement what I want, not necessarily the way I imagine it! Thanks in advance for your answers.

Hello,

In a VR application where the player uses HandTracking, I want to let him take an object out of a bag (the bag is hung on a wall to avoid complicating the problem, and it can't be grabbed). Of course, I don't want the object to pass through the bag, so I want to force the player to remove the object from the bottom. For this, I used the "One Grab Translate Transformer" script, which gave me the result I wanted. I then added a Cube with a Box Collider, with the "Is Trigger" option checked, to trigger an event when the object comes in contact with it. The event resets the values of "One Grab Translate Transformer" to 0, disabling the constraints. And it works! But unfortunately there is a problem: after this event, I can move my object in space on the X, Y and Z axes, but my object does not rotate anymore. Why not? Here is a video of the problem: https://youtu.be/U9RYa2u8lsQ

In this video (https://www.youtube.com/watch?v=PU8SQ2Obviw), I saw that using a "Box Collider" can prevent objects from passing through other objects, but this doesn't seem to apply to hand-held objects. I'm probably wrong, though, so maybe there is a simpler solution on this side.

---- Here is a last video using the objects from the official Oculus example scene, showing what I want to avoid with a box.
Since, in the end, my problem can apply to any box in any situation: https://youtu.be/WxSXLrMELXw

---- Finally, here is the code. For the "OneGrabTranslateTransformer" script, I didn't change anything except adding getters and setters for the constraints:

```csharp
public class OneGrabTranslateTransformer : MonoBehaviour, ITransformer
{
    [Serializable]
    public class OneGrabTranslateConstraints
    {
        public bool constraintsAreRelative;
        public FloatConstraint minX;
        public FloatConstraint maxX;
        public FloatConstraint minY;
        public FloatConstraint maxY;
        public FloatConstraint minZ;
        public FloatConstraint maxZ;

        public bool ConstraintsAreRelative { get => constraintsAreRelative; set => constraintsAreRelative = value; }
        public FloatConstraint MinX { get => minX; set => minX = value; }
        public FloatConstraint MaxX { get => maxX; set => maxX = value; }
        public FloatConstraint MinY { get => minY; set => minY = value; }
        public FloatConstraint MaxY { get => maxY; set => maxY = value; }
        public FloatConstraint MinZ { get => minZ; set => minZ = value; }
        public FloatConstraint MaxZ { get => maxZ; set => maxZ = value; }
    }
    …
```

---- For the "FloatConstraint" script, I added a "resetValue()" function that resets the constraint values to 0:

```csharp
namespace Oculus.Interaction
{
    [Serializable]
    public class FloatConstraint
    {
        public bool Constrain = false;
        public float Value = 0.0f;

        public void resetValue()
        {
            this.Constrain = false;
            this.Value = 0.0f;
        }
    }
}
```

---- In the trigger cube script, I simply used the "OnTriggerEnter" function, checking the object's tag so the script isn't triggered by just anything:

```csharp
public class EnterTriggerCagoule : MonoBehaviour
{
    public DisableConstraint disable;
    // Note: this field hides the inherited Component.tag member.
    public string tag = "";

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(tag))
        {
            disable.disableConstraint();
        }
    }
}
```

---- Finally, this is what my disableConstraint() function looks like:

```csharp
public void disableConstraint()
{
    constraints.MinX.resetValue();
    constraints.MaxX.resetValue();
    constraints.MinY.resetValue();
    constraints.MaxY.resetValue();
    constraints.MinZ.resetValue();
    constraints.MaxZ.resetValue();
}
```

As you can see, my scripts are not very complicated. I NEVER touch the rotation values, and I never apply constraints to them, so I am surprised by the problem. Hopefully someone can help me. Sincerely, Xameal

Converting Touch Controller Position To HandTracking Pinch Position
Hello, I have a script in one of my games that creates an object when you squeeze the trigger on the Touch controller. I want to do the same thing again, but this time with the new hand-tracking controls. I have hand tracking fully implemented and button presses seem to work great, but I can't figure out pinch gestures, or how to get the location of the tip of the index finger. I think it probably has something to do with this:

```csharp
PointerPose.localPosition = _handState.PointerPose.Position.FromFlippedZVector3f();
PointerPose.localRotation = _handState.PointerPose.Orientation.FromFlippedZQuatf();
```

Example code to show my current method:

```csharp
void Update()
{
    var hand = GetComponent<OVRHand>();
    //bool isIndexFingerPinching = hand.GetFingerIsPinching(HandFinger.Index);

    // These need to become left/right pinch checks instead
    bool bDownLeft = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch);
    bool bDownRight = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

    if (bDownLeft)
    {
        // This needs to be the vector location of Hand_IndexTip instead
        Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch));
        Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch).eulerAngles);
        Instantiate(myPrefab, position, Quaternion.Euler(rotation));
    }

    if (bDownRight)
    {
        Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch));
        Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch).eulerAngles);
        Instantiate(myPrefab, position, Quaternion.Euler(rotation));
    }
}
```

Mirror Finger/Hand Movement
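For what it's worth, the index-fingertip position can be read from OVRSkeleton's bone list rather than from PointerPose. A sketch, assuming the OVRHandPrefab has both OVRHand and OVRSkeleton components on it (the component name and fields below are mine, not from the original post):

```csharp
using UnityEngine;

// Sketch: replace the controller trigger checks with OVRHand pinch detection
// and use the OVRSkeleton Hand_IndexTip bone transform as the spawn position.
public class PinchSpawner : MonoBehaviour
{
    public OVRHand hand;         // OVRHand on the OVRHandPrefab
    public OVRSkeleton skeleton; // OVRSkeleton on the same prefab
    public GameObject myPrefab;
    private bool _wasPinching;

    void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        if (pinching && !_wasPinching) // edge-trigger, like GetDown
        {
            foreach (var bone in skeleton.Bones)
            {
                if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
                {
                    // Spawn at the fingertip bone's transform.
                    Instantiate(myPrefab, bone.Transform.position, bone.Transform.rotation);
                    break;
                }
            }
        }
        _wasPinching = pinching;
    }
}
```

Depending on how the skeleton is parented in your rig, you may still need to transform the bone position through your tracking space, as the original code does for the controllers.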
Hello there, I am currently working on a program that's supposed to support regular therapy for people with nerve lesions. For that I created a program with different functions. The last phase is supposed to show both hands to the user, but while flexing the fingers of the right hand, the fingers of the left hand should flex as well. The positions of the hands in the room are supposed to stay independent, though. I am working with:
- Unity
- Oculus Integration
- Oculus Rift

Would be awesome if someone can help me. Much love, Max
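One way to approach this, as a rough sketch: each frame, copy the finger bones' local rotations from the tracked right hand onto a separate left-hand model, so finger flexion transfers while each hand's root position stays independent. The component and fields below are illustrative, and the bone arrays are assumed to be in matching order:

```csharp
using UnityEngine;

// Illustrative sketch: mirror finger flexion from the tracked right hand onto
// a left-hand model by copying local bone rotations in LateUpdate (after the
// tracking data has been applied), without touching either hand's root.
public class MirrorFingers : MonoBehaviour
{
    public Transform[] rightHandBones; // finger bones of the tracked right hand
    public Transform[] leftHandBones;  // matching finger bones of the left model

    void LateUpdate()
    {
        for (int i = 0; i < rightHandBones.Length && i < leftHandBones.Length; i++)
        {
            // Local rotations carry the flexion; world position is unaffected.
            leftHandBones[i].localRotation = rightHandBones[i].localRotation;
        }
    }
}
```

Note that depending on how the left-hand rig is mirrored, some rotation axes may need to be flipped for the motion to look correct; that part is rig-specific.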