Problem: Position constraints of an object with HandTracking (using "OneGrabTranslateTransformer")
(This message is a little long because I try to explain the problem as well as possible — don't panic, it all comes down to the question ^^) I also want to say that if you have another solution to this problem, that is, a different method than mine, I'm interested too. The most important thing is that I can implement what I want, not necessarily the way I imagine it! Thanks in advance for your answers.

Hello,

In a VR application where the player uses HandTracking, I want to let him take an object out of a bag. (The bag is hung on a wall to keep the problem simple, and it cannot be grabbed.) Of course, I don't want the object to pass through the bag, so I want to force the player to remove the object from the bottom. For this I used the "One Grab Translate Transformer" script, which gave me the result I wanted. I then added a Cube with a Box Collider, with "Is Trigger" checked, to fire an event when the object comes in contact with it. The event resets the values of "One Grab Translate Transformer" to 0, disabling the constraints. And it works!

But unfortunately there is a problem. After this event I can still move my object in space on the X, Y and Z axes, but my object no longer rotates! Why not? Here is a video of the problem: https://youtu.be/U9RYa2u8lsQ

In this video (https://www.youtube.com/watch?v=PU8SQ2Obviw), I saw that a Box Collider can prevent objects from passing through other objects, but this doesn't seem to apply to hand-held objects. I'm probably wrong, though, so maybe there is a simpler solution on that side.

----

Here is a last video, using the objects from the official Oculus example scene, showing what I want to avoid with a box.
Since in the end my problem can apply to any box, in any situation: https://youtu.be/WxSXLrMELXw

----

Finally, here is the code. In the "OneGrabTranslateTransformer" script, I didn't change anything except adding getters and setters for the constraints:

```csharp
public class OneGrabTranslateTransformer : MonoBehaviour, ITransformer
{
    [Serializable]
    public class OneGrabTranslateConstraints
    {
        public bool constraintsAreRelative;
        public FloatConstraint minX;
        public FloatConstraint maxX;
        public FloatConstraint minY;
        public FloatConstraint maxY;
        public FloatConstraint minZ;
        public FloatConstraint maxZ;

        public bool ConstraintsAreRelative
        {
            get => constraintsAreRelative;
            set => constraintsAreRelative = value;
        }
        public FloatConstraint MinX { get => minX; set => minX = value; }
        public FloatConstraint MaxX { get => maxX; set => maxX = value; }
        public FloatConstraint MinY { get => minY; set => minY = value; }
        public FloatConstraint MaxY { get => maxY; set => maxY = value; }
        public FloatConstraint MinZ { get => minZ; set => minZ = value; }
        public FloatConstraint MaxZ { get => maxZ; set => maxZ = value; }
    }
    …
```

----

In the "FloatConstraint" script, I added a "resetValue()" function that resets the constraint values to 0:

```csharp
namespace Oculus.Interaction
{
    [Serializable]
    public class FloatConstraint
    {
        public bool Constrain = false;
        public float Value = 0.0f;

        public void resetValue()
        {
            this.Constrain = false;
            this.Value = 0.0f;
        }
    }
}
```

----

In the trigger cube script, I simply used the "OnTriggerEnter" function, checking the object's tag so the script isn't triggered by just anything:

```csharp
public class EnterTriggerCagoule : MonoBehaviour
{
    public DisableConstraint disable;
    public string tag = "";

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(tag))
        {
            disable.disableConstraint();
        }
    }
}
```

----

Finally, this is what my disableConstraint() function looks like:

```csharp
public void disableConstraint()
{
    constraints.MinX.resetValue();
    constraints.MaxX.resetValue();
    constraints.MinY.resetValue();
    constraints.MaxY.resetValue();
    constraints.MinZ.resetValue();
    constraints.MaxZ.resetValue();
}
```

As you can see, my scripts are not very complicated, and I NEVER touch the rotation values or apply constraints to them. So I am surprised by the problem. Hopefully someone can help me.

Sincerely, Xameal.

Converting Touch Controller Position To HandTracking Pinch Position
Hello, I have a script in one of my games that creates an object when you squeeze the trigger on the Touch controller. I want to do the same thing again, but this time with the new hand tracking controls. I have hand tracking fully implemented and button presses seem to work great, but I can't figure out pinch gestures or how to get the location of the tip of the index finger. I think it probably has something to do with this, though:

```csharp
PointerPose.localPosition = _handState.PointerPose.Position.FromFlippedZVector3f();
PointerPose.localRotation = _handState.PointerPose.Orientation.FromFlippedZQuatf();
```

Example code to show my current method:

```csharp
void Update()
{
    var hand = GetComponent<OVRHand>();
    //bool isIndexFingerPinching = hand.GetFingerIsPinching(HandFinger.Index);

    // These need to become "if left/right pinch"
    bool bDownLeft = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch);
    bool bDownRight = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

    if (bDownLeft)
    {
        // This needs to be the vector location of Hand_IndexTip
        Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch));
        Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch).eulerAngles);
        Instantiate(myPrefab, position, Quaternion.Euler(rotation));
    }

    if (bDownRight)
    {
        Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch));
        Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch).eulerAngles);
        Instantiate(myPrefab, position, Quaternion.Euler(rotation));
    }
}
```

Need to re-calibrate viewing like in initial set up!
When doing the initial set-up I had the headset positioned wrongly on my head, so the calibration is all out of whack. Now that I've figured out how the headset actually fits, I can't get the screen to stop being blurry when it is positioned correctly. How can I re-calibrate the view like in the initial set-up (where the volume button was also used in calibration)?

[Unity] Fixed start position after game launched
Hello everyone, I'm trying to create a game based on the player's environment and guardian area; it's a B2B experience. I want the zero position of my game/scene to always be at the center of the guardian or, at least, at the point where we fixed the floor with the Oculus Touch. I've defined my tracking space as RoomScale, so the floor is at the correct height:

```csharp
XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale);
```

I'm using the OVRCameraRig from the Oculus Integration assets and have disabled recentering (I don't want my player to reset their center). The problem is that I can't set the zero position of my world by hand, and I can't find any option or documentation that would help. I believe this is possible; I've seen that Dead and Buried Arena did it at Oculus Connect 5: https://www.youtube.com/watch?v=H713WDWTUDo

I will take any help, please,

Curtis.
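One approach that might work for aligning the scene origin with the guardian center (a sketch, not a tested solution): query the play-area geometry through `OVRBoundary`, compute its centroid in tracking space, and offset the rig so that point lands on the world origin. The component below is hypothetical; `OVRManager.boundary`, `GetConfigured()` and `GetGeometry()` are from the Oculus Integration, but the exact behavior should be checked against your SDK version, and the boundary data is only available once the guardian is set up.

```csharp
using UnityEngine;

// Hypothetical helper: shifts the OVRCameraRig so the guardian's
// play-area center coincides with the world origin at startup.
public class AlignOriginToGuardian : MonoBehaviour
{
    public OVRCameraRig cameraRig; // assign the rig from the scene

    void Start()
    {
        OVRBoundary boundary = OVRManager.boundary;
        if (boundary == null || !boundary.GetConfigured())
            return; // guardian not set up yet

        // Play-area corner points, expressed in tracking space.
        Vector3[] points = boundary.GetGeometry(OVRBoundary.BoundaryType.PlayArea);
        if (points == null || points.Length == 0)
            return;

        // Centroid of the play area (tracking space).
        Vector3 center = Vector3.zero;
        foreach (Vector3 p in points)
            center += p;
        center /= points.Length;

        // Where that centroid currently sits in world space...
        Vector3 worldCenter = cameraRig.trackingSpace.TransformPoint(center);

        // ...then shift the whole rig so it coincides with world zero,
        // keeping the RoomScale floor height untouched.
        cameraRig.transform.position -= new Vector3(worldCenter.x, 0f, worldCenter.z);
    }
}
```

Running this once in `Start()` (after tracking is up) would make world (0, 0, 0) the guardian center for the rest of the session, without ever calling recenter.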