Forum Discussion
PrimeDerektive
9 years ago · Explorer
Debug position of touch child objects not matching "real" position
This is difficult to describe but easy to see.
Basically, I have some "hands" controlled by Touch using OVRInput.GetLocalControllerPosition/Rotation, one of which has a sword child object. At the tip of the sword is another child object, to which I've added a sphere mesh to demonstrate the issue.
I intend to use this object's position to cast a ray, but when I use Debug.DrawRay(transform.position, transform.forward, Color.red) in a script attached to the object, you can see the results are nowhere near the actual object (which is getting rendered in its correct position).
I'm a bit stumped here. Anyone have any idea what's going on?
1 Reply
Replies have been turned off for this discussion
- vrdaveb (Oculus Staff): OVRInput.GetLocalControllerPosition/Rotation reports poses in tracking space, not world space. Debug.DrawRay should work properly if you use OVRCameraRig.leftHandAnchor.position/rotation instead of calling the above OVRInput functions.
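A minimal sketch of that suggestion. The `cameraRig` field and the choice of `LTouch` are assumptions for illustration; `OVRCameraRig.trackingSpace` and `leftHandAnchor` are the anchors the reply refers to:

```csharp
using UnityEngine;

public class SwordTipRay : MonoBehaviour
{
    // Assumed: assign the scene's OVRCameraRig in the Inspector.
    public OVRCameraRig cameraRig;

    void Update()
    {
        // Using the anchor transform directly: its position/forward are
        // already in world space, so DrawRay lines up with the rendered hand.
        Debug.DrawRay(cameraRig.leftHandAnchor.position,
                      cameraRig.leftHandAnchor.forward, Color.red);

        // Alternatively, convert a tracking-space pose from OVRInput into
        // world space by transforming it through the rig's trackingSpace.
        Vector3 localPos =
            OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
        Vector3 worldPos = cameraRig.trackingSpace.TransformPoint(localPos);
        Debug.DrawRay(worldPos, cameraRig.leftHandAnchor.forward, Color.green);
    }
}
```

If the sphere at the sword tip is a child of the hand anchor, its own `transform.position` is likewise already in world space, so drawing the ray from that transform should also match the rendered position.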