Oculus Distorts Raycast

Metagen
Honored Guest
So essentially I'm creating an on-rails first person shooter using the Oculus and Leap Motion.
Adding the Oculus causes the raycast that controls the hit detection to veer off target. I've found a couple of threads that talk about this problem but none that sufficiently solve my issue.

The setup is that the weapons and shooting script are attached to the right camera. The ray is cast using a simple ScreenPointToRay call:

// Get the stabilized palm position from the Leap frame.
InteractionBox iBox = frame.InteractionBox;
HandList hands = frame.Hands;
Hand firstHand = hands[0];
Vector handCenter = firstHand.StabilizedPalmPosition;

// Map the palm into normalized InteractionBox space (0..1),
// then scale up to screen pixels.
Vector normalizedPosition = iBox.NormalizePoint(handCenter);
float x = normalizedPosition.x * screenWidth;
float y = normalizedPosition.y * screenHeight;

// Cast a ray through the screen point under the hand.
Ray ray = Camera.main.ScreenPointToRay(new Vector2(x, y));
RaycastHit hit;
if (Physics.Raycast(ray, out hit, Range))
{
    // hit stuff
}

with x and y being the normalized positions of the player's hand scaled to screen coordinates; only the right camera is currently tagged as MainCamera.

It might also be worth noting that the crosshair solution I'm currently using is the VRGUI solution created by boone188 on these forums (available here: https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=4944) and that the crosshair controls are currently working independently of the hit detection.

So what I want to know is: is there any way to negate the distortion effect?

owenwp
Expert Protege
You should avoid doing math in screen space, because it doesn't meaningfully correlate to a stereoscopic view. Try mapping your Leap Motion data into 3D space somewhere in front of the player (this should probably be independent of where the player is looking), then do your raycast in 3D space.

It is very important in VR to think of UI in terms of positioning things in space, rather than plotting points on a screen, because if the optics are done right then the user isn't even aware of the screen's existence.
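
Roughly, that mapping might look like this (just a sketch; playerRoot, headTransform, and the 1.0f / 0.5f placement values are assumptions for illustration, not anything from the Leap or Oculus APIs):

// Recenter the 0..1 InteractionBox coordinates around zero.
Vector normalized = iBox.NormalizePoint(firstHand.StabilizedPalmPosition);
Vector3 offset = new Vector3(normalized.x - 0.5f, normalized.y - 0.5f, normalized.z - 0.5f);

// Pin the interaction region 1 m in front of the player rig (not the head),
// so it stays put while the player looks around.
Vector3 regionCenter = playerRoot.position + playerRoot.forward * 1.0f;
Vector3 handWorldPos = regionCenter + offset * 0.5f; // region ~0.5 m across

// Raycast entirely in world space, from the head through the hand.
Vector3 rayDir = (handWorldPos - headTransform.position).normalized;
RaycastHit hit;
if (Physics.Raycast(headTransform.position, rayDir, out hit, Range))
{
    // hit stuff
}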

LeapMotion
Honored Guest
@Metagen owenwp is correct. The Oculus displays a conical output that distorts dimensions, so it is very important in VR to think of UI in terms of positioning things in space, rather than plotting points on a screen.

Metagen
Honored Guest
I've updated the code above to show my original approach. I'm using the InteractionBox to detect the hand position, so I think that should translate easily enough to a raycast that detects hits accurately, but I can't seem to get it working even without using ScreenPointToRay.

My last attempt used:

Vector3 direction = cameraTransform.TransformDirection(Vector3.forward);
Vector2 cursorPos = new Vector2(x, y);
NextFire = Time.time + FireRate;
RaycastHit hit;
if (Physics.Raycast(cursorPos, direction, out hit, Range))
{
    // hit stuff
}


Any suggestions?

owenwp
Expert Protege
There are a few problems with that code. You are casting all rays in the same direction, which would only work with an orthographic camera. And your rays originate at some X/Y position in world space, which is not correlated with the position of the camera.

What you need is to fire rays from the camera position, in the direction of the hand relative to the camera.
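
For instance, something along these lines (a sketch only; the 0.5f spread factor and the one-unit forward push are made-up tuning values, and x/y are the normalized hand coordinates from your earlier snippets):

// Place the hand in camera-local space: recenter the normalized x/y
// around zero and push the point one unit in front of the lens.
Vector3 handLocal = new Vector3((x - 0.5f) * 0.5f, (y - 0.5f) * 0.5f, 1.0f);
Vector3 handWorld = cameraTransform.TransformPoint(handLocal);

// Fire from the camera, toward the hand's position relative to the camera.
Vector3 rayDir = (handWorld - cameraTransform.position).normalized;
RaycastHit hit;
if (Physics.Raycast(cameraTransform.position, rayDir, out hit, Range))
{
    // hit stuff
}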

Metagen
Honored Guest
Okay, so I've progressed to the point where I'm sure it's at least tracking my hand movements mostly correctly. However, the raycast doesn't follow the camera's movements exactly as it should, and there still seems to be some sort of distortion: straight up-and-down movement produces a curve, and the ray sits off-center to the sides. Even the Debug.DrawRay doesn't match up with where the hit markers appear.

// Normalized (0..1) hand position inside the InteractionBox.
Vector normalizedPosition = iBox.NormalizePoint(handCenter);
float x = normalizedPosition.x;
float y = normalizedPosition.y;
float z = normalizedPosition.z;

Vector3 cameraPos = transform.position;
Vector3 cameraDir = transform.forward;
Vector3 cursorPos = new Vector3(x, y, z);
Vector3 direction = cameraDir + cursorPos;

// "Denormalize" back up to screen dimensions.
float directionX = direction.x * screenWidth;
float directionY = direction.y * screenHeight;
float directionZ = direction.z * screenWidth;
Vector3 rayDirection = new Vector3(directionX, directionY, directionZ);

RaycastHit hit;
if (Physics.Raycast(cameraPos, rayDirection, out hit, Range))
{
    // hit stuff
}


Honestly, I'm absolutely stumped by this :oops:

vajra3d
Honored Guest
"Metagen" wrote:

as well as being off-center to the sides.


Have you taken the right camera's off-centered position into account?
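
If not, one way to check (a sketch; leftCamera and rightCamera are assumed references to the rig's two eye camera transforms, and handWorld is the world-space hand position from the snippets above) is to cast from the midpoint between the eyes instead of the off-center right eye:

Vector3 centerEye = (leftCamera.position + rightCamera.position) * 0.5f;
Vector3 rayDir = (handWorld - centerEye).normalized;
RaycastHit hit;
if (Physics.Raycast(centerEye, rayDir, out hit, Range))
{
    // hit stuff
}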

Metagen
Honored Guest
"vajra3d" wrote:

Have you taken the right camera's off-centered position into account?

How would I negate the off-center positioning?

I think there's something more to it than that: looking at it again this morning, I noticed it doesn't give me the full upward/downward range of movement either. So there is tracking going on, in that it reacts to head movements by moving the area the raycast is active in, but the range of motion is limited and inaccurate.

The first approach gives me more accurate results, as well as the proper range of motion, but I think it's probably fundamentally flawed.

Metagen
Honored Guest
So I've implemented a really hacky fix (sort of).

if (x > 0.5f)
    ray = Camera.main.ScreenPointToRay(new Vector2(640 + x - (320 - Mathf.Abs(x)) / 5, y));
else if (x < 0.5f)
    ray = Camera.main.ScreenPointToRay(new Vector2(640 - x - (320 - Mathf.Abs(x)) / 5, y));
else
    ray = Camera.main.ScreenPointToRay(new Vector3(x, y, z));


The above is coupled with a spherecast rather than a raycast, in an attempt to mask the inaccuracy of the system. It still veers off to the left by an increasing amount the further you move your hand from the center of the screen.
While it is certainly hitting more now, and is a lot more usable than before, it is far from an optimal solution.

Can anyone suggest a way to improve the above equation, perhaps a way to proportionally change the values based on how far off target it is?