ScreenPointToRay and ViewportPointToRay not working with VR camera
X-posting from the Unity VR forums: https://forum.unity3d.com/threads/screenpointtoray-and-viewportpointtoray-not-working-with-vr.471440/

Environment:
- Unity 5.6
- VR is active and rendering to the HMD

My goal is to raycast from the mouse into the scene with a VR camera. When using ScreenPointToRay on a VR camera, the ray's direction is way off. Here's an image when my cursor was in the center:

When using ViewportPointToRay the ray's direction is much better, but still not perfect. Here's an image showing the difference (sorry for the mouse pic, my screen capturer wasn't picking up the mouse):

Here's the test code to reproduce in a clean Unity 5.6 project and scene:

```csharp
using UnityEngine;

public class Raycaster : MonoBehaviour
{
    public Camera sourceCamera;
    public bool useScreenPoint;
    public bool useViewport;
    public Transform viewportSphere;
    public float sphereDistance;

    private void Update()
    {
        Vector3 mousePos = Input.mousePosition;
        Ray ray = new Ray();

        if (useScreenPoint)
        {
            ray = sourceCamera.ScreenPointToRay(mousePos);
        }
        else if (useViewport)
        {
            float normalWidth = mousePos.x / sourceCamera.pixelWidth;
            float normalHeight = mousePos.y / sourceCamera.pixelHeight;
            Vector3 viewMousePos = new Vector3(normalWidth, normalHeight, 0);
            ray = sourceCamera.ViewportPointToRay(viewMousePos);

            if (viewportSphere != null)
            {
                viewportSphere.transform.position = ray.origin + ray.direction * sphereDistance;
            }
        }

        Debug.DrawRay(ray.origin, ray.direction, Color.red);
    }
}
```

In these screenshots I'm using a single camera set to both eyes. However, I have repro'd the same problem when I switch to separate left- and right-eye cameras too (that's the setup in the actual project I'm working on).

To visualize the use case: there is one person in VR, and one person not in VR who is on the PC using the mouse to click on things the VR person is looking at.
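For reference, a monoscopic mouse ray can also be built by hand from the camera's pose and vertical FOV, bypassing the VR-adjusted projection matrix entirely. This is only a sketch assuming a symmetric frustum; note that with VR active, `fieldOfView` and `aspect` may report HMD values rather than the monitor's, in which case they would need to be supplied explicitly. (Newer Unity versions also add a `ScreenPointToRay` overload taking a `Camera.MonoOrStereoscopicEye` argument, where passing `Mono` is meant to address exactly this, but that is not available in 5.6.)

```csharp
using UnityEngine;

// Sketch: construct a mouse ray from the camera transform and monoscopic FOV.
// Assumes a symmetric projection; in VR, cam.fieldOfView/cam.aspect may not
// match the monitor view and may need to be hard-coded instead.
public static class MonoRay
{
    public static Ray ScreenPointToRay(Camera cam, Vector3 screenPos)
    {
        // Normalize the screen position into the -1..1 range.
        float nx = (screenPos.x / cam.pixelWidth) * 2f - 1f;
        float ny = (screenPos.y / cam.pixelHeight) * 2f - 1f;

        // Half-angle tangents give the frustum extents at depth 1.
        float tanY = Mathf.Tan(cam.fieldOfView * 0.5f * Mathf.Deg2Rad);
        float tanX = tanY * cam.aspect;

        // Direction in camera space, rotated into world space.
        Vector3 localDir = new Vector3(nx * tanX, ny * tanY, 1f).normalized;
        return new Ray(cam.transform.position,
                       cam.transform.TransformDirection(localDir));
    }
}
```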
The simple solution would be to use another camera that renders only to the monitor. However, the project is very performance-constrained, and the new camera would be displaying exactly the same thing as the existing VR camera. Has anybody had to tackle this, or have any ideas on how to solve it without an extra camera?

[SOLVED] OVRCameraRig & Use Per Eye Cameras: Turning on and off cursor problem
Hi, I want to toggle the OVRCameraRig property "Use Per Eye Cameras" to display stereoscopic or monoscopic video: use a single camera for both eyes in the menu, then turn the "Use Per Eye Cameras" option on/off as needed. If I enable this option at runtime, the gaze cursor (OVRGazePointer) is no longer centered; it sits halfway between the center and the top border. If I enable the option in the Inspector instead, the cursor works fine until per-eye cameras are disabled and re-enabled. Do I need to switch the OVRInputModule's Ray Transform object? If so, to which one? At the moment I have it set to CenterEyeAnchor.

Cursor Lock unavailable?
Hello, I recently tried to update an old "0.6" app to the new "1.3" drivers. It's a seated experience where the user can look at various UI elements to 'click' them. For this, I lock the cursor in the middle via Cursor.lockState and just use the standard Unity UI (with an in-world Canvas). Everything worked fine before, and still works in the editor, but not once built. In a built app, if I have both the Oculus API and "Virtual Reality Supported" enabled (in Unity's Player settings), the lockState stays at "None", whatever my scripts try to do. Removing the Oculus plugin or unchecking "Virtual Reality Supported" puts things back to normal, but then the Oculus doesn't work. This is very problematic because it makes the whole "look at a button to press it" concept very uncomfortable: it forces the user to manually center the cursor at the app's start, and accidentally moving the mouse messes everything up. The problem appears even in a new, nearly empty project with just the Oculus plugin and a few scripts trying to change the cursor's state. Is this intended behavior, or a bug? Either way, does someone have a way to work around it? I could add Colliders to every UI element and perform ray tests from the camera, but that's basically recreating Unity's UI/EventSystem in a less optimized way.
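The collider-based fallback mentioned above could be sketched as follows. This is not a confirmed fix for the lockState issue, just an illustration of the idea: cast a ray along the camera's forward direction and trigger whatever component sits on the collider it hits. `GazeButton` here is a hypothetical marker component invented for the sketch, not part of Unity or the Oculus SDK.

```csharp
using UnityEngine;

// Hypothetical marker component: attach to each clickable UI element's collider.
public class GazeButton : MonoBehaviour
{
    public void Click()
    {
        Debug.Log("Clicked " + name);
    }
}

// Sketch of the collider-based fallback: raycast from the camera's gaze
// instead of relying on the (locked) mouse cursor and Unity's EventSystem.
public class GazeClicker : MonoBehaviour
{
    public Camera vrCamera;
    public float maxDistance = 10f;

    void Update()
    {
        Ray gaze = new Ray(vrCamera.transform.position, vrCamera.transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(gaze, out hit, maxDistance))
        {
            GazeButton button = hit.collider.GetComponent<GazeButton>();
            if (button != null)
            {
                // Replace this trigger with a dwell timer or input of your choice.
                button.Click();
            }
        }
    }
}
```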