Adding ray pointer from Oculus Hands to interact with UI
I am trying to add a ray pointer from Oculus Hands to interact with UI. I managed to get the ray as shown in the screen capture below. However, as you can see, the laser pointer starts from the wrist position and is aligned towards the right instead of pointing straight ahead. I would appreciate any suggestions on how to correct this so it points forward. Also, this laser pointer appears only on the right hand. Is there a way to change this? Please see my settings in Unity below.

Raycast as example "HandInteractionTrainScene"
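For the wrist-aligned hand ray described above, a minimal sketch of one common fix, assuming the Oculus Integration's OVRHand component: its PointerPose transform is the SDK's forward-facing pointing pose, so driving the ray from it instead of the wrist corrects the direction, and attaching one instance per hand covers the left hand too. The LineRenderer setup and field names below are illustrative.

```csharp
using UnityEngine;

// Attach one instance per hand and assign the corresponding OVRHand.
// PointerPose is the SDK-provided pointing pose, already oriented
// forward rather than along the wrist.
public class HandRayPointer : MonoBehaviour
{
    public OVRHand hand;        // OVRHand from the Oculus Integration
    public LineRenderer line;   // simple visual for the ray
    public float maxLength = 5f;

    private void Update()
    {
        if (hand == null || !hand.IsPointerPoseValid)
        {
            line.enabled = false;
            return;
        }

        Transform pose = hand.PointerPose;
        Vector3 end = pose.position + pose.forward * maxLength;

        RaycastHit hit;
        if (Physics.Raycast(pose.position, pose.forward, out hit, maxLength))
        {
            end = hit.point;
        }

        line.enabled = true;
        line.SetPosition(0, pose.position);
        line.SetPosition(1, end);
    }
}
```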
Hello everyone, I'm new to developing under Unity, and I have a question about raycasts. I use the Oculus Integration, of course. I would like to reproduce the parabolic raycast from the "HandInteractionTrainScene" example, the raycast that appears when approaching the windmills. Does anyone have an idea, or better, a tutorial? I have tried to understand it via the example, but I'm getting nowhere. Thanks in advance.

Hand Tracking Teleport
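Regarding the parabolic raycast question above: the Integration implements this inside its sample scripts, but the general technique is simple to roll by hand. Sample a ballistic arc in small time steps and linecast each segment until something is hit. A sketch, with illustrative parameter values:

```csharp
using UnityEngine;

// Casts short ray segments along a projectile-style arc and reports the
// first hit. This is the general technique behind curved teleport rays.
public class ParabolicRaycaster : MonoBehaviour
{
    public float initialSpeed = 5f;
    public float gravity = 9.81f;
    public int steps = 30;
    public float stepTime = 0.05f;

    public bool Cast(Vector3 origin, Vector3 direction, out RaycastHit hit)
    {
        Vector3 position = origin;
        Vector3 velocity = direction.normalized * initialSpeed;

        for (int i = 0; i < steps; i++)
        {
            Vector3 next = position + velocity * stepTime;
            velocity += Vector3.down * gravity * stepTime;

            // Test just this segment of the arc.
            if (Physics.Linecast(position, next, out hit))
                return true;

            Debug.DrawLine(position, next, Color.cyan);
            position = next;
        }

        hit = new RaycastHit();
        return false;
    }
}
```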
Hi, When using hand tracking to select an object, similar to selecting the windmills in the train example, is there a way to make the selection point larger? I.e., if I have a huge game object to select, I have to aim my raycast cursor at the middle of the object to be able to select it, rather than just anywhere on the collider. Any easy way to set this up? Thank you.

Laser Pointer instead of Gaze Pointer
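For the larger-selection-point question above, one low-effort option is to swap the plain raycast for a SphereCast, which effectively thickens the ray; another is simply to add a larger trigger collider around the object. A sketch of the SphereCast approach, with an illustrative radius:

```csharp
using UnityEngine;

// SphereCast sweeps a sphere along the ray instead of an infinitely
// thin line, so the cursor does not need to be exactly on the collider.
public class FatRaySelector : MonoBehaviour
{
    public float selectionRadius = 0.25f;
    public float maxDistance = 10f;

    public bool TrySelect(Ray ray, out GameObject selected)
    {
        RaycastHit hit;
        if (Physics.SphereCast(ray, selectionRadius, out hit, maxDistance))
        {
            selected = hit.collider.gameObject;
            return true;
        }
        selected = null;
        return false;
    }
}
```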
Hi All, In Unity I am trying to use a laser pointer driven by Oculus Touch instead of the gaze pointer present in the samples. The problem is that I don't know how to trigger events in UI components. Is there an example somewhere to get inspiration from? Thanks.

How do I get my OVRCameraRig to shoot the physics raycaster?
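On the laser-pointer UI question above: the Oculus Integration ships an OVRInputModule/OVRRaycaster pair for world-space canvases, which is the supported route. You can also drive UI events by hand with ExecuteEvents once your ray hits a UI element's collider. A hand-rolled sketch, assuming the UI elements have colliders attached:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Raycasts from a controller transform and forwards click events to any
// UI element (with a collider) under the ray.
public class LaserUIClicker : MonoBehaviour
{
    public Transform controller;   // e.g. the Touch controller anchor
    public float maxDistance = 10f;

    private void Update()
    {
        RaycastHit hit;
        if (!Physics.Raycast(controller.position, controller.forward,
                             out hit, maxDistance))
            return;

        // Fire a click on the frame the index trigger is pressed.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger))
        {
            var data = new PointerEventData(EventSystem.current);
            // Walk up the hierarchy until something handles the click.
            ExecuteEvents.ExecuteHierarchy(hit.collider.gameObject, data,
                                           ExecuteEvents.pointerClickHandler);
        }
    }
}
```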
Hi Forum, I am developing a physics raycaster from my OVRCameraRig in Unity. I have made a script, and if I change the camera to Camera.main the code runs, but nothing happens. I want it to shoot the ray from the OVRCameraRig in the OVRPlayerController so it's visible in VR. Maybe it's just a small thing I have to do to make it shoot from the camera rig, but I don't know how. Can anyone help me? I am not a pro at coding, so please bear with me :) I hope I posted this in the right forum.

Oculus index finger raycast
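For the OVRCameraRig question above, the usual fix is to raycast from the rig's CenterEyeAnchor transform rather than Camera.main, since that anchor tracks the HMD inside the OVRPlayerController. A sketch (assign the anchor in the Inspector; the path in the comment is the rig's standard hierarchy):

```csharp
using UnityEngine;

// Shoots a ray from the OVRCameraRig's center eye so it originates
// where the player is actually looking in VR.
public class RigRaycaster : MonoBehaviour
{
    public Transform centerEyeAnchor;  // OVRCameraRig/TrackingSpace/CenterEyeAnchor
    public float maxDistance = 20f;

    private void Update()
    {
        Ray ray = new Ray(centerEyeAnchor.position, centerEyeAnchor.forward);

        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            Debug.DrawLine(ray.origin, hit.point, Color.green);
        }
        else
        {
            Debug.DrawRay(ray.origin, ray.direction * maxDistance, Color.red);
        }
    }
}
```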
Can anyone shed some light on implementing the Oculus hands so that the index finger shoots out a beam while extended, for selecting UI buttons? I currently have selection working with a reticle and headset gaze, but I would like to use the hands instead to make it more user friendly. I am using Unity 2017.2. Any suggestions would help.

[Feature Request] Integrated Cylindrical Hit Testing if OVROverlay.currentOverlayShape == Cylinder
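On the index-finger beam question above: the Integration's OVRHand exposes a pointing pose and per-finger pinch state, but not a direct "is this finger extended" query, so a common approximation is to show the beam whenever the pointer pose is valid and treat an index pinch as the select gesture. That substitution is an assumption on my part. A sketch:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Points a ray from the hand's pointer pose and "clicks" UI (with a
// collider) when the index finger starts pinching. Pinch-as-select is
// an assumption; finger-extension detection isn't directly exposed.
public class FingerUISelector : MonoBehaviour
{
    public OVRHand hand;
    public float maxDistance = 5f;

    private bool wasPinching;

    private void Update()
    {
        if (hand == null || !hand.IsPointerPoseValid) return;

        Transform pose = hand.PointerPose;

        RaycastHit hit;
        if (!Physics.Raycast(pose.position, pose.forward, out hit, maxDistance))
            return;

        // Edge-detect the pinch so one pinch fires one click.
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        if (pinching && !wasPinching)
        {
            var data = new PointerEventData(EventSystem.current);
            ExecuteEvents.ExecuteHierarchy(hit.collider.gameObject, data,
                                           ExecuteEvents.pointerClickHandler);
        }
        wasPinching = pinching;
    }
}
```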
I've been in the process of trying to retrofit an existing UI using OVROverlay, and I've hit a lot of unnecessary friction dealing with the fact that the cylindrical warping applied to a RenderTexture still hit-tests against the original flat plane that the texture was generated from.

Ideally, this would just be handled automagically, like with the popular CurvedUI package from the Unity Asset Store: I create my UI using standard Canvas elements in World Space, maybe use an alternate Input Module or add some extra components to the Canvas, and my hit-testing script can just send raycasts straight from its forward vector and everything works exactly the same.

Presently, I'm copying the CylinderRayTransfer() function from OverlayUIDemo.cs, which is simplistic and assumes it will be used with Physics.Raycast, which allows me to manually specify an origin point, a direction, a distance, a LayerMask, etc. It's being shoehorned into existing hit-testing code that uses EventSystem.RaycastAll so that it can call ExecuteEvents.ExecuteHierarchy events like PointerEnter, PointerDrag and PointerClick. This means that it's expecting to simply receive a PointerEventData structure.

I'm only a little ways off from completing this, but I really shouldn't need to build a completely separate PointerEventData structure just to hit-test cylindrically warped UI elements vs. other ones. I should just be able to shoot a Ray out from the controller's forward vector and have it hit whatever it hits normally in World Space. I shouldn't need to pass in a reference to the cylinder's Transform to perform these calculations. I shouldn't need to be mathematically performing these calculations at all. These operations should be moved directly into OVROverlay and be totally abstracted away from an end-user's standpoint.

ScreenPointToRay and ViewportPointToRay not working with VR camera
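Until something like the feature requested above lands in OVROverlay, the underlying math for rolling it by hand is a ray/cylinder intersection followed by mapping the hit point back to flat UV space. A sketch in the cylinder's local space; the arc-angle and height parameterization is my assumption about how the overlay is described, not OVROverlay's actual API:

```csharp
using UnityEngine;

// Illustrative math for hit-testing a cylindrically warped UI: intersect
// the pointer ray with the cylinder, then convert the hit point back to
// the normalized UV coordinates of the flat source texture.
public static class CylinderHitTest
{
    // Ray origin/direction must already be in the cylinder's local space,
    // with the cylinder axis along Y and centered at the origin.
    public static bool Raycast(Vector3 origin, Vector3 dir, float radius,
                               float height, float arcAngleDeg, out Vector2 uv)
    {
        uv = Vector2.zero;

        // Solve |origin.xz + t * dir.xz|^2 = radius^2 for t.
        float a = dir.x * dir.x + dir.z * dir.z;
        float b = 2f * (origin.x * dir.x + origin.z * dir.z);
        float c = origin.x * origin.x + origin.z * origin.z - radius * radius;
        float disc = b * b - 4f * a * c;
        if (a < 1e-6f || disc < 0f) return false;

        // Take the far root: a viewer inside the cylinder hits the
        // inner surface at the larger t.
        float t = (-b + Mathf.Sqrt(disc)) / (2f * a);
        if (t < 0f) return false;

        Vector3 hit = origin + dir * t;

        // Angle around the axis maps to U; height along the axis to V.
        float angle = Mathf.Atan2(hit.x, hit.z) * Mathf.Rad2Deg;
        if (Mathf.Abs(angle) > arcAngleDeg * 0.5f) return false;
        if (Mathf.Abs(hit.y) > height * 0.5f) return false;

        uv = new Vector2(angle / arcAngleDeg + 0.5f, hit.y / height + 0.5f);
        return true;
    }
}
```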
X-posting from the Unity VR forums: https://forum.unity3d.com/threads/screenpointtoray-and-viewportpointtoray-not-working-with-vr.471440/

Environment:
- Unity 5.6
- VR is active and rendering to the HMD

My goal is to raycast from the mouse into the scene, with a VR camera. When using ScreenPointToRay on a VR camera, the ray's direction is way off. Here's an image from when my cursor was in the center. When using ViewportPointToRay, the ray's direction is much better, but still not perfect. Here's an image showing the difference (sorry for the mouse pic, my screen capturer wasn't picking up the mouse).

Here's the test code to reproduce in a clean Unity 5.6 project and scene:

```csharp
using UnityEngine;

public class Raycaster : MonoBehaviour
{
    public Camera sourceCamera;
    public bool useScreenPoint;
    public bool useViewport;
    public Transform viewportSphere;
    public float sphereDistance;

    private void Update()
    {
        Vector3 mousePos = Input.mousePosition;
        Ray ray = new Ray();

        if (useScreenPoint)
        {
            ray = sourceCamera.ScreenPointToRay(mousePos);
        }
        else if (useViewport)
        {
            float normalWidth = mousePos.x / sourceCamera.pixelWidth;
            float normalHeight = mousePos.y / sourceCamera.pixelHeight;
            Vector3 viewMousePos = new Vector3(normalWidth, normalHeight, 0);
            ray = sourceCamera.ViewportPointToRay(viewMousePos);

            if (viewportSphere != null)
            {
                viewportSphere.transform.position = ray.origin + ray.direction * sphereDistance;
            }
        }

        Debug.DrawRay(ray.origin, ray.direction, Color.red);
    }
}
```

In these screenshots I'm using a single camera set to both eyes. However, I have reproduced the same problem when I switch to separate left- and right-eye cameras (that's the setup in the actual project I'm working on). To visualize the use case: there is one person in VR, and one person not in VR but on the PC, using the mouse to click on things the VR person is looking at.
The simple solution to this would be to use another camera that displays just to the monitor; however, the project is very performance-restricted, and the new camera would be displaying exactly the same thing as the existing VR camera. Has anybody had to tackle this, or have any ideas on how to solve it without an extra camera?
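One workaround that avoids a second camera is to build the mouse ray manually against the mirror window, normalizing with Screen.width/height (the window) instead of the camera's pixelWidth/pixelHeight (which report the eye-texture size when VR is active). Newer Unity versions also offer a ScreenPointToRay overload taking Camera.MonoOrStereoscopicEye.Mono; for 5.6, a manual construction is along these lines. Treat the FOV/aspect handling as an approximation, since the mirror view may not exactly match the camera's stated field of view:

```csharp
using UnityEngine;

// Builds a mouse ray for the desktop mirror window by normalizing
// against Screen.* rather than Camera.pixel*, then unprojecting
// through the camera's field of view by hand.
public class MirrorMouseRaycaster : MonoBehaviour
{
    public Camera vrCamera;

    public Ray MouseRay()
    {
        Vector3 mouse = Input.mousePosition;

        // Normalize to [-1, 1] across the mirror window.
        float nx = (mouse.x / Screen.width) * 2f - 1f;
        float ny = (mouse.y / Screen.height) * 2f - 1f;

        float tanHalfFov = Mathf.Tan(vrCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        float aspect = (float)Screen.width / Screen.height;

        Transform t = vrCamera.transform;
        Vector3 dir = t.forward
                    + t.right * nx * tanHalfFov * aspect
                    + t.up * ny * tanHalfFov;

        return new Ray(t.position, dir.normalized);
    }

    private void Update()
    {
        Ray ray = MouseRay();
        Debug.DrawRay(ray.origin, ray.direction * 5f, Color.red);
    }
}
```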