Unity OVRInput.Get() caching for performance?
In Unity, I normally cache any GetComponent() result if I plan on using it within Update(). Is OVRInput.Get() similar? Everyone online seems to put code like OVRInput.Get(OVRInput.RawButton.X) directly in Update(), even inside their if statements. Is this the recommended method for performance? Is OVRManager handling everything, so that I just need to call OVRInput.Get() and not worry about it? And what if I use the same OVRInput.Get() call in multiple if statements in my script? Should I assign the result to a boolean at the beginning of Update() and use that, or would that only help readability without having any impact on performance? Thanks!
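A hedged sketch of an answer: OVRInput state is polled once per frame by OVRManager (which calls OVRInput.Update()), so each OVRInput.Get() is just a cheap read of already-cached state, not a scene query like GetComponent(). Caching the result in a local variable is mainly a readability choice. Assuming a standard OVRManager in the scene:

```csharp
using UnityEngine;

public class ButtonReader : MonoBehaviour
{
    void Update()
    {
        // OVRManager drives OVRInput.Update() once per frame, so every
        // Get() below is an inexpensive read of cached controller state.
        bool xPressed = OVRInput.Get(OVRInput.RawButton.X);

        if (xPressed)
        {
            // first use of the cached value
        }

        if (xPressed)
        {
            // reusing the local is a readability win, not a measurable perf one
        }
    }
}
```

Either style should perform essentially the same; the local bool just guarantees all the if statements see a consistent value within the frame.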
Meta Quest 2 Controllers Acceleration

Hello, I'm trying to access the linear and angular acceleration of the Meta Quest 2 controllers. Here's the code I'm using:

    OVRInput.GetLocalControllerAcceleration(OVRInput.Controller.RTouch);

It always returns a vector with all components set to 0, for both controllers. Velocity and position work well. I'm using Oculus Integration package v47.0 and Unity 2021.3.17f1.
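Since velocity is reported reliably, one workaround (a sketch, not an official fix) is to estimate acceleration yourself by finite-differencing successive velocity samples:

```csharp
using UnityEngine;

public class ControllerAccelEstimator : MonoBehaviour
{
    private Vector3 _lastVelocity;

    void Update()
    {
        // Velocity works, so approximate acceleration as the per-frame
        // change in velocity divided by the frame time.
        Vector3 v = OVRInput.GetLocalControllerVelocity(OVRInput.Controller.RTouch);
        Vector3 accel = (v - _lastVelocity) / Time.deltaTime;
        _lastVelocity = v;

        // accel is a noisy finite-difference estimate; consider smoothing
        // (e.g. a simple exponential moving average) before using it.
    }
}
```

This won't match the IMU-derived value exactly, but it gives a usable signal while the direct API returns zero.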
Which input system to use with Movement SDK

I'd like to experiment with the Movement SDK. The readme indicates Oculus Integration is a prerequisite, and that package uses the new input system (UnityEngine.InputSystem). The Movement SDK, however, uses the old input system (UnityEngine.Input). Which system should I use?
OVR Input and OVR Camera Rig

I have a Mac Studio with Unity 2021.3.2f1 and Oculus Integration installed. It is not possible to use the OVR Input prefab because the OVR Camera reference is greyed out, so there is no way to drag the OVR Camera Rig prefab into it as described in the documentation. Does anybody have a workaround for that?
OVR UI Helpers - Raycast LaserPointer and cursor not displayed in Quest build from Unity 2019.3.0f1

Hi all, really hoping someone might have experienced this same problem and found a fix, as it's really frustrating not to understand WHY this is happening.

I'm using Oculus Link and Unity to iteratively develop and test (using the play button in Unity, which automatically plays on the USB-connected Quest). I'm following the guidance for using UI Helpers on a canvas object and have had no problems interacting with all aspects of my UI using the Touch controllers and the included raycast and LineRenderer "Laser Pointer" script.

The problem I'm having is when I come to do a build. Everything builds fine, with no errors, but when I play it on the Quest, the laser pointer/line renderer not only doesn't show up, it also doesn't give me any interaction with the UI. Even stranger is the fact that when I plug my Quest back into the PC and start developing again in Unity, the problem then appears there too: none of the UI interaction works and the line renderer doesn't show up. The only way to get the functionality back is to exit Unity and reload the scene. Everything then goes back to normal... until I do a build again.

This may well be a Unity issue, and I'll be asking on their forums too, but I would appreciate any advice, or hearing whether anyone else has experienced these kinds of UI interaction problems with a Quest build.

Thanks in advance,
Andi
OVRInputModule primary index trigger is unreliable.

Hi. I am having an issue where the OVRInputModule is not working properly when I try to use the primary index trigger. I have UI elements in my game, including toggles and sliders. If I use the A button, I am able to interact with these widgets successfully. However, when I use the primary index trigger, it is hit or miss whether my input is registered. For the "Joy Pad Click Button" setting, I have "One" and "Primary Index Trigger" selected. I put some breakpoints in virtual protected PointerEventData.FramePressState GetGazeButtonState(), and the pressed and released variables do seem to toggle on and off properly when I use the A button or the trigger on the right controller. Any ideas?

Thanks,
John Lawrie
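One thing worth checking (a diagnostic sketch, not a confirmed fix): the index trigger is analog, and OVRInput exposes both a thresholded button view and the raw axis. If the thresholded press is registering inconsistently, comparing the two in a debug script can show whether the press threshold is the problem:

```csharp
// Read both the thresholded button event and the raw analog pull.
// The 0.7f threshold below is an arbitrary value for this sketch.
float pull = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);
bool pressedThisFrame = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);

if (pressedThisFrame || pull > 0.7f)
{
    // If this fires reliably while the UI click does not, the issue is
    // likely in how OVRInputModule converts the trigger into a click,
    // not in the trigger input itself.
}
```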
OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick) always returns (0,0)

Hi. I am developing a Unity game for the Oculus Quest. I have an issue where I always get a value of (0, 0) when I query the left thumbstick, while querying the right thumbstick returns real values. If isLeftController is set, I have never been able to get a value other than (0, 0); if it is not set, it correctly gives me the value for my right thumbstick.

    if (isLeftController) {
        axes = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);
    } else {
        axes = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
    }

Any idea why this might be?

Thanks,
John Lawrie
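A likely explanation, hedged: when both Touch controllers are active, PrimaryThumbstick maps to the *left* controller and SecondaryThumbstick to the *right*, so the two branches above may simply be swapped. Passing the controller mask explicitly sidesteps the Primary/Secondary mapping entirely:

```csharp
// With an explicit controller argument, PrimaryThumbstick means
// "this controller's thumbstick", so the mapping is unambiguous.
Vector2 axes = isLeftController
    ? OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.LTouch)
    : OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.RTouch);
```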
Using OVRInputModule with 2 controllers.

Hello. I am trying to figure out how to use the OVRInputModule in my game with two controllers. I want to have two laser beams, one coming out of each controller, very similar to how the UI works in the dashboard. However, I can only get one laser to move and be attached to its controller at a time. This is my hierarchy, and these are the details of my EventSystem. In the OVRInputModule, if the RayTransform and Cursor are set to the ConquestOVRLaserPointer that is a child of the RightControllerAnchor, the laser attached to the right controller moves with the right controller, but the left laser does not move. If, on the other hand, I set the RayTransform and Cursor to the ConquestOVRLaserPointer that is a child of the LeftControllerAnchor, then the laser attached to the left controller moves with the left controller and the right laser does not. Since there is only one RayTransform and Cursor slot, how do I get the lasers attached to both controllers to move at once?

Thanks for any help,
John Lawrie
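The stock OVRInputModule only has a single rayTransform slot, so a common workaround is to hand that slot to whichever controller was used last. A sketch, assuming the two laser pointer transforms are assigned in the inspector (the field names here are hypothetical):

```csharp
using UnityEngine;
using UnityEngine.EventSystems; // OVRInputModule lives in this namespace

public class LaserSwitcher : MonoBehaviour
{
    // Assign in the inspector: the module on the EventSystem and the
    // laser pointer transform under each controller anchor.
    public OVRInputModule inputModule;
    public Transform leftLaser;
    public Transform rightLaser;

    void Update()
    {
        // Give the single rayTransform slot to whichever controller
        // pulled its index trigger most recently.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch))
            inputModule.rayTransform = leftLaser;
        else if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            inputModule.rayTransform = rightLaser;
    }
}
```

This gives dashboard-like behavior where the "active" laser follows the last-used controller; truly simultaneous dual-ray UI interaction would require extending or duplicating the input module itself.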
(press and hold) oculus home button from within Unity Application?

Hello, I'm developing for Quest 2. I am wondering if there's a way to automatically press and hold the Oculus Home button (on the right-hand controller) from within the app. Rather than bringing up the Oculus Home menu, I need to do this to reset the position of the user in the virtual environment after they have wandered away from a central location (this game uses 6DOF and is played standing up). For some context: users are moved along 'tracks' within the game by gazing in certain directions. They should only be able to walk away from these tracks for very short distances, hence the need to reset position. In the documentation that describes the mapped controller inputs, it looks like the Oculus Home button is reserved. If it is not possible to access it within Unity, is there any other way to achieve this spatial position and orientation 'resetting' function? Wondering if I may need to resort to a custom function. Thanks in advance!
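The Home button itself is reserved by the system and cannot be pressed or intercepted from an app, but the underlying goal (recentering) can be triggered programmatically. A sketch using the Oculus Integration recenter call, with a hypothetical drift check standing in for your own track logic:

```csharp
using UnityEngine;

public class RecenterOnDrift : MonoBehaviour
{
    // Assign the CenterEyeAnchor of the OVRCameraRig in the inspector.
    public Transform centerEye;

    // Hypothetical threshold for this sketch: how far (in meters) the
    // user may wander from the rig origin before we re-origin tracking.
    public float maxDriftMeters = 1.5f;

    void Update()
    {
        Vector3 horizontal = centerEye.localPosition;
        horizontal.y = 0f; // ignore height; only lateral drift matters here

        if (horizontal.magnitude > maxDriftMeters)
        {
            // Re-origins position and yaw at the user's current pose,
            // same effect as the system-level recenter.
            OVRManager.display.RecenterPose();
        }
    }
}
```

Depending on your Oculus Integration version, the tracking origin settings on OVRManager (eye level vs. floor level) affect what RecenterPose resets, so it is worth testing with your rig configuration.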