Enable hand and controller tracking at the same time.
Hi, I have an Oculus Quest Pro and I'm working on a Unity project that needs hand tracking and controller tracking at the same time so I can track a physical object, but I can't enable both simultaneously. Is this possible? Or is there another way to track a physical object using Oculus hardware?
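On many SDK versions the runtime auto-switches between hands and controllers, so only one modality is tracked at a time; newer Meta XR SDK releases add simultaneous ("multimodal") hands-and-controllers support on Quest Pro, so it is worth checking the current SDK documentation. As a starting point, here is a minimal sketch that just observes which modality is currently active. The class and field names are my own; only `OVRInput.GetActiveController()` and `OVRHand.IsTracked` come from the Oculus integration.

```csharp
using UnityEngine;

// Illustrative helper: logs each frame whether Touch controllers or
// tracked hands are the active input modality.
public class InputModalityWatcher : MonoBehaviour
{
    // Assign the OVRHand components from the OVRCameraRig's hand anchors.
    public OVRHand leftHand;
    public OVRHand rightHand;

    void Update()
    {
        bool controllersActive =
            OVRInput.GetActiveController() == OVRInput.Controller.Touch;

        bool handsTracked =
            (leftHand != null && leftHand.IsTracked) ||
            (rightHand != null && rightHand.IsTracked);

        // With auto-switching, only one of these is normally true at a
        // time; truly simultaneous tracking requires the SDK's multimodal
        // feature where available.
        Debug.Log($"controllers: {controllersActive}, hands: {handsTracked}");
    }
}
```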
Unity App Build Help: lasers are not appearing in the built app

I'm currently working on a project for the Meta Quest 3 headset, using Unity to develop an application with an immersive UI main menu. The menu is navigated with the controllers via a laser pointer system for scene selection, which I scripted myself.

When I test the project directly on my PC with the headset connected, everything works as expected: the laser pointers emit from the controllers and scene selection is seamless. However, after building the project with the Android settings and installing it on the headset through Meta Quest Developer Hub, the lasers fail to appear, making scene selection impossible. This happens even though the app installs and launches successfully, suggesting a discrepancy between the Unity editor's behavior and the built app's behavior on the device.

My setup:
- Build settings are set to Android.
- I am using OVRCameraRig.
- The lasers are scripted on OVRControllerPrefab, which is a child of RightHandAnchor.

The difference between editor tests and on-device execution hints at a deeper issue, possibly in how the build process handles VR-specific input or rendering settings. I've attached images showing the current OVR setup and the recommended project settings fixes. I'd appreciate insights from anyone with Unity VR development experience for Meta Quest devices.
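A common cause of "works in editor, invisible on device" lasers is the line's material: a LineRenderer left on a default or editor-only material can be stripped or rendered magenta/invisible in an Android build, so it helps to assign an explicit material whose shader is included in the build. To isolate rendering from input, a stripped-down laser like the sketch below (my own illustrative script, not the poster's) can be parented under RightHandAnchor; if this line shows up on device, the original script's input or material setup is the likely culprit.

```csharp
using UnityEngine;

// Minimal laser-pointer sketch: draws a world-space line forward from
// this transform, shortened to the first physics hit.
[RequireComponent(typeof(LineRenderer))]
public class SimpleLaser : MonoBehaviour
{
    public float maxDistance = 10f;
    LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
        line.useWorldSpace = true;
        // Assign an explicit material in the Inspector so the shader is
        // guaranteed to be included in the Android build.
    }

    void Update()
    {
        Vector3 origin = transform.position;
        Vector3 dir = transform.forward;
        Vector3 end = origin + dir * maxDistance;

        if (Physics.Raycast(origin, dir, out RaycastHit hit, maxDistance))
            end = hit.point;

        line.SetPosition(0, origin);
        line.SetPosition(1, end);
    }
}
```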
Developer Distribution Agreement?

Hello, I have been developing a game with the Unity engine for the Oculus Rift S. When I use the OVR Platform tools I get an error message (SS1) telling me I have to agree to the developer distribution agreement, but I couldn't find any such agreement. Could you help me?

SS1
Unity 2019.4.1f1
OVRPlugin 1.52.1
Oculus doesn't stream to external monitor when built

Hi, I've been facing this problem for about a month now. I'm developing for the Oculus Rift S. The game renders to the monitor in the Unity editor, but in the built version the monitor display freezes on the last frame before entering VR mode. The game works normally in the HMD; it just isn't mirrored to the monitor. Please answer if you know a solution — I've tried many things but haven't found the fix, or even the cause, yet. Thank you.
OVRHaptics.Process() throwing exception when removing headset (v1.28.0 / SDK 1.39.0, Unity 2019.1)

Hello, I'm getting this exception in the console when removing a Rift S headset. It happens in the Unity editor, using Unity 2019.1.11f1, Oculus Utilities v1.28.0, OVRPlugin v1.28.0, SDK v1.39.0. We're not using Touch controllers; it's currently a passive, presentation-type app.

ArgumentException: start_index + length > array length
Parameter name: length
System.Runtime.InteropServices.Marshal.Copy (System.Byte[] source, System.Int32 startIndex, System.IntPtr destination, System.Int32 length) (at <7d97106330684add86d080ecf65bfe69>:0)
OVRHaptics+OVRHapticsOutput.Process () (at Assets/ThirdParty/Hardware/Oculus/VR/Scripts/OVRHaptics.cs:243)
OVRHaptics.Process () (at Assets/ThirdParty/Hardware/Oculus/VR/Scripts/OVRHaptics.cs:369)
OVRManager.LateUpdate () (at Assets/ThirdParty/Hardware/Oculus/VR/Scripts/OVRManager.cs:1458)

Screengrab of the camera rig prefab using the OVRManager component; it's a pretty standard implementation:
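Since the exception comes from OVRHaptics marshalling its sample buffer during OVRManager.LateUpdate when the headset is removed, one hedged workaround is to clear the haptics channels on unmount so there is nothing left mid-buffer. The HMDMounted/HMDUnmounted events and the channel Clear() call come from the Oculus Utilities; the guard pattern itself is my suggestion, not an official fix.

```csharp
using UnityEngine;

// Sketch: flush pending haptics samples when the headset is removed,
// so OVRHaptics.Process() has nothing stale to marshal.
public class HapticsGuard : MonoBehaviour
{
    // Other scripts can consult this before queuing haptics clips.
    public static bool HapticsAllowed { get; private set; } = true;

    void OnEnable()
    {
        OVRManager.HMDMounted += OnMounted;
        OVRManager.HMDUnmounted += OnUnmounted;
    }

    void OnDisable()
    {
        OVRManager.HMDMounted -= OnMounted;
        OVRManager.HMDUnmounted -= OnUnmounted;
    }

    void OnMounted()
    {
        HapticsAllowed = true;
    }

    void OnUnmounted()
    {
        HapticsAllowed = false;
        // Drop any queued samples on both controller channels.
        OVRHaptics.LeftChannel.Clear();
        OVRHaptics.RightChannel.Clear();
    }
}
```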
Object grabbed event

Hello everyone, I'm making my first experiments with the Oculus Rift S and Unity. At the moment I'm working on a little sandbox to get familiar with how everything works. I'm using Unity XR Management and the XR Interaction Toolkit.

I would like to trigger some methods when a player has grabbed an object. I tried using the OnSelectEnter event of the XR Grab Interactable, but this event is triggered too soon: I need to know when the object is actually in the player's hand. Does anyone have a hint for the best way to do this?

Best, Verdemis
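One approach to "actually in hand" is to listen for the grab and then wait until the object has physically reached the interactor's attach point before firing game logic, since XRGrabInteractable moves the object toward the attach transform over a few frames. The sketch below uses the XRI 2.x event names (`selectEntered`, `SelectEnterEventArgs`); older toolkit versions used `onSelectEnter` instead, so adapt accordingly. The threshold value and class name are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: fire a callback once a grabbed object has settled at the
// interactor's attach point, rather than at the moment of selection.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabSettledNotifier : MonoBehaviour
{
    public float settleDistance = 0.02f; // metres; tune to taste

    XRGrabInteractable grab;
    Transform attachPoint;
    bool settled;

    void Awake()
    {
        grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(OnSelectEntered);
        grab.selectExited.AddListener(_ => { attachPoint = null; settled = false; });
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        attachPoint = args.interactorObject.GetAttachTransform(grab);
        settled = false;
    }

    void Update()
    {
        if (attachPoint == null || settled) return;

        if (Vector3.Distance(transform.position, attachPoint.position) < settleDistance)
        {
            settled = true;
            // The object is now effectively in the player's hand.
            Debug.Log("Object grabbed and settled in hand");
        }
    }
}
```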