(Unity) OVRPlayerController: How to get OVRPlayerController to move with OVRCameraRig
I'm working off the standard OVRPlayerController, which has an OVRCameraRig as its child. This is a game where I need thumbstick locomotion (which the OVRPlayerController provides), but I also need roomscale movement. In other words, when the player moves physically, their in-game avatar should also move. Currently, when the player moves physically, the OVRCameraRig moves with them, but the parent OVRPlayerController does not. This is an issue because I need the OVRPlayerController to move with the player at all times for proper collision tracking and targeting by hostile AI. What is the best way to achieve this? I've tried a few approaches, but I'm wondering what the cleanest solution is. I'll also need hand tracking for this game. Perhaps I should simply use the Avatar SDK standard avatar and make it a child of a Character Controller for thumbstick movement? Thanks for the help!
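A common pattern for this (a sketch under assumptions, not official OVRPlayerController behaviour: I'm assuming the player object carries the CharacterController, and that the rig's CenterEyeAnchor is assigned in the Inspector) is to slide the capsule under the HMD each frame and then counter-translate the tracking space, so physical movement drives the collider without the camera double-moving:

```csharp
using UnityEngine;

// Sketch: keep the CharacterController capsule under the HMD each frame.
// Attach to the OVRPlayerController object. "centerEye" is assumed to be
// OVRCameraRig/TrackingSpace/CenterEyeAnchor, assigned in the Inspector.
[RequireComponent(typeof(CharacterController))]
public class RoomscaleFollower : MonoBehaviour
{
    public Transform centerEye;
    private CharacterController _controller;

    void Awake() => _controller = GetComponent<CharacterController>();

    void LateUpdate()
    {
        // Horizontal offset between the capsule and the HMD.
        Vector3 delta = centerEye.position - transform.position;
        delta.y = 0f;

        // Move the capsule under the player; CharacterController.Move
        // respects collisions, so record how far it actually moved.
        Vector3 before = transform.position;
        _controller.Move(delta);
        Vector3 moved = transform.position - before;

        // Pull the TrackingSpace (the center eye's parent) back by the
        // same amount so the camera doesn't shift when its parent does.
        centerEye.parent.position -= moved;
    }
}
```

Thumbstick locomotion is unaffected by this, since it moves capsule and camera together and so never produces a horizontal offset between them; only physical (roomscale) movement creates a delta for the script to consume.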
OVRManager.display returning null
Problem: I'm trying to set my Quest 2 headset to 120 Hz, but calling OVRManager.display returns null. Details: I imported the Oculus Integration plugin and wrote the code below, but when I build and deploy to the headset I get a NullReferenceException because OVRManager.display returns null. Additional info: Unity 2021.3.12f1, building for the Quest 2, and I have Quest support disabled. Oculus XR Plugin v3.0.2, XR Plugin Management v4.2.1, Oculus Integration v49.0, BNG VR Interaction Framework 1.82. Thanks in advance for any help!
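OVRManager.display is only populated once OVRManager itself has initialized, so calling it from your own Awake/Start can race that setup and see null. A hedged sketch of deferring the refresh-rate change until the display interface exists (property names are from my reading of Oculus Integration around v49; verify against your local copy):

```csharp
using System.Collections;
using System.Linq;
using UnityEngine;

// Sketch: wait for OVRManager to finish initializing before touching
// OVRManager.display, then request 120 Hz only if the device offers it.
// Assumes an OVRManager component exists somewhere in the scene.
public class SetRefreshRate : MonoBehaviour
{
    IEnumerator Start()
    {
        // Defer until the display interface is actually available.
        yield return new WaitUntil(() => OVRManager.display != null);

        float[] rates = OVRManager.display.displayFrequenciesAvailable;
        if (rates != null && rates.Contains(120f))
            OVRManager.display.displayFrequency = 120f;
        else
            Debug.LogWarning("120 Hz not offered; check the headset's Experimental settings.");
    }
}
```

If the list never contains 120, the 120 Hz option likely isn't enabled on the headset itself rather than in Unity.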
Where has the Oculus/OVR Build menu gone?
I am working in Unity 2022.3.11f1 and want to connect my Quest 3, but the "Oculus/OVR Build/OVR Build APK And Run" option has disappeared from the Unity menu bar. How do I connect my Unity project to the Quest 3, please?
Menu Canvas Not Responsive
Details: Unity Version: ; Set-Up: Meta Integration: Meta XR All-in-One SDK; using an Oculus Quest 2.
Hello! I am not new to VR development, but for some reason I am struggling to build this new app I am currently working on. I have three issues.
1. I am trying to add a menu/intro scene to my app, but the Canvas I created is not responsive. The UIHelpers register the hit target, but the buttons do not work. E.g., a button that should take you to the start scene does nothing. I have checked and re-checked the script and all the steps of the tutorial that I used in past projects, but I cannot understand why it isn't working.
2. The menu scene does not show when the build loads; I am automatically redirected to the main scene. It only works when I remove the main scene from the list of scenes to load. Any suggestions for this issue?
3. The grabbable props I am using are not showing in the scene.
I am following the official Meta guidelines to build this app. Although it should be simpler to create experiences using the Meta XR All-in-One integration, I am finding it extremely frustrating, as this is the second time I have had to start from scratch. Should I follow a more traditional approach and use the Unity VR tutorials, or can anyone advise me on how to create a VR app without encountering so many issues? Thank you.
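On issue 2: Unity always boots into the scene at index 0 of the Build Settings list, so the menu scene must sit first and the main scene must be loaded explicitly from the menu. A minimal sketch of the button handler ("MainScene" is a placeholder for whatever your main scene is actually called):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch for issue 2: put the menu scene at index 0 in
// File > Build Settings, then load the main scene from the button.
// "MainScene" is a placeholder name; both scenes must be in the build list.
public class StartButton : MonoBehaviour
{
    // Wire this method to the Button's OnClick event in the Inspector.
    public void OnStartPressed()
    {
        SceneManager.LoadScene("MainScene");
    }
}
```

On issue 1, one thing worth checking (I'm inferring your setup from the mention of UIHelpers): for the laser pointer to actually press Button components, the Canvas typically needs an OVRRaycaster in place of the default GraphicRaycaster, and the EventSystem needs the matching VR input module.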
Seeking Guidance on XR interaction SDK sample scenes
Hello. I'm trying to figure out interactions with the Meta XR All-in-One SDK. I can pick up a cube (woo!), but I cannot grab a sphere at a distance. I was following what I could find on the Interaction SDK and recently downloaded the XR-Interaction-sdk-ovr-examples package, which I took at face value when it stated, "Contains sample scenes, prefabs, and art assets for Interaction SDK, using OVR variants of the player rig." I cannot find any scenes in the package that I can successfully import into my Mac project. Can anyone steer me right? Are scenes just not included in this release? The App Lab project works great, and I want to decompose the scenes used there to make sense of it for my project, which is suffering from a lack of clear documentation. Thanks for any help on this.
Using OVRHand for remote hands in editor: issue in OVRSkeleton and OVRMesh
Hi! I'm using OVRHand (together with OVRSkeleton, OVRMesh and OVRMeshRenderer) to display finger tracking locally, and then to display the same for remote users in a multiplayer app. It works well (for remote users, I made an equivalent of OVRHand that handles serialized data of the local hand), but I have one small issue before it is fully clean. Currently, for remote hands to appear, the local user's hands have to appear once. This is due to a check in the initialization code of OVRSkeleton and OVRMesh, in their ShouldInitialize method, that only occurs in the editor: in the editor, in addition to all the normal checks, the script verifies that OVRInput.IsControllerConnected(OVRInput.Controller.Hands) is true before initializing, which blocks hand rendering for remote users when the local user is not using their hands. In a built app it works, and I can easily fix this locally by editing the editor-only check. But once I get the package through UPM, it won't be editable so easily. So I'm wondering what the proper feature-request process would be to ask for a virtual protected version of ShouldInitialize in OVRSkeleton and OVRMesh, or any better option, so that this point can be customized. Thanks!
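For concreteness, the requested change could look something like the subclass below. This is hypothetical: today OVRSkeleton does not expose ShouldInitialize as protected virtual, which is exactly what the feature request asks for, so the same effect currently requires a local source edit.

```csharp
using UnityEngine;

// Hypothetical sketch of the requested API: if OVRSkeleton exposed
// ShouldInitialize as "protected virtual", a remote-hand variant could
// relax the editor-only controller check without forking the package.
public class RemoteOVRSkeleton : OVRSkeleton
{
    protected override bool ShouldInitialize()
    {
#if UNITY_EDITOR
        // Skip the OVRInput.IsControllerConnected(OVRInput.Controller.Hands)
        // check: remote hands are driven by serialized network data, not
        // by the local tracking runtime.
        return true;
#else
        return base.ShouldInitialize();
#endif
    }
}
```
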
[Unity] Virtual Keyboard feature not working
Greetings. Issue: my team and I attempted to enable the Oculus overlay/virtual keyboard, but failed. Fix attempts, in chronological order:
1. Trigger the VK via an Input Field in the old project's game scene
2. Trigger the VK via code in the old project's game scene
3. Trigger the VK via Input Fields/Buttons in the Oculus VirtualKeyboardSample scene in the old project
4. Trigger the VK via code in the Oculus VirtualKeyboardSample scene in the old project
5. Trigger the VK via the previous methods after updating Oculus Integration/OVR
6. Trigger the VK via all previous methods in the same project, but migrated to the new repo
7. Created a blank Unity project, imported/configured the VR-related packages and Oculus Integration/OVR, and tried to trigger the VK via the Oculus VirtualKeyboardSample scene
Most of the attempts resulted in this error log: [OVRPlugin] [CreateVirtualKeyboard] m_XR_META_virtual_keyboard extension is not available (arvr\prjects\integrations\OVRPlugin\Src\Util\CompositorOpenXR.cpp:12543) Create failed: 'Failure_InvalidOperation'. Check for Virtual Keyboard Support.
Versions: Unity 2021.3.26f1 LTS, Oculus XR Plugin 3.3.0, Oculus Integration 54 and 55, OVRPlugin 1.86.1.
Important note: we ran the sample on the same branch/commit, but on different devices/users. 2 out of 6 developers got the keyboard working as intended; the others received the error log above. This seems very strange to me. After all these attempts, it looks like the problem is in the OVRPlugin .dll, not on our side. Any help/suggestions/tips/solutions would be highly appreciated. Expected result: we can access the Oculus overlay/virtual keyboard after interacting with text UI elements in VR to edit text.
OVRCameraRig use per eye cameras changes scale
I'm working on a 3D image/video solution, so I'm using the 'Use Per Eye Cameras' option on the OVRCameraRig in Unity. I noticed that if I toggle this setting on and off, the scale of the world is radically affected. With the per-eye camera setting off, the eye separation appears correct, but when I switch it on, it seems to jump. I don't seem to have any control over this, and it is completely breaking my project. I'm using Unity 2019.3.2f1 with OVR 1.42.0. The issue occurs when using Oculus Link on the Quest from the editor, and also in a Quest build.
OVRSpatialAnchor not appearing to save at/load from the right place in world space
I'm trying to implement a locally-saved spatial anchor system, but every time I try to load a saved anchor and instantiate an anchor prefab at the previously saved location, it ends up loading at some point close to (but never quite at) (0,0,0) instead of wherever I actually placed the anchor. Would love some help figuring this out, because I have no idea where I went wrong 😞

Save code:

```csharp
// Save the anchor locally.
anchor.Save((anchor, success) =>
{
    if (!success)
    {
        return;
    }

    // Persist the anchor's UUID in PlayerPrefs.
    PlayerPrefs.SetString("main_uuid", anchor.Uuid.ToString());
});
```

Load code:

```csharp
var main_uuid = new Guid(PlayerPrefs.GetString("main_uuid"));
var uuids = new Guid[1];
uuids[0] = main_uuid;
Load(new OVRSpatialAnchor.LoadOptions
{
    Timeout = 0,
    StorageLocation = OVRSpace.StorageLocation.Local,
    Uuids = uuids
});

private void Load(OVRSpatialAnchor.LoadOptions options) =>
    OVRSpatialAnchor.LoadUnboundAnchors(options, anchors =>
    {
        if (anchors == null)
        {
            return;
        }

        foreach (var anchor in anchors)
        {
            if (anchor.Localized)
            {
                _onLoadAnchor(anchor, true);
            }
            else if (!anchor.Localizing)
            {
                anchor.Localize(_onLoadAnchor);
            }
        }
    });

private void OnLocalized(OVRSpatialAnchor.UnboundAnchor unboundAnchor, bool success)
{
    if (!success)
    {
        return;
    }

    var pose = unboundAnchor.Pose;
    var spatialAnchor = Instantiate(anchor_prefab, pose.position, pose.rotation);
    unboundAnchor.BindTo(spatialAnchor);
}
```
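One thing worth ruling out (an assumption on my part, not a confirmed fix): a pose near the origin can mean the pose was read before the runtime had settled its tracking early in the session. A diagnostic sketch that defers the instantiate until the HMD is tracking and logs the localized pose, so you can tell whether the anchor itself resolves near (0,0,0) or the prefab is being moved after instantiation:

```csharp
using System.Collections;
using UnityEngine;

// Diagnostic sketch (assumptions, not a confirmed fix): defer binding
// until the HMD is present/tracking, and log the localized pose to see
// whether the anchor truly resolves near the origin.
public class AnchorLoaderDebug : MonoBehaviour
{
    public GameObject anchor_prefab;

    private void OnLocalized(OVRSpatialAnchor.UnboundAnchor unboundAnchor, bool success)
    {
        if (!success) return;
        StartCoroutine(BindWhenTracking(unboundAnchor));
    }

    private IEnumerator BindWhenTracking(OVRSpatialAnchor.UnboundAnchor unboundAnchor)
    {
        // Early in a session the tracking origin may not be settled yet.
        yield return new WaitUntil(() => OVRManager.isHmdPresent);

        var pose = unboundAnchor.Pose;
        Debug.Log($"Anchor {unboundAnchor.Uuid} localized at {pose.position}");

        var spatialAnchor = Instantiate(anchor_prefab, pose.position, pose.rotation);
        unboundAnchor.BindTo(spatialAnchor);
    }
}
```

If the log already shows a near-origin position at localization time, the problem is in the anchor save/localize path rather than in anything that happens to the prefab afterwards.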