XR Tracked Keyboard not being tracked in Unity Scene
Hi all, I was wondering if anyone has come across similar issues when trying to integrate a tracked keyboard into their scenes. I am trying to connect a Logitech MX Keys Mini keyboard and have it appear in my scene. I am using a Meta Quest Pro and Unity 2022.3.19f1, with the Meta XR All-in-One SDK installed, and I have followed the steps outlined here: Integrate OVRTrackedKeyboard Prefab: Unity | Oculus Developers.

I have run into a few issues, and I understand from Meta that passthrough hands are unavailable during debugging. My keyboard does not appear at all, except sometimes as a black rectangle that cannot be moved, and I keep getting the following warnings:

"local dimming feature is not supported"
"d3d11: Creating a default shader resource view with dxgi-fmt=98 for a texture that uses dxgi-fmt=99"
"[OVRPlugin] XR_PASSTHROUGH_LAYER_PURPOSE_TRACKED_KEYBOARD_HANDS_FB is not yet supported on windows (arvr\projects\integrations\OVRPlugin\Src\Util\InsightMrManager.cpp:959)"
"[OVRPlugin] Failed to create a passthrough layer with error -1 (arvr\projects\integrations\OVRPlugin\Src\Util\InsightMrManager.cpp:220)"
"Invalid state passed into TrackedKeyboardVisibilityChanged ErrorExtensionFailed UnityEngine.Debug:LogWarning (object) OVRTrackedKeyboardHands:TrackedKeyboardVisibilityChanged (OVRTrackedKeyboard/TrackedKeyboardVisibilityChangedEvent) (at ./Library/PackageCache/com.meta.xr.sdk.core@62.0.0/Scripts/OVRTrackedKeyboard/OVRTrackedKeyboardHands.cs:528)"
"d3d11: failed to create 2D texture shader resource view id=1930 [D3D error was 80070057]"

I have redone the whole setup in case I missed a step. I have also tried not having passthrough enabled, but that throws other errors; I had to add a surface-projected passthrough plane to stop the passthrough object from throwing errors.
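The OVRPlugin warnings above say the tracked-keyboard passthrough layer cannot be created on Windows, i.e. when running over Link, so one way to keep the console clean during desktop debugging is to only activate the keyboard object in on-device builds. A minimal sketch, assuming the scene contains the OVRTrackedKeyboard prefab from the Meta XR Core SDK and that hiding it over Link is acceptable (the class name and this guard approach are my own, not from Meta's docs):

```csharp
using UnityEngine;

// Sketch: the passthrough layer for tracked-keyboard hands fails with
// error -1 on Windows (editor / Link), so only enable the keyboard
// object on the device itself. The guard approach is an assumption.
public class TrackedKeyboardPlatformGuard : MonoBehaviour
{
    // Assign the OVRTrackedKeyboard prefab instance in the Inspector.
    [SerializeField] private OVRTrackedKeyboard trackedKeyboard;

    private void Awake()
    {
        // On-device Quest builds run on Android; anything else is the
        // editor or Link, where the passthrough layer creation fails.
        bool onDevice = Application.platform == RuntimePlatform.Android;
        if (!onDevice && trackedKeyboard != null)
        {
            trackedKeyboard.gameObject.SetActive(false);
        }
    }
}
```

This at least separates "broken on device" from "expected to fail over Link" while debugging the rest of the scene.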
I followed this tutorial for that: Surface Projected Passthrough Tutorial: Unity | Oculus Developers. I am finding it hard to troubleshoot this issue because every time I add print statements to the code, Unity refreshes and re-downloads Meta's original code, so I am unable to modify it. I was wondering if anyone has had similar issues or any advice on how to tackle this. I have tried to publish my app to see whether the issue is simply fixed by deploying it, but approval can take up to a week. So in the meantime, if anyone has any ideas on how to go about this, I will be very grateful. I am willing to try anything at this point :)

Performance Issue - VR Unity
Hi all, I am trying to finish my final year project, built in VR in Unity, and one of my scenes does not load when clicking the Start button in the menu. This scene is loaded with many different assets and is, in fact, the only one that will not open; it makes my app crash. I have tried removing a few assets, but unfortunately I am still experiencing the same issue. Any thoughts or suggestions? I really can't remove more assets from the scene, as it would no longer match the design I wanted to recreate. metasupportcase MetaStoreHelp https://youtu.be/HKHHjwHlRns

Menu Canvas Not Responsive
Details
Unity Version:
Set-Up:
Meta Integration: Meta XR All-in-One SDK
Using Oculus Quest 2

Hello! I am not new to VR development, but for some reason I am struggling to build this new app I am currently working on. I have three issues.

1. I am trying to add a menu/intro scene to my app, but the Canvas I have created is not responsive. The UIHelpers are working on hit-target, but the buttons are not. E.g., a button that should bring you to the start scene does not work. I have checked and re-checked the script and all the steps of the tutorial that I used in the past for other projects, but I cannot understand why it isn't working.

2. The menu scene does not show when loading the build; I am automatically redirected to the main scene. It only works when I remove the main scene from the list of scenes to load. Any suggestions for this issue?

3. The grabbable props I am using are not showing in the scene.

I am following the official Meta guidelines to build this app. Although it should be simpler to create experiences using the Meta XR All-in-One integration, I am finding it extremely frustrating, as this is the second time I have had to start from scratch. Should I follow a more traditional approach and use the Unity VR tutorials, or can anyone advise me on how to create a VR app without encountering so many issues? Thank you. MetaStoreHelp

OpenXR and Meta XR SDK at the same time
When I install the OpenXR Plugin alongside the Meta XR All-in-One SDK, I get a warning in Project Validation telling me to use the Oculus XR Plug-in instead of OpenXR. What is the recommended/correct approach to achieving cross-platform compatibility with OpenXR while using the Meta XR SDKs in that case? Or should I ignore this warning?

Quest Only Renders Right Eye
Context: I recently upgraded to Unity 2020.3.30f1. I am using the Oculus XR Plugin (1.11.2) and XR Plugin Management (4.2.1), with the OpenGLES3 graphics API, Multi Pass stereo rendering mode, and the default render pipeline.

Issue: My build appears to render the right eye correctly. However, the left eye shows a single color sampled from somewhere on the display; as I turn my head I can see various colors from the scene in my left eye, as if my left eye were magnified down onto a single pixel. This even occurs during Unity's initial loading splash screen, before the scene's OVR camera is enabled, and also in my custom splash scene before the main scene loads. If I walk to the edge of my chaperone guardian, I can see the boundaries with both eyes. If I open the Oculus menu, I see it with both eyes. There are no errors in the editor and no errors in the build.

I've removed Post Processing V2 from my Package Manager, along with anything dependent on it. I have tried changing to Multiview (Single Pass) stereo rendering, but then nothing renders in either eye. I have also tried enabling "Use Per Eye Cameras". No dice. I noticed this was not an issue when I switched to using OpenXR; however, OpenXR does not support hand tracking to my knowledge, which is a critical necessity for my project. If anyone has had issues similar to this, I'm open to suggestions.

FPS is gradually decreasing on Quest 2
I'm working on a multiplayer VR app for Quest 2. There is a starting scene and a common-room scene where all players join. Movement around the stage is carried out using teleportation, and the player is represented by an XR Origin component with XR Controller components. I noticed that after some time in the application, the FPS drops: after about 10-15 minutes it falls from the original 72 to 45-50 and is no longer restored. Even going back to the starting scene does not help; only restarting the application returns the FPS to the original 72. Running the app in host mode doesn't help either. The profiler shows that the resource consumption of the EarlyUpdate.XRUpdate method keeps increasing, but I cannot understand what this is connected with. I would really appreciate your help in solving this problem!

Unity version: 2021.2.6f1
Oculus XR Plugin: 1.11.3
XR Plugin Management: 4.2.1
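Since the profiler points at EarlyUpdate.XRUpdate growing over time, it may help to first pin down exactly when the degradation starts and whether it correlates with joins, teleports, or scene loads. A minimal sketch of an interval frame-time logger using only core Unity APIs (the class name and interval are my own; this only narrows down the timing, it does not identify the XRUpdate cost itself):

```csharp
using UnityEngine;

// Sketch: log a rolling average frame time once per interval so the
// gradual slowdown can be lined up against in-game events in the log.
public class FrameTimeLogger : MonoBehaviour
{
    [SerializeField] private float logIntervalSeconds = 30f;

    private int frames;
    private float elapsed;

    private void Update()
    {
        frames++;
        elapsed += Time.unscaledDeltaTime;
        if (elapsed >= logIntervalSeconds)
        {
            float avgMs = (elapsed / frames) * 1000f;
            Debug.Log($"[FrameTimeLogger] avg frame time {avgMs:F2} ms " +
                      $"({frames / elapsed:F1} fps) over last {elapsed:F0} s");
            frames = 0;
            elapsed = 0f;
        }
    }
}
```

Attach it to a persistent object (e.g. one marked DontDestroyOnLoad) and read the output on device with `adb logcat -s Unity`; if the average climbs in step with a specific recurring event, that is a good candidate for whatever is accumulating inside the XR update.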