Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick
Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table (a Canvas in World Space, with its RectTransform placed at the table center plus a slight lift to sit on the surface, all via script).

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" ~0.5 m up, and pushing again makes it jump back down. This happened both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547), while the dialog's world Y stayed constant.
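The logging described above can be sketched as a small diagnostic component (the class name and the `dialog` reference are illustrative, not from the original post): printing both Y values each frame makes it obvious whether the camera rig or the dialog is the thing that moves.

```csharp
using UnityEngine;

// Diagnostic helper: logs the camera's Y and the dialog's world Y every
// frame, so a visual "jump" can be attributed to the rig or to the UI.
public class HeightJumpLogger : MonoBehaviour
{
    public RectTransform dialog; // assumed reference to the world-space dialog

    void LateUpdate()
    {
        float camY = Camera.main.transform.position.y;
        float dialogY = dialog.position.y;
        Debug.Log($"camera.y={camY:F3}  dialog.y={dialogY:F3}");
    }
}
```

If the camera Y toggles while the dialog Y stays constant, as in the post, the thing to investigate is the rig / tracking-origin behavior bound to the thumbstick, not the placement script.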
Render Camera Switching Forcefully in Screen Space - Camera with Pointable Canvas

When using Pointable Canvas with a Canvas set to Screen Space - Camera, the Render Camera is forcibly switched to the Event System camera instead of the intended camera (CenterEyeAnchor). The issue only appears during Play Mode and prevents the UI from rendering properly. Despite setting the correct camera, the Pointable Canvas overrides the Render Camera setting. I don't want to switch to World Space because I need the UI to always render in front of the player. How can I fix this issue while keeping the Screen Space - Camera setup? (^^ This is the camera that it automatically generates.)
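One brute-force workaround when another component keeps overwriting the Render Camera is to reassign it after that component has run. A sketch, assuming `centerEye` is the CenterEyeAnchor camera; whether this interferes with Pointable Canvas raycasting is untested:

```csharp
using UnityEngine;

// Re-applies the intended Render Camera every frame in LateUpdate,
// after other components that may have overwritten it have run.
[RequireComponent(typeof(Canvas))]
public class ForceRenderCamera : MonoBehaviour
{
    public Camera centerEye; // assumed: the CenterEyeAnchor camera

    Canvas canvas;

    void Awake() => canvas = GetComponent<Canvas>();

    void LateUpdate()
    {
        // Canvas.worldCamera is the "Render Camera" in Screen Space - Camera mode.
        if (canvas.worldCamera != centerEye)
            canvas.worldCamera = centerEye;
    }
}
```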
OVRPlugin.GetSystemHeadsetType() doesn't return Oculus_Quest_2

I added the OVRController prefab to my project, but on Quest 2 it always shows the controllers for the original Quest. This is because OVRPlugin.GetSystemHeadsetType() always returns Oculus_Quest even when I'm running on an Oculus Quest 2. I'm working with the latest version of the Oculus Integration. Does anyone have the same problem?
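A quick way to confirm what the runtime actually reports is to log the value at startup (a diagnostic sketch; `OVRPlugin.SystemHeadset` is the enum returned by `GetSystemHeadsetType()` in the Oculus Integration):

```csharp
using UnityEngine;

// Logs the headset type reported by OVRPlugin so it can be compared
// against the device the app is actually running on.
public class HeadsetTypeLogger : MonoBehaviour
{
    void Start()
    {
        OVRPlugin.SystemHeadset headset = OVRPlugin.GetSystemHeadsetType();
        Debug.Log($"OVRPlugin reports headset: {headset}");
        // On a Quest 2 this is expected to log Oculus_Quest_2;
        // the post reports it logs Oculus_Quest instead.
    }
}
```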
Video Players bugged on Oculus Quest in Unity

Hi! In my game I have a few quads with the Video Player component on them. They all play different videos at the same time, but when I start the game on the Oculus Quest it is as if every player is playing every clip at once: the quads flicker between all the clips. They work fine in Play Mode in the editor and in a Windows build. Has anyone else had this problem before or knows how to fix it? Here is a video of it: https://youtu.be/qYay41Kqxlc
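The flicker looks like the players are sharing a render target on the Quest. One thing worth trying (a sketch, not a confirmed fix, and it assumes the quads currently use a different render mode) is to give each VideoPlayer its own RenderTexture so the outputs cannot collide:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Gives each VideoPlayer a dedicated RenderTexture so multiple players
// cannot end up writing into the same target.
[RequireComponent(typeof(VideoPlayer), typeof(Renderer))]
public class IsolatedVideoTarget : MonoBehaviour
{
    void Awake()
    {
        var player = GetComponent<VideoPlayer>();
        var rt = new RenderTexture(1280, 720, 0); // size is illustrative

        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = rt;
        GetComponent<Renderer>().material.mainTexture = rt;
    }
}
```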
Hi everyone,

I'm trying to take advantage of the improved image clarity that OVROverlay provides compared to standard Unity UI to render interfaces. I have a scene with multiple tablets that the user can grab and interact with, and since these can be moved arbitrarily using grab manipulation mechanics, I need to make sure that they a) intersect correctly with the scenario and b) intersect correctly with each other, or at least can have their sorting order changed at runtime depending on which one is in front.

I already managed to make it work by using OVROverlayCanvas and having OVROverlays in underlay mode, but I cannot get the rendering to meet all conditions:

If I set overlay.noDepthBufferTesting to false, I get correct intersections between the two tablets, but they always render on top of the scenario; there is no depth interaction between the overlays and the 3D.

If I set overlay.noDepthBufferTesting to true, which the OVROverlayCanvas component does by default, I get nice intersection between the tablets and the scenario, even correct transparencies. The problem is that the tablets don't render correctly when they overlap, because one always gets rendered before the other.

Using compositionDepth doesn't fix anything because, for some reason, the internal call OVRPlugin.EnqueueSetupLayer() only seems to take effect the first time the depth is assigned, not on subsequent calls. I added code that calls OVRPlugin.EnqueueSetupLayer() to change the depth, but the render order remains the same. Ultimately this call goes to OVRP_1_28_0.ovrp_EnqueueSetupLayer2(), which returns Result.Success. Is this a bug?

As a last resort I tried to destroy and re-create the layer with the new depth value, and this seems to work, but it's not something I want to add to production code :)

So basically my question is whether there is a way to make overlays intersect with the scenario and at the same time intersect correctly with each other.
Kind regards, Enrique, VRMADA.
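The destroy-and-recreate workaround mentioned above can be sketched by toggling the OVROverlay component, letting its disable/enable lifecycle tear down and rebuild the layer with the new depth (a sketch under those assumptions; the coroutine-based one-frame delay is illustrative):

```csharp
using System.Collections;
using UnityEngine;

// Workaround sketch: change an overlay's composition depth by tearing the
// layer down and letting OVROverlay recreate it with the new value.
public class OverlayDepthChanger : MonoBehaviour
{
    public OVROverlay overlay; // assumed reference to one tablet's overlay

    public void SetDepth(int newDepth)
    {
        StartCoroutine(Recreate(newDepth));
    }

    IEnumerator Recreate(int newDepth)
    {
        overlay.enabled = false;           // disabling destroys the layer
        overlay.compositionDepth = newDepth;
        yield return null;                 // wait one frame
        overlay.enabled = true;            // re-enabling recreates it
    }
}
```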
Avatar Hands not showing in Built Game

Hello guys, first of all, I am using Unity 2020.1.13f1 with the Oculus XR Plugin version 1.5.0. I'm having issues with the Local Avatar hands. When starting the game in the Unity editor everything is fine, but in the built version of the game the hand meshes don't show up, even though I can still use them normally. I have looked the problem up and found similar threads where the hands don't show up at all, but the fact that in my project the problem only occurs in the built version really confuses me. In my scene I am using the OVRCameraRig from Oculus Integration with a LocalAvatar as a child of the TrackingSpace. If anyone has faced the same problem or knows something about it, I would appreciate any advice. Obviously I am available for further information. Thanks!
HDRP Unity only renders one eye

Hi guys, I am having a problem that has been bothering me for weeks. I have been using Unity 2019.4 HDRP to build a scene for my research, running on an Oculus Quest via Link. However, every time I build the game, it ends up rendering only the left eye while the right eye is totally black. I have tried the built-in XR plugin and XR Plugin Management, removed the post-processing package, and tried both multi-pass and single-pass instanced mode. None of them works. Just wondering if anyone has the same problem or ideas to fix it? Thank you so much.
MySql Connection Works in Unity but not from my Quest

I have a Quest headset and I'm developing an application that needs to use MySQL (I know this isn't the preferred SQL database for many reasons, but it is my only option). In Unity I have the connection working properly and am able to post and receive information from the database, but I am unable to do so from the headset. I have the Internet permission required, but can't for the life of me figure out why the application is not connecting to the database when run standalone. The database is not local, so it's not one of the obvious localhost problems. Any help is appreciated!
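Direct SQL connections from a standalone Android build can fail for several reasons (managed connector assemblies being stripped by IL2CPP, TLS, the host blocking raw TCP from unknown clients). A common pattern, offered here as a suggestion rather than a diagnosis of this particular failure, is to put a small HTTP API in front of the database and call it with UnityWebRequest; the URL below is a placeholder:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: talk to the database through an HTTP endpoint instead of a
// direct MySQL connection from the headset. The URL is a placeholder.
public class DatabaseClient : MonoBehaviour
{
    const string endpoint = "https://example.com/api/scores"; // hypothetical

    public IEnumerator FetchScores()
    {
        using (UnityWebRequest req = UnityWebRequest.Get(endpoint))
        {
            yield return req.SendWebRequest();

            if (req.result == UnityWebRequest.Result.Success)
                Debug.Log($"Response: {req.downloadHandler.text}");
            else
                Debug.LogError($"Request failed: {req.error}");
        }
    }
}
```

This also keeps database credentials off the device, which a shipped APK cannot protect.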
D3D11 Error

Hi there! I'm currently making a VR game for some events with the Oculus Rift S. All software versions are: Unity 2019.1.4f1, Oculus Utilities 1.28.0, OVRPlugin 1.28.0, SDK 1.38.0. On our last demo, after running for a few hours, 4 of our 5 computers (same software and hardware on all of them) crashed with the same error: "D3D11: Failed to create RenderTexture (1504 x 1616 fmt 19 aa 8), error 0x8007000e". In my log file I can find this error too: "(Filename: C:\buildslave\unity\build\Runtime/GfxDevice/d3d11/TexturesD3D11.cpp Line: 442)" DynamicHeapAllocator allocation probe 2 failed - Could not get memory for large allocation 27859184. d3d11: failed to create 2D texture shader resource view id=433 [D3D error was 80070057]". A stack trace in the log gave me this line: C:\Users\depinxi\Desktop\dist\Joute_Data\Plugins\OVRPlugin.dll:OVRPlugin.dll (00007FF8D0AB0000), size: 5017600 (result: 0), SymType: '-deferred-', PDB: '', fileVersion: 1.28.0.0". My game needs to run between 8 and 10 hours a day... Do you have any idea how to solve this? Probably a memory leak? Thank you for any reply!
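Since the failing allocation points at memory exhaustion, logging allocated memory over a long session can confirm a leak well before the D3D11 error hits (a diagnostic sketch using Unity's Profiler API; the one-minute interval is arbitrary):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Profiling;

// Logs total allocated and reserved memory periodically, so a slow leak
// shows up in the log long before an out-of-memory D3D11 failure.
public class MemoryWatchdog : MonoBehaviour
{
    IEnumerator Start()
    {
        var wait = new WaitForSeconds(60f); // arbitrary interval
        while (true)
        {
            long allocated = Profiler.GetTotalAllocatedMemoryLong();
            long reserved  = Profiler.GetTotalReservedMemoryLong();
            Debug.Log($"alloc={allocated / (1024 * 1024)} MB  " +
                      $"reserved={reserved / (1024 * 1024)} MB");
            yield return wait;
        }
    }
}
```

If the numbers climb steadily across hours, the next step is finding what is allocated each frame (textures and render targets are the usual suspects given the failing RenderTexture creation).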