Standalone Vulkan in headset not working correctly - Huge Flickering, Incorrect frames
Hello, I'm working on a game with The Mirror (https://www.themirror.space/). We've spoken to support at Godot Engine, and they mentioned that the Quest 3 has a Vulkan issue where the headset flickers intensely. We are experiencing this issue only on the Meta Quest 3; on Mac, Windows, Linux, and the Quest 2 we don't have the same rendering problem. I have made a video of the issue: https://www.youtube.com/watch?v=gd4NBKl0Gro

The team at Godot Engine have tried to resolve this problem before for quite a lot of users, but haven't been able to locate the source of the issue, other than that it seems to be driver-related for the Adreno 740 in the Meta Quest 3. I am not a driver/rendering expert and would appreciate any help with this issue. The app uses the Vulkan 1.0 and 1.1 APIs, and Vulkan 1.2 is also available on the device. I can provide full source code. Thanks, Gordon MacPherson
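One low-level detail worth checking when comparing devices and API versions like the ones mentioned above: the `apiVersion` a driver reports in `VkPhysicalDeviceProperties` is a packed 32-bit integer. A small Python sketch of the classic `VK_MAKE_VERSION` encoding from `vulkan_core.h` (the helper names here are illustrative), handy when decoding raw version numbers pulled out of logs:

```python
# Vulkan packs a version into 32 bits:
# bits 31..22 = major, bits 21..12 = minor, bits 11..0 = patch
# (the VK_MAKE_VERSION / VK_VERSION_* macros from vulkan_core.h).

def make_version(major, minor, patch):
    return (major << 22) | (minor << 12) | patch

def decode_version(v):
    return (v >> 22, (v >> 12) & 0x3FF, v & 0xFFF)

# A driver that reports apiVersion 1.1.x supports Vulkan 1.1 but not 1.2:
major, minor, _ = decode_version(make_version(1, 1, 128))
supports_1_2 = (major, minor) >= (1, 2)  # False for a 1.1 driver
```

Comparing the decoded driver version across the Quest 2 and Quest 3 is a cheap first step before digging into driver-specific workarounds.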

Issues with APK Build Rendering and Body Tracking on Quest 3 in Unity

Hello everyone, I'm currently working on a project in Unity for the Meta Quest 3, and I'm facing several challenges that I hope some of you might be able to help me address.

Rendering Issues in APK Build: When I run my scene directly from Unity on my PC connected to the Quest 3 via Link, everything renders correctly. However, when I create an APK and run it on the Quest 3, certain elements like water and body models do not render properly. The body model is not visible, and the water in the scene does not display at all.

Body Tracking: Alongside the rendering issues, body tracking does not function as expected in the APK build. When connected to the PC, body tracking works fine, but this functionality breaks down in the standalone APK.

Configuration Details: Unity version 2021.3.26f1. XR Plugin Management is configured for Oculus. I've tried various settings for Android texture compression and multiview rendering. My scene includes body tracking and uses the OVR plugins. I've attached the settings I'm using for the Android build and XR Plugin Management for reference.

Has anyone else encountered similar issues, or can anyone offer insight into what might be going wrong? Any suggestions on how to ensure the APK build maintains fidelity with the editor preview would be incredibly helpful. https://youtube.com/shorts/BzQq_aj-mfo Thank you in advance for your help!

Unity Build extremely broken

Hello there, I've been making a small VR scene in Unity. Everything is fine when I'm playing in the editor, but when I make a build and load it through the Meta Quest Developer Hub, it looks like the picture below. I've tried disabling the lightmaps and making everything realtime, because I thought that could have been the issue, but it didn't fix anything. I can't really google it because I'm not sure what is happening, and every search I've tried didn't help. Some months ago, around September, I was able to build the scene just fine. Since then I've added some extra models and an extra interactable, but nothing else that should break the build like that. I'm using v56 of the Oculus Integration package because that's the package I had when I started the project and I didn't want to break anything by upgrading. Any ideas? One extra problem: even though the game only uses hand tracking, the Quest is forcing me to use controllers and won't even let me launch the app.

Edit: I upgraded the project to the latest package (Meta XR All-in-One), but unfortunately it didn't help.

Edit 2: I finally realised the problem was an ice shader I was using on an ice cube in front of the camera. I was rotating, and whenever the cube went out of view everything looked fine. Here is the shader I was using, which was working perfectly fine while testing through Air Link, mind you. I don't know why, but it breaks the rendering when I have it. I will look into it further in my free time, but if anyone has a guess, feel free to reply: https://github.com/lindenreid/Unity-Shader-Tutorials/blob/master/Assets/Materials/Shaders/ice.shader

Unity OVROverlayCanvas appears as zoomed in when using Dynamic Resolution

We are trying to optimize our game using Dynamic Resolution. We followed this document: Dynamic Resolution: Unity | Oculus Developers. While trying to fix the last known issue listed in the document (Additional RP when scaling != 1), we found that all the OVROverlayCanvas instances were broken, showing larger images and text as if they were zoomed in. Has anybody run into the same problem? Are there any solutions for it? Unity version: 2021.3.32f1. Core SDK version: 62.0.0.
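For readers unfamiliar with this failure mode, here is a hypothetical illustration (not the OVR API, and not a claimed diagnosis of this specific bug) of why a resolution-scale mismatch shows up as magnification: with dynamic resolution at scale s, content only fills an s-sized fraction of its texture, so if the region the compositor samples disagrees with the region the content actually occupies, everything is uniformly scaled.

```python
# Content occupies fraction `content_extent` of its texture; the layer
# samples UV fraction `sampled_extent` and stretches it onto the quad.
# When the two agree, magnification is 1.0; when a smaller region is
# sampled than the content occupies, everything looks zoomed in.

def magnification(content_extent, sampled_extent):
    return content_extent / sampled_extent

# Example: canvas rendered at full size, but sampled with the eye
# buffer's dynamic-resolution scale applied where it should not be:
zoom = magnification(1.0, 0.8)  # content appears 1.25x too large
```

If that is the mechanism here, the fix direction would be making the overlay's sampled rectangle track the same scale the canvas was rendered at.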

flipbook in oculus quest does not render properly

Dear all, I have a very basic material being rendered on a spherical mesh. It appears correctly at the beginning, but then starts changing recursively, with the rendered pattern growing in size each cycle, as in this video: youtu.be/7s9cxypgUWE Here is the flipbook I used: Any comment is highly appreciated.
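The post does not include the material setup, but a typical flipbook samples one cell of a texture atlas per frame; if the UV offset is accumulated every frame instead of being recomputed from the frame index, the sampled region drifts and grows each cycle, which would match the behaviour in the video. A minimal sketch of the per-frame UV math (the helper name and top-left layout convention are illustrative):

```python
# Per-frame UV placement for a flipbook atlas laid out in a grid.
# The offset and scale are recomputed from scratch for each frame index,
# never accumulated, so the sampled region cannot drift or grow.

def flipbook_uv(frame, cols, rows):
    """UV offset and scale for `frame` in a cols x rows atlas,
    frame 0 in the top-left cell."""
    frame %= cols * rows          # loop the animation
    col = frame % cols
    row = frame // cols
    scale = (1.0 / cols, 1.0 / rows)
    # Texture V runs bottom-to-top, so flip the row index.
    offset = (col * scale[0], 1.0 - (row + 1) * scale[1])
    return offset, scale
```

In a shader the equivalent would be `uv * scale + offset`, with `frame` driven by time.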

Smooth Portals rendering in Unity / Oculus Quest

Hello everyone, I'm working on a game that needs the player to be able to go through different portals, and I want the portals to render the other side. I found exactly the tutorial I needed on Brackeys' channel: Smooth PORTALS in Unity. But because I'm working in VR, his project doesn't entirely work. The problem is that the view in the portals is distorted, as if both the left and right eyes were rendering at the same time from the same point. I think it's because I'm only using the center eye of the OVRCameraRig, not the left and right eyes, and not using a render texture for each of them; but when I tried that, it didn't work either. When I try to put several OVRCameraRigs in the scene (in place of the Camera in his videos), my OVRPlayerController stops working (I can't move my hands or myself). I don't know where to start solving this problem. Thanks in advance for your help.
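For context on the distortion described above: a stereo portal has to be captured from each eye's position separately, not once from the center eye, because the two eyes view the portal surface from points roughly an IPD apart. A minimal sketch of deriving the two capture positions from the head position (all names are illustrative; `right_dir` is assumed to be the head's unit right vector):

```python
# Capturing the portal view once from the center eye and showing it to
# both eyes produces exactly the "both eyes rendered from the same
# point" distortion. Instead, pose a capture per eye:

IPD = 0.064  # a typical interpupillary distance, in metres

def eye_positions(head_pos, right_dir, ipd=IPD):
    half = ipd / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right
```

Each portal capture would then render from these positions (with the matching per-eye projection matrix) into a separate render texture for the left and right eye, rather than duplicating whole OVRCameraRigs.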

How does Remote rendering support for Unity work?

The v13 release notes say: "Remote rendering support for Unity has been added so developers can more easily build and test Hand Tracking apps through their PC." I can't find any documentation on this. At first I thought it would work like Android remote rendering, by selecting the Quest under Project Settings / Editor / Unity Remote Device, but for that to work I would need the Unity 5 Remote app from the Play Store on the Quest. Can we please get a little push in the right direction? Building every time is so time-consuming.

Kernels/Convolution Shaders (Blur, Bloom, Edge detection, etc.) on the Quest

So I'm working on converting my Rift apps to the Quest, and it's been challenging, to say the least. I'm working through a lot of performance optimizations, but one in particular brings me here to seek wisdom: mobile GPUs render one tile at a time, which makes convolution operations (like Gaussian blur, edge detection, bloom, etc.) very expensive. This is because when you sample neighboring pixels, those samples are sometimes outside of the current tile. I'm wondering if anyone has found ways to efficiently perform blurs, blooms, and other kernel convolution operations on mobile? Some of my apps depend on grab-pass blurs. I'm working in Unity 2018.3.x at the moment.
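One widely used mitigation for the problem described above: a Gaussian kernel is separable, so an NxN blur can be split into a horizontal N-tap pass followed by a vertical N-tap pass. That cuts the samples per pixel from N*N to 2N and keeps each pass's reads in a narrow band, which suits a tiled GPU far better than a wide 2D grab-pass kernel (and is usually combined with blurring at reduced resolution). A pure-Python sketch of the idea on a grayscale image, with edge clamping:

```python
# Separable blur: 1D kernel applied along rows, then along columns.
# Equivalent to the full 2D Gaussian kernel, at a fraction of the taps.

def blur_1d(row, kernel):
    r = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - r, 0), len(row) - 1)  # clamp at edges
            acc += w * row[j]
        out.append(acc)
    return out

def separable_blur(img, kernel):
    # Horizontal pass over each row.
    tmp = [blur_1d(row, kernel) for row in img]
    # Vertical pass: transpose, blur rows, transpose back.
    cols = [blur_1d(col, kernel) for col in zip(*tmp)]
    return [list(row) for row in zip(*cols)]

KERNEL = [0.25, 0.5, 0.25]  # small 3-tap Gaussian approximation
```

In a shader the two passes become two fullscreen draws with a handful of texture fetches each, instead of one draw fetching an N*N neighborhood.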