Render to only one eye
Hello, I'm after a bit of information about who I should try contacting regarding making VR less GPU-wasteful for my particular situation. I'm blind in my right eye, so I was hoping there could be a driver-level option that would let users say "hey, as much as I'd love two working eyes, it would be real nice if I could just render a black image to the right eye and benefit from better FPS in the left". I get that stereoscopic is better, given the choice, but for me it's not a choice I have, so I may as well benefit from improved frame rates or better image quality. I was born blind in one eye, so I've had plenty of time to get used to the lack of depth perception; however, I can still take visual cues from the surrounding environment on a virtual race track to know when to out-brake you :wink: Back on track though (pun intended), would it be a driver-level (pun unintended) development, or maybe SteamVR? Something where we don't have to rely on every game studio to provide it as an option; instead we can just set it and forget it at a lower level. I have a degree in games programming, so I'd be happy to get my fingers dirty if pointed in the right direction, if Oculus isn't prepared to implement something like this. Thanks in advance, Kevin
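Short of a driver-level switch, an engine can already approximate this per app. Below is a minimal Unity C# sketch (an illustration, not an official Oculus or SteamVR feature; it assumes a Unity project, and the class name is made up) that restricts a camera to the left eye via `Camera.stereoTargetEye`:

```csharp
using UnityEngine;

// Hypothetical per-app workaround, not the driver-level option asked for.
// Attach to the XR camera: Unity then submits rendering work for the
// left eye only, skipping the right-eye draw entirely.
[RequireComponent(typeof(Camera))]
public class LeftEyeOnly : MonoBehaviour
{
    void Start()
    {
        // StereoTargetEyeMask.Left tells Unity to render this camera
        // to the left eye buffer only.
        GetComponent<Camera>().stereoTargetEye = StereoTargetEyeMask.Left;
    }
}
```

Whether the runtime still allocates and reprojects a right-eye buffer is compositor-dependent, so the actual GPU savings vary by headset and runtime.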
Why not enable Fixed-Foveated Rendering for intermediate render texture?

I found that Fixed Foveated Rendering (FFR) works well when I don't use any render features or post-processing. But if I add a pass that requires rendering into an intermediate render texture, FFR is disabled for those render passes and only enabled in the FinalBlit pass. I checked the documentation, and it also says FFR is not recommended in this situation: https://docs.unity3d.com/Packages/com.unity.xr.oculus@4.1/manual/index.html I also checked the Vulkan documentation and found nothing that forbids using FFR in a render pass that renders into an intermediate render texture. Is there an option that could force-enable FFR, or am I missing something that prevents FFR from being used in this case?
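For what it's worth, the FFR level can at least be requested explicitly at runtime through the same com.unity.xr.oculus package the linked manual documents. A hedged sketch follows; it won't override the runtime's decision to drop FFR for intermediate-texture passes, but its return value shows whether the request was accepted at all:

```csharp
using UnityEngine;
using Unity.XR.Oculus; // com.unity.xr.oculus package

// Sketch: request a fixed foveation level explicitly at runtime.
// Levels run from 0 (off) to 4 (highest); the bool return value
// reports whether the runtime accepted the request.
public class RequestFFR : MonoBehaviour
{
    [Range(0, 4)] public int level = 3;

    void Start()
    {
        bool accepted = Utils.SetFoveationLevel(level);
        Debug.Log($"FFR level {level} accepted: {accepted}");
    }
}
```

Logging the result on device (via logcat) at least confirms whether the level was ever applied before the intermediate-texture pass disabled it.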
Game doesn't render after returning from android permissions

We have two permission requests at the start of the game:

* Spatial permissions
* Bluetooth permissions

Sometimes, after accepting these permissions and returning to the game, nothing is rendered, but the game is actually still running: music and sounds play, hand tracking works, and you can press buttons, etc. (though you can't see them). When you bring up the system menu/game title screen through the Meta system hand-pinch gesture and then "resume" the game, everything starts rendering again. Another issue that can happen (which seems related) is that after accepting a permission, the app doesn't return to the game; it brings up the screen you get when you open the system menu. You then need to tap resume to continue the game. In our case you immediately get asked for the second permission, and it returns to the system menu again. This happens slightly more frequently than the issue above. We don't get both issues at the same time. The issue is reproducible about 50% of the time. Any advice would be highly appreciated. Cheers, Steven, Fantail Games.
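One common mitigation is to serialize the permission dialogs so the app is paused and resumed only once per request, rather than having the second dialog fire while the first resume is still in flight. A sketch using Unity's `UnityEngine.Android.Permission` API (the permission strings are assumptions; substitute whatever your manifest actually declares):

```csharp
using UnityEngine;
using UnityEngine.Android;

// Sketch: request permissions one at a time, waiting for each dialog's
// callback before starting the next, so the app only goes through one
// pause/resume cycle per dialog. Permission strings are assumptions.
public class SequentialPermissions : MonoBehaviour
{
    const string Spatial = "com.oculus.permission.USE_SCENE";        // assumed
    const string Bluetooth = "android.permission.BLUETOOTH_CONNECT"; // assumed

    void Start() => Request(Spatial, () => Request(Bluetooth, null));

    void Request(string permission, System.Action next)
    {
        if (Permission.HasUserAuthorizedPermission(permission))
        {
            next?.Invoke();
            return;
        }
        var callbacks = new PermissionCallbacks();
        callbacks.PermissionGranted += _ => next?.Invoke();
        callbacks.PermissionDenied += _ => next?.Invoke();
        Permission.RequestUserPermission(permission, callbacks);
    }
}
```

This doesn't fix the underlying resume/rendering bug, but it removes the overlapping-dialog case described above, which is the variant reported as more frequent.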
Black grid on camera frame when displaying point cloud

I am developing an application that receives and displays a point cloud. When I try to render the point cloud (either with a normal mesh or with a MaterialPropertyBlock on the GPU), I get what seems to be a black transparent canvas that follows my head. This is very annoying. I tried changing materials and shaders and disabling shadows, but I haven't found a way to get rid of this problem.
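For reference, a typical GPU point-cloud path that avoids per-point meshes looks roughly like the sketch below. The material and its `_Positions` buffer property are assumptions (the shader would read them via `SV_VertexID`), and the head-locked black canvas described above may well come from the compositing/passthrough layer rather than the point rendering itself:

```csharp
using UnityEngine;

// Sketch: draw a point cloud from a ComputeBuffer with no per-point
// GameObjects. "pointMaterial" and its "_Positions" property are
// assumptions; the shader indexes the buffer with SV_VertexID.
public class PointCloudRenderer : MonoBehaviour
{
    public Material pointMaterial;
    ComputeBuffer positions;
    int count;

    public void SetPoints(Vector3[] pts)
    {
        positions?.Release();
        count = pts.Length;
        positions = new ComputeBuffer(count, sizeof(float) * 3);
        positions.SetData(pts);
        pointMaterial.SetBuffer("_Positions", positions);
    }

    void Update()
    {
        if (count == 0) return;
        // Generous bounds so the cloud is never frustum-culled away.
        var bounds = new Bounds(transform.position, Vector3.one * 100f);
        Graphics.DrawProcedural(pointMaterial, bounds, MeshTopology.Points, count);
    }

    void OnDestroy() => positions?.Release();
}
```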
Full Screen Pass Renderer Feature Not Working on Build

Hello, I'm trying to implement an outline post-processing shader. I've found a simple shader graph that uses the "Full Screen Pass Renderer Feature"; the video is here. In the project I'm using Meta's All-in-One SDK with passthrough and Meta's Depth API. Everything seems to work fine in Scene and Game mode over Link. However, when I build the project, the objects shift around to undesired places and passthrough gets disabled. I figured out that the outlines seem to be in the correct location, but the rendering of the red cube moves around. I have been trying to debug it but could not find the cause of this issue. I'm not sure if these screenshots will help, but here they are anyway.
Standalone Vulkan in headset not working correctly - Huge Flickering, Incorrect frames

Hello, I'm working on a game with The Mirror (https://www.themirror.space/). We've spoken to support at Godot Engine, and they mentioned that the Quest 3 has a Vulkan issue where the headset flickers intensely. We are experiencing this issue only on the Meta Quest 3; on Mac, Windows, Linux, and the Quest 2 we don't have the same rendering problem. I have made a video of the issue: https://www.youtube.com/watch?v=gd4NBKl0Gro The team at Godot Engine has tried to resolve this problem for quite a lot of users but hasn't been able to locate the source of the issue, other than that it seems to be driver-related for the Adreno 740 in the Meta Quest 3. I am not a driver/rendering expert and would appreciate any help with this issue. The app uses Vulkan 1.0 and Vulkan 1.1; Vulkan 1.2 is available too. I can provide full source code. Thanks, Gordon MacPherson
Make all materials in all samples URP

Dear Meta staff, can you please make ALL materials URP in all the Unity samples? I don't know anyone using the Built-in Render Pipeline in their Quest projects anyway... I think this would add a lot of value to the samples offered by Meta, without us having to duplicate materials and reassign them because the URP converter can't change some assets in the Packages folder... Thanks!
Issues with APK Build Rendering and Body Tracking on Quest 3 in Unity

Hello everyone, I'm currently working on a project in Unity for the Meta Quest 3, and I'm facing several challenges that I hope some of you might be able to help me address.

Rendering issues in the APK build: When I run my scene directly from Unity on my PC connected to the Quest 3 via Link, everything renders correctly. However, when I create an APK and run it on the Quest 3, certain elements such as the water and the body models do not render properly: the body model is not visible, and the water in the scene does not display at all.

Body tracking: Alongside the rendering issues, body tracking does not function as expected in the APK build. When connected to the PC, body tracking works fine, but this functionality breaks down in the standalone APK.

Configuration details:

* Unity version: 2021.3.26f1
* XR Plugin Management is configured for Oculus.
* I've tried various settings for Android texture compression and multiview rendering.
* The scene includes body tracking and uses the OVR plugins.

I've attached the settings I'm using for the Android build and XR Plugin Management for reference. Has anyone else encountered similar issues, or can anyone offer insight into what might be going wrong? Any suggestions on how to ensure the APK build maintains fidelity with the editor preview would be incredibly helpful. https://youtube.com/shorts/BzQq_aj-mfo Thank you in advance for your help!
Designated Photogrammetry App?

When you open a mixed reality app on the Quest 3, it has you take a scan of your space so the game knows where to place things in the 'real world'. When I saw this happening for the first time, I was mesmerized. The room I was in was very quickly translated into untextured polygons that only got more accurate the more I looked around things; for example, my fan went from a large cylinder to an extremely accurate model as I moved above and around it. I would love to see whether the Quest 3 could house a dedicated photogrammetry app, seeing as the depth sensors could make seriously accurate models in real time. If it did exist, I personally wouldn't change a thing about the model-capture portion, but adding a texture-mapping second stage would also be so cool. This is just an idea, as I have no idea how to code, but I'm not above starting if someone could point me in the right direction, because, well, I have no idea where I'd start with making software for the Quest 3.
Unity Build extremely broken

Hello there, I've been making a small VR scene in Unity. Everything is fine when I'm playing in the editor, but when I make a build and load it through the Meta Quest Developer Hub, it looks like the pic below! I've tried disabling the lightmaps and making everything realtime, because I thought that could have been the issue, but it didn't fix anything. I can't really google it because I'm not sure what is happening, and every search I tried didn't help. Some months ago, in September, I was able to build the scene just fine. Since then I've added some extra models and an extra interactable, but nothing else that should break the build like that. Also, I'm using v56 of the Oculus Integration package, because that's the package I had when I started the project and I didn't want to break anything by upgrading (lmao). Any ideas? An extra problem: even though the game only uses hand tracking, the Oculus is forcing me to use controllers and won't even let me launch the app.

Edit: I upgraded the project to the latest package (Meta XR All-in-One), but unfortunately it didn't help.

Edit 2: I finally realised the problem was an ice shader I was using for an ice cube in front of the camera. I was literally rotating, and when the cube went out of view everything looked fine. Here is the shader I was using, which was working perfectly fine while testing through Air Link, mind you. I don't know why, but it breaks the rendering when I have it. I will look further into it in my free time, but if anyone has a guess, feel free to reply: https://github.com/lindenreid/Unity-Shader-Tutorials/blob/master/Assets/Materials/Shaders/ice.shader
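A generic way to catch this class of problem at runtime is to audit scene materials for shaders the device GPU rejects and swap in a known-good fallback. A hedged sketch (the fallback shader name is an assumption; use whatever shader renders correctly on your device):

```csharp
using UnityEngine;

// Diagnostic sketch: when a build renders differently from the editor,
// walk the scene and flag materials whose shader isn't supported on the
// target GPU, optionally swapping in a known-good fallback shader.
// "Universal Render Pipeline/Lit" is an assumption about the project.
public class ShaderAudit : MonoBehaviour
{
    void Start()
    {
        Shader fallback = Shader.Find("Universal Render Pipeline/Lit");
        foreach (var renderer in FindObjectsOfType<Renderer>())
        {
            foreach (var material in renderer.sharedMaterials)
            {
                if (material == null || material.shader.isSupported) continue;
                Debug.LogWarning(
                    $"Unsupported shader '{material.shader.name}' on {renderer.name}");
                if (fallback != null) material.shader = fallback;
            }
        }
    }
}
```

Note that `Shader.isSupported` only catches shaders the GPU refuses outright; a shader that compiles but lacks single-pass stereo support (a common cause of one-eye or view-dependent glitches on Quest) will still pass this check.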