Conflicting Information in the Horizon OS SBC (Shader Binary Cache) Documentation?
The documentation on building a shader binary cache per platform (link) states: "Using this feature, once one user starts the app and manually builds the SBC, all other users with the same device and software (Horizon OS, graphics driver, and app) will be able to avoid the shader generation process by downloading a copy of a pre-computed SBC." However, later on the same page, it states there is automation in place to launch apps and perform scripted prewarming logic if requested: "The system automatically identifies and processes Oculus OS builds and app versions that require shader cache assets. It generates and uploads these assets to the store backend and automatically installs them during an app install or update."

Does this feature support both of those setups? If I am not scripting any custom warmup logic, will shader binary caches still be shared between users with identical setups? I.e., if I simply play the release candidate on the target OS version/hardware, will my SBC be automatically uploaded, or are SBCs only distributed when a scripted warmup sequence is present? Few details are provided about SBCs from other users being uploaded, so I'm curious whether this is an inaccuracy or not.

Thanks, excited to see features like this in Horizon OS. Very important for the first-time user experience.

Unity WebRTC stream from iOS companion app → Quest headset connects but displays black frames
Hello everyone,

My name is Mason. I'm a graduate student at Kennesaw State University and a Research Engineer working in XR environments through my Graduate Research Assistant role. I'm currently building a research prototype that connects a mobile companion application to a VR headset so that a VR user can view media stored on their phone inside a VR environment. The system uses a Unity-based mobile application to stream video frames to a Unity-based VR application using WebRTC.

Environment

Sender
- Device: iPhone 15
- OS: iOS 26.3
- Engine: Unity 6000.3.8f1 (Unity 6.3)
- Graphics API: Metal

Receiver
- Device: Meta Quest Pro headset (Unity application)

Streaming Technology: Unity WebRTC package

Architecture
- Mobile Unity app acts as the WebRTC sender
- Quest Unity app acts as the WebRTC receiver
- Connection established over LAN
- UDP used for discovery
- TCP used for signaling
- Video Source: Unity RenderTexture

Goal

The goal of the system is to allow a VR user to browse and view media stored on their phone inside a VR environment. The pipeline currently works as follows:
- The mobile Unity app renders media content to a RenderTexture
- The RenderTexture is used to create a WebRTC video track
- The video track is streamed to the headset
- The Quest app receives the track and displays it on a surface inside the VR scene

Current Status

Connection setup appears to work correctly. Observed behavior:
- Discovery between devices works
- Signaling connection succeeds
- ICE candidates exchange successfully
- PeerConnection state becomes Connected
- Video track is created and negotiated

However, the Quest application displays only black frames.

Sender (iOS) Behavior

Inside the phone application, the RenderTexture displays correctly and the scene renders normally. Frames appear correct locally inside the Unity scene. Despite this, the Quest receiver does not display the frames.

Receiver (Quest) Behavior

On the Quest side, the WebRTC connection establishes successfully and the video track appears active.
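For anyone hitting the same symptom: a sender-side sketch worth ruling out, assuming the com.unity.webrtc package (2.x API). The class and field names here are hypothetical; the two ideas are that an unsupported RenderTexture format on Metal can negotiate fine yet encode black, and that in 2.x frames are only copied into the encoder while the `WebRTC.Update()` coroutine is running.

```csharp
using Unity.WebRTC;
using UnityEngine;

// Sketch only, not the poster's actual code: illustrates two common causes of
// "connected but black" with the com.unity.webrtc package on iOS/Metal.
public class SenderTrackSetup : MonoBehaviour
{
    [SerializeField] RenderTexture sourceTexture;
    VideoStreamTrack videoTrack;

    void Start()
    {
        // An unsupported texture format can pass negotiation but send black;
        // BGRA32 is a commonly working capture format on Metal.
        if (sourceTexture.format != RenderTextureFormat.BGRA32)
            Debug.LogWarning("RenderTexture format may not be capturable by WebRTC on Metal");

        videoTrack = new VideoStreamTrack(sourceTexture);
        // ... add videoTrack to the RTCPeerConnection here ...

        // In 2.x, no frames reach the encoder unless this coroutine is pumped;
        // if it never starts, the receiver connects but displays black.
        StartCoroutine(WebRTC.Update());
    }
}
```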
The video texture updates, but the displayed output is completely black.

Expected Behavior

The frames rendered on the phone should appear in the VR scene on the Quest headset.

Actual Behavior

The WebRTC connection works, but the Quest receiver only shows black frames.

Things I Am Investigating
- Unity WebRTC compatibility with Unity 6.3
- Metal texture capture limitations on iOS
- RenderTexture pixel format compatibility
- GPU readback or synchronization issues
- Differences between desktop Unity WebRTC streaming and iOS streaming

Questions
- Has anyone successfully streamed Unity RenderTextures from iOS to Quest using WebRTC?
- Are there known compatibility issues with Metal-based textures being used as WebRTC sources?
- Are there specific RenderTexture formats or texture types required for WebRTC on Quest?
- Could this behavior indicate a GPU synchronization or pixel format issue?

I can provide Unity console logs, WebRTC negotiation logs, screenshots of sender and receiver output, RenderTexture configuration, and minimal code snippets if needed. If anyone has experience building mobile-to-Quest streaming pipelines or using WebRTC in XR applications, I would greatly appreciate any guidance. Thank you for your time.

Glitchy rendering on certain camera angles
This bug only appears in a device build and at certain camera angles, sometimes when the player looks up or when the controllers are out of the camera frame. With the default VR controller from the Unity XR Toolkit this glitch doesn't show. I am currently using the prefab controller from the pose detection scene and added the locomotion prefab to add movement. I am using URP for my project.

What I've tried:
- Removed the tunneling vignette object from the controller, in case it was conflicting with the rendering
- Adjusted the near and far clipping planes of the left, center, and right cameras
- Played with the XR settings (Single Pass Instanced / Multiview)

Unity Version: 6000.3.3f1

Question about unexpected 16:9 thumbnails shown in Meta Air Link (Steam app, Unity)
Hello,

I am trying to identify the source file or mechanism used for the thumbnails displayed in Meta Quest Air Link, as I am seeing unintended images.

Issue description

When I start Air Link on Meta Quest and open Steam on the PC, the following thumbnails are displayed:
- The thumbnail shown when Steam appears in Air Link
- The 16:9 thumbnails shown in the Library / Apps view after launch

In both cases, the thumbnails displayed are not the images I expect.

Application setup
- The application is launched from Steam on the PC
- Rendering is done via Air Link
- The application is not registered as a Meta (Oculus) app
- The application is not registered via an OpenVR / SteamVR .vrmanifest
- The application is developed using Unity

What I want to understand

I am not trying to customize the thumbnail at this stage. I want to accurately identify where these thumbnails are coming from. Specifically, which file or resource is used as the source of the thumbnail shown in Air Link?
- The executable icon
- The Windows window thumbnail (DWM)
- A Steam library image
- A Meta Quest Link internal cache
- Some other mechanism

Are the thumbnails generated dynamically from a running window, or loaded from a static image file? Is there any documentation describing how Air Link selects these thumbnails for Steam-launched applications?

Any clarification on how Air Link determines and displays these thumbnails would be greatly appreciated. Thank you for your time.

Overdraw Best Practices
Overdraw is a silent performance killer in VR development. In this workshop, Meta Horizon Start Mentor Sidney breaks down what overdraw is, why it drains GPU resources, and how to fix it in Unity. Learn why relying on Unity's default settings (like a 1 km draw distance) or the SRP Batcher isn't enough to prevent overdraw. Sidney walks through practical, simple solutions including smart level design, utilizing occlusion culling (especially for indoor scenes), and leveraging the Unity Frame Debugger to catch pixel fill issues early in the greybox phase. The session also covers the specific challenges of overdraw in procedurally generated levels. This session was recorded in March 2026 as part of the Meta Horizon Start program.

🎬 CHAPTERS
00:00 - Welcome & Introduction to Overdraw
00:15 - Speaker Intro: Sidney (Angelsin)
01:14 - Defining Overdraw and the Rendering Pipeline
02:08 - The Performance Impact of Overdraw
02:50 - Unity's Role and Limitations in Handling Overdraw
03:19 - Risks of Mesh Combining and Dynamic Objects
04:03 - Demonstrating Overdraw with Scene and Debug Tools
06:15 - Reducing Overdraw: Adjusting Draw Distance
07:55 - Reducing Overdraw: Occlusion Culling
08:21 - Overdraw Challenges in Procedural Generation
09:10 - Using the Frame Debugger for Optimization
10:10 - Conclusion and Summary of Best Practices

🎮 FEATURED IN THIS SESSION
➡️ Unity Frame Debugger: https://docs.unity3d.com/Manual/FrameDebugger.html
➡️ Unity Occlusion Culling: https://docs.unity3d.com/Manual/OcclusionCulling.html

📚 RESOURCES
➡️ Meta Horizon Developer Forum: https://communityforums.atmeta.com/category/horizon-developer-forum
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with hands-on support and expert guidance to accelerate app development. Join a thriving community to access the tools and go-to-market resources you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
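The draw-distance advice from the session can be sketched in a few lines of Unity C#. This is a minimal illustration, not code from the workshop: it shrinks the default 1 km far plane and culls a props layer earlier via per-layer culling distances. "SmallProps" is a hypothetical layer name; substitute one from your own project.

```csharp
using UnityEngine;

// Sketch: reduce overdraw by shortening how far the camera draws.
public class DrawDistanceSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        cam.farClipPlane = 150f; // indoor scenes rarely need Unity's default 1000 m

        int propsLayer = LayerMask.NameToLayer("SmallProps"); // hypothetical layer
        if (propsLayer >= 0)
        {
            // Per-layer culling: entries left at 0 fall back to farClipPlane.
            float[] distances = new float[32];
            distances[propsLayer] = 25f; // stop drawing small props past 25 m
            cam.layerCullDistances = distances;
        }
    }
}
```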
How Can I Turn Off the "Frames Per Second" Overlay on My Quest 3?
Every time I turn on my Quest 3, an unwanted "Frames Per Second" overlay pops up and stays on my screen no matter what I do, and I really want to turn it off! See the attached picture that I mocked up as an example of what I keep seeing, so you'll understand what I'm dealing with. Please help if you can! Thank you in advance!

[Audit] OS 2.1 Nav Bar: AOSP Java Implementation & The Sideloading Paradox
As a Web/Multimedia Developer (B.A. Comm), I am formally auditing the persistent UI limitations in OS 2.1. Meta continues to treat the Global Navigation Bar and System Menus as a static kiosk interface.

The AOSP / Java Reality

The Navigator is a View group within a Java-based AOSP (Android Open Source Project) framework. There is zero architectural excuse for failing to implement transparency (alpha) and color customization. If the community can port Quake to HTML5, Meta's engineering team can certainly implement a basic slider for the UI layer.

The Sideloading Paradox

The claim that UI customization is "too complex" for users is logically inconsistent. A significant portion of this community is already using ADB (Android Debug Bridge) and sideloading to bypass these restrictions. If a user has the technical literacy to sideload an APK, they are more than capable of using a native hex-code color picker or a transparency toggle.

Figure-Ground and Accessibility

By hard-coding Elevation and Ambient/Key shadows without user-definable variables, Meta has created a permanent figure-ground failure. A professional-grade OS requires user-defined transparency to ensure visual accessibility.

Conclusion

We are developers and owners, not tenants. Meta needs to stop gatekeeping basic UI variables. Implement alpha sliders for the Navigation Bar and Menu.

Meta Quest 3, Unreal 5.7: Need help with trying to get Opacity to work.
Hi, is there a way to get Opacity to work on the Meta Quest 3? The simplest and most obvious way in UE 5.7 refuses to work on the headset, but I was wondering if there is some workaround for this? I have looked everywhere for any clues on how to get it to work, but googling around hasn't yielded any results. I'd be most grateful for any advice and help I can get.

WebGPU Compute into WebXR on Quest
Anyone know the expected date for Meta Quest's WebGPU-WebXR layer? I just purchased a Meta Quest 3, to complement my Quest 2, for WebXR development *with* WebGPU (a compute-shader-only voxel/SDF engine), and found that the Meta Quest Browser doesn't support WebGPU-WebXR, a Chromium 134 stable feature. Surprising, since the Quest 3 is a "flagship" XR device (in terms of sales/popularity/development). Reference check here: https://immersive-web.github.io/webxr-samples/webgpu/

I've web-searched extensively yet found no workaround or flag to set, nothing to do other than the suggestion to copy the WebGPU output into a WebGL context (wasting bandwidth/VRAM on copying the XR canvas?). Am I missing anything? Thanks!
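For anyone probing the same gap, a hedged feature-detection sketch: `navigator.gpu` signals WebGPU itself, while `XRGPUBinding` (the interface from the draft WebXR-WebGPU Binding spec) signals the direct XR layer. The interface names come from the specs; the path labels and the function itself are made up for illustration.

```javascript
// Decide which rendering path a browser can support. `global` stands in for
// window so the function is testable; XRGPUBinding is absent on browsers
// without the WebXR-WebGPU layer (currently including the Quest Browser).
function detectXRGraphicsPath(global) {
  const hasWebGPU = !!global.navigator && "gpu" in global.navigator;
  const hasXRGPUBinding = typeof global.XRGPUBinding === "function";
  if (hasWebGPU && hasXRGPUBinding) return "webgpu-xr";  // direct WebGPU XR layer
  if (hasWebGPU) return "webgpu-copy-to-webgl";          // compute in WebGPU, blit via WebGL
  return "webgl-only";                                   // no WebGPU at all
}

// Example: WebGPU present but no XR binding, as described in the post above.
console.log(detectXRGraphicsPath({ navigator: { gpu: {} } })); // "webgpu-copy-to-webgl"
```

Until the binding ships, the copy-into-WebGL fallback the samples suggest is, as far as I know, the only way to get compute-shader output into an immersive session.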