WebGPU Compute into WebXR on Quest
Anyone know the expected date for Meta Quest's WebGPU-WebXR layer? I just purchased a Meta Quest 3, to complement my Quest 2, for WebXR development *with* WebGPU (a compute-shader-only voxel/SDF engine), and found that the Meta Quest Browser doesn't support WebGPU-WebXR, a Chromium 134 stable feature. Surprising, since the Quest 3 is a flagship XR device in terms of sales, popularity, and development. Reference check here: https://immersive-web.github.io/webxr-samples/webgpu/

I've web-searched extensively yet not found a workaround or flag to set, or anything to do other than the suggestion to copy the WebGPU output into a WebGL context (wasting bandwidth/VRAM on copying the XR canvas?). Am I missing anything? Thx
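For what it's worth, both paths mentioned above can be sketched in plain JavaScript: feature-detecting the WebXR/WebGPU binding, and the copy-into-WebGL fallback. This is a minimal sketch, not a tested Quest recipe; `XRGPUBinding` comes from the WebXR/WebGPU Binding draft spec used by the linked samples, and its availability on any given browser is exactly the open question here.

```javascript
// Feature-detect WebGPU-into-WebXR support. Both WebGPU itself and the
// XR binding interface must be present; as of writing, the Meta Quest
// Browser exposes neither for WebXR sessions.
function supportsWebGPUXR() {
  return typeof navigator !== 'undefined' &&
         'gpu' in navigator &&
         typeof XRGPUBinding !== 'undefined';
}

// Fallback: render with WebGPU to an offscreen canvas, then upload each
// frame into a WebGL texture consumed by the XRWebGLLayer. This is the
// extra per-frame copy (bandwidth/VRAM cost) the post complains about.
function copyWebGPUFrameToWebGL(gl, glTexture, webgpuCanvas) {
  // texSubImage2D accepts a canvas as its pixel source, so the WebGPU
  // canvas can be uploaded directly; using texSubImage2D (rather than
  // texImage2D) avoids reallocating the texture after the first upload.
  gl.bindTexture(gl.TEXTURE_2D, glTexture);
  gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0,
                   gl.RGBA, gl.UNSIGNED_BYTE, webgpuCanvas);
}
```

The detection function is the useful part today: gating on it lets the same page take the native WebGPU-WebXR path on desktop Chromium 134+ while falling back to the copy path on the Quest browser.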
Unity WebRTC stream from iOS companion app → Quest headset connects but displays black frames

Hello everyone, my name is Mason. I'm a graduate student at Kennesaw State University and a Research Engineer working in XR environments through my Graduate Research Assistant role. I'm currently building a research prototype that connects a mobile companion application to a VR headset so that a VR user can view media stored on their phone inside a VR environment. The system uses a Unity-based mobile application to stream video frames to a Unity-based VR application using WebRTC.

Environment

Sender:
- Device: iPhone 15
- OS: iOS 26.3
- Engine: Unity 6000.3.8f1 (Unity 6.3)
- Graphics API: Metal

Receiver:
- Device: Meta Quest Pro headset (Unity application)

Streaming technology: Unity WebRTC package

Architecture
- Mobile Unity app acts as the WebRTC sender
- Quest Unity app acts as the WebRTC receiver
- Connection established over LAN
- UDP used for discovery
- TCP used for signaling
- Video source: Unity RenderTexture

Goal

The goal of the system is to allow a VR user to browse and view media stored on their phone inside a VR environment. The pipeline currently works as follows:
1. The mobile Unity app renders media content to a RenderTexture
2. The RenderTexture is used to create a WebRTC video track
3. The video track is streamed to the headset
4. The Quest app receives the track and displays it on a surface inside the VR scene

Current Status

Connection setup appears to work correctly. Observed behavior:
- Discovery between devices works
- Signaling connection succeeds
- ICE candidates are exchanged successfully
- PeerConnection state becomes Connected
- Video track is created and negotiated

However, the Quest application displays only black frames.

Sender (iOS) Behavior

Inside the phone application, the RenderTexture displays correctly and the scene renders normally. Frames appear correct locally inside the Unity scene. Despite this, the Quest receiver does not display them.

Receiver (Quest) Behavior

On the Quest side, the WebRTC connection establishes successfully and the video track appears active. The video texture updates, but the displayed output is completely black.

Expected behavior: the frames rendered on the phone should appear in the VR scene on the Quest headset.

Actual behavior: the WebRTC connection works, but the Quest receiver only shows black frames.

Things I Am Investigating
- Unity WebRTC compatibility with Unity 6.3
- Metal texture capture limitations on iOS
- RenderTexture pixel format compatibility
- GPU readback or synchronization issues
- Differences between desktop Unity WebRTC streaming and iOS streaming

Questions
- Has anyone successfully streamed Unity RenderTextures from iOS to Quest using WebRTC?
- Are there known compatibility issues with Metal-based textures being used as WebRTC sources?
- Are there specific RenderTexture formats or texture types required for WebRTC on Quest?
- Could this behavior indicate a GPU synchronization or pixel format issue?

I can provide Unity console logs, WebRTC negotiation logs, screenshots of sender and receiver output, the RenderTexture configuration, and minimal code snippets if needed. If anyone has experience building mobile-to-Quest streaming pipelines or using WebRTC in XR applications, I would greatly appreciate any guidance. Thank you for your time.
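Not an answer, but a triage suggestion: when the connection reports Connected yet the output is black, the first thing worth checking is whether video frames are actually arriving and being decoded, which separates a sender-side capture problem from a receiver-side display problem. This sketch uses the standard browser-style WebRTC stats API; Unity's `Unity.WebRTC` package exposes an analogous `GetStats` call, and mapping these field names onto its wrapper is an assumption on my part.

```javascript
// Pull the inbound video RTP stats from a peer connection. The
// interpretation: framesDecoded increasing while the display stays
// black points at a pixel-format/capture problem on the sender;
// framesDecoded stuck at 0 points at the track or encoder never
// producing frames in the first place.
async function inboundVideoStats(peerConnection) {
  const report = await peerConnection.getStats();
  for (const stats of report.values()) {
    if (stats.type === 'inbound-rtp' && stats.kind === 'video') {
      return {
        framesReceived: stats.framesReceived,
        framesDecoded: stats.framesDecoded,
        bytesReceived: stats.bytesReceived,
      };
    }
  }
  return null; // no inbound video RTP stream found
}
```

Polling this once a second on the receiver and logging the deltas usually narrows "black frames" to one side of the pipeline within minutes.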
Meta Quest 3, Unreal 5.7: Need help with trying to get Opacity to work.

Hi, is there a way to get Opacity to work on Meta Quest 3? The simplest and most obvious way in UE5.7 refuses to work on the headset, but I was wondering if there is some workaround for this? I've looked everywhere for clues on how to get it working, but googling around hasn't yielded any results. I'd be most grateful for any advice and help I can get.
Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses

Hi everyone! I'm a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I'd love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would love to have it on my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I've also submitted this through support channels, but wanted to share here in case the team is gathering feedback.
Question about unexpected 16:9 thumbnails shown in Meta Air Link (Steam app, Unity)

Hello, I am trying to identify the source file or mechanism used for the thumbnails displayed in Meta Quest Air Link, as I am seeing unintended images.

Issue description

When I start Air Link on Meta Quest and open Steam on the PC, the following thumbnails are displayed:
- The thumbnail shown when Steam appears in Air Link
- The 16:9 thumbnails shown in the Library / Apps view after launch

In both cases, the thumbnails displayed are not the images I expect.

Application setup
- The application is launched from Steam on the PC
- Rendering is done via Air Link
- The application is not registered as a Meta (Oculus) app
- The application is not registered via an OpenVR / SteamVR .vrmanifest
- The application is developed using Unity

What I want to understand

I am not trying to customize the thumbnail at this stage. I want to accurately identify where these thumbnails are coming from. Specifically:
1. Which file or resource is used as the source of the thumbnail shown in Air Link?
   - Executable icon
   - Windows window thumbnail (DWM)
   - Steam library image
   - Meta Quest Link internal cache
   - Other mechanism
2. Are the thumbnails generated dynamically from a running window, or loaded from a static image file?
3. Is there any documentation describing how Air Link selects these thumbnails for Steam-launched applications?

Any clarification on how Air Link determines and displays these thumbnails would be greatly appreciated. Thank you for your time.
OVROverlayCanvas broken across whole project

I decided to try out OVROverlayCanvas with a small test, and sure enough, it turned my fuzzy UI into crisp, perfect-quality UI. So I made the necessary changes across the entirety of my project, and things were looking great in the build. Then, for no apparent reason, every instance of OVROverlayCanvas across multiple scenes broke while running in a build (see the before/after screenshots). Between when it was working and when it failed, the scene from which I took those screenshots had no changes at all. Any suggestions? I'm working in Unity 6.3.1f1 with Meta XR SDKs v83.0.1 and OpenXR Plugin v1.16.1.
Error with Unity Vulkan Dynamic Resolution Despite Supported Version?

I'm on a project using Unity 6000.0.58f2 and can't get dynamic resolution working as expected in a build for the Quest 3. OVRMetrics shows that dynamic resolution isn't working (the resolution is fixed), and when I activate dynamic resolution I get the following error:

"Error: Unity Vulkan Dynamic Resolution is not supported on your current build version. Ensure you are on Unity 2021+ with the Oculus XR plugin v3.3.0+ or the Unity OpenXR plugin v1.12.1+"

However, our project is on version 4.5.2 of the Oculus XR Plugin, which the error message suggests should be supported. Has anyone else hit this error, and is anyone aware of a solution (or a known-working combination of engine and plugin versions)?
Stylized passthrough: How can I retexture walls?
The Meta Horizon documentation on Scenes gives this image as an example of a basic stylized passthrough. It looks to me like a screenshot of a stylized hallway. How can such an effect be accomplished in Kotlin, without using Unity or Unreal? Can this effect also be achieved on Quest 2, or only on Quest 3(s)? The article mentions that Assisted Scene Capture (available on Quest 3(s) only) shouldn't be used to create such an effect.