Unity WebRTC stream from iOS companion app → Quest headset connects but displays black frames
Hello everyone,

My name is Mason. I’m a graduate student at Kennesaw State University and a Research Engineer working in XR environments through my Graduate Research Assistant role. I’m currently building a research prototype that connects a mobile companion application to a VR headset so that a VR user can view media stored on their phone inside a VR environment. The system uses a Unity-based mobile application to stream video frames to a Unity-based VR application using WebRTC.

Environment

Sender Device: iPhone 15
OS: iOS 26.3
Engine: Unity 6000.3.8f1 (Unity 6.3)
Graphics API: Metal
Receiver Device: Meta Quest Pro headset (Unity application)
Streaming Technology: Unity WebRTC package

Architecture

- Mobile Unity app acts as the WebRTC sender
- Quest Unity app acts as the WebRTC receiver
- Connection established over LAN
- UDP used for discovery
- TCP used for signaling
- Video Source: Unity RenderTexture

Goal

The goal of the system is to allow a VR user to browse and view media stored on their phone inside a VR environment. The pipeline currently works as follows:

1. The mobile Unity app renders media content to a RenderTexture
2. The RenderTexture is used to create a WebRTC video track
3. The video track is streamed to the headset
4. The Quest app receives the track and displays it on a surface inside the VR scene

Current Status

Connection setup appears to work correctly. Observed behavior:

- Discovery between devices works
- Signaling connection succeeds
- ICE candidates exchange successfully
- PeerConnection state becomes Connected
- Video track is created and negotiated

However, the Quest application displays only black frames.

Sender (iOS) Behavior

Inside the phone application, the RenderTexture displays correctly and the scene renders normally. Frames appear correct locally inside the Unity scene. Despite this, the Quest receiver does not display the frames.

Receiver (Quest) Behavior

On the Quest side, the WebRTC connection establishes successfully and the video track appears active.
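As an aside on the sender side: with the Unity WebRTC package, the capture path sketched under Goal is typically wired up as below. This is a hedged sketch, not the poster's code; the component name, field names, and 1280x720 resolution are illustrative. Two details that commonly produce exactly this symptom (connection negotiates, track is active, frames are black) are an unsupported RenderTexture graphics format on the current graphics API and a missing WebRTC.Update() coroutine.

```csharp
// Hedged sketch of a Unity WebRTC sender capture path (com.unity.webrtc package).
// Names and resolution are illustrative, not from the original post.
using UnityEngine;
using UnityEngine.Experimental.Rendering;
using Unity.WebRTC;

public class PhoneVideoSender : MonoBehaviour
{
    RenderTexture sendTexture;
    VideoStreamTrack videoTrack;

    void Start()
    {
        // Ask the package which GraphicsFormat the active graphics API supports
        // (Metal on iOS); creating the RenderTexture with an arbitrary format is
        // a classic cause of black output on the receiver.
        GraphicsFormat format = WebRTC.GetSupportedGraphicsFormat(SystemInfo.graphicsDeviceType);
        sendTexture = new RenderTexture(1280, 720, 0, format);
        sendTexture.Create();

        videoTrack = new VideoStreamTrack(sendTexture);
        // videoTrack is then added to the RTCPeerConnection during negotiation (not shown).

        // WebRTC.Update() must be pumped every frame, or no frames are ever
        // captured/encoded; the connection still reaches Connected regardless.
        StartCoroutine(WebRTC.Update());
    }
}
```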
The video texture updates, but the displayed output is completely black.

Expected Behavior

The frames rendered on the phone should appear in the VR scene on the Quest headset.

Actual Behavior

The WebRTC connection works, but the Quest receiver only shows black frames.

Things I Am Investigating

- Unity WebRTC compatibility with Unity 6.3
- Metal texture capture limitations on iOS
- RenderTexture pixel format compatibility
- GPU readback or synchronization issues
- Differences between desktop Unity WebRTC streaming and iOS streaming

Questions

- Has anyone successfully streamed Unity RenderTextures from iOS to Quest using WebRTC?
- Are there known compatibility issues with Metal-based textures being used as WebRTC sources?
- Are there specific RenderTexture formats or texture types required for WebRTC on Quest?
- Could this behavior indicate a GPU synchronization or pixel format issue?

I can provide Unity console logs, WebRTC negotiation logs, screenshots of sender and receiver output, RenderTexture configuration, and minimal code snippets if needed. If anyone has experience building mobile-to-Quest streaming pipelines or using WebRTC in XR applications, I would greatly appreciate any guidance. Thank you for your time.


NPCs with Avatar/Style2Meta material are invisible on Meta Quest
I've rigged, animated, and imported avatars into my Unity project and scene as prefabs. I've created new materials for them using the same shader my player avatar uses when he loads in: Avatar/Style2Meta. When playing the project on my laptop, I can see my avatar NPCs with those materials fine. As soon as I build and run the project on my Meta Quest 2, they become invisible. I really like the shader, so I want to keep it and not compromise by using two different materials for player and NPC avatars. Why does this happen only when playing on the headset, and how do I fix it?


USB: Input-Only Microphone Should Not Mute Built-in Speakers, Technical Analysis & Proposed Fix
When any USB-C audio device is connected to Quest 3 — even an input-only microphone with no speaker/DAC capability — Meta Horizon OS routes ALL audio (both input AND output) to the USB-C port and mutes the built-in headset speakers. This contradicts standard Android AOSP behavior and blocks legitimate professional use cases.

The Problem

Many professional VR applications need external microphone input (for speech recognition, recording, or communication) while maintaining audio output through the built-in speakers. Examples include therapeutic VR, education, accessibility, content creation, live streaming, and enterprise training. We purchased a USB-C gooseneck microphone that is input-only (no speaker, no DAC, isSink=false). On standard Android devices, connecting this mic only affects audio input — speakers continue working. On Quest 3, the built-in speakers are immediately muted, even though the USB device has zero output capability.

What We've Tried (Everything Fails)

1. usb_audio_automatic_routing_disabled=1 (ADB): Does not selectively disable routing — it prevents the USB device from registering with AudioService entirely (UsbAlsaManager.selectAlsaDevice() returns early), so setPreferredDevice() cannot find the mic at all.
2. AudioManager.setCommunicationDevice(builtInSpeaker) (API 31+): Only affects USAGE_VOICE_COMMUNICATION streams, not media/game audio. Unity uses FMOD → AAudio (native C layer), which routes through USAGE_GAME — unaffected.
3. AudioTrack.setPreferredDevice(builtInSpeaker): Would require intercepting the engine's internal audio output at the native layer — not feasible, and Quest 3's audio HAL may override it anyway.
4. "External Microphone" toggle (Settings > Advanced, v64+): Enables USB mic recognition only. Does NOT provide split input/output routing.
5. Input-only USB mic (isSource=true, isSink=false): Expected AOSP-compliant behavior (only input rerouted). Built-in speakers are still muted.

Root Cause Analysis — AOSP vs. Meta Horizon OS

In upstream AOSP, UsbAlsaManager.java checks actual device capabilities via USB Audio Class descriptors:

    // AOSP: frameworks/base/services/usb/java/com/android/server/usb/UsbAlsaManager.java
    private void selectAlsaDevice(UsbAlsaDevice alsaDevice) {
        UsbDescriptorParser parser = alsaDevice.getParser();
        if (parser.hasOutput()) {
            // Only register OUTPUT if USB device has playback capability
            alsaDevice.startOutput();
        }
        if (parser.hasInput()) {
            // Only register INPUT if USB device has capture capability
            alsaDevice.startInput();
        }
    }

The AOSP AudioPolicyManager then only reroutes streams matching registered capabilities. An input-only device never triggers checkOutputsForDevice(), so speakers remain active.

Meta's Horizon OS overrides this separation. The most likely cause:

    // Probable Meta override (simplified):
    void AudioPolicyManager::onNewUsbDevice(audio_devices_t device) {
        // Does not check if device has output capability
        setDeviceConnectionState(AUDIO_DEVICE_OUT_SPEAKER,
                                 AUDIO_POLICY_DEVICE_STATE_UNAVAILABLE);
        setDeviceConnectionState(AUDIO_DEVICE_OUT_USB_DEVICE,
                                 AUDIO_POLICY_DEVICE_STATE_AVAILABLE);
    }

Proposed Fix

Check the USB device's Audio Class descriptors before modifying output routing:

    void AudioPolicyManager::onNewUsbDevice(const sp<UsbAlsaDevice>& device) {
        if (device->hasCapture()) {
            setDeviceConnectionState(AUDIO_DEVICE_IN_USB_DEVICE,
                                     AUDIO_POLICY_DEVICE_STATE_AVAILABLE);
        }
        if (device->hasPlayback()) {
            // Route output to USB ONLY if device has output capability
            setDeviceConnectionState(AUDIO_DEVICE_OUT_USB_DEVICE,
                                     AUDIO_POLICY_DEVICE_STATE_AVAILABLE);
        }
        // If !hasPlayback(): leave built-in speaker routing UNCHANGED
    }

This is a single conditional check in the audio policy layer. It requires zero UI changes and simply aligns Quest 3 with upstream AOSP behavior.
Additional Solutions (If a Broader Fix Is Planned)

- User-facing setting: Add "Audio Output" under Settings > Sound with options "Headset Speakers" / "USB-C" / "Automatic", independent of input routing.
- Developer API: Allow applications to call setPreferredDevice() with routing respected for all audio usages (not just USAGE_VOICE_COMMUNICATION).

Impact

The Quest 3 hardware is fully capable — speakers and USB operate on independent audio paths. This is purely a software routing policy that could be resolved with a minimal code change. The fix would unblock every developer building applications that need external audio input while maintaining speaker output. Happy to provide dumpsys audio output or USB device descriptors to help diagnose the exact policy override.

Related: A similar request was posted on the archived forum: "Request for Enhanced Audio Routing Controls with External Microphones on Meta Quest" (jrb-vr, October 2024) — which received no response.


Hands only not working on Unity
On Unity 6, with v85 of the Meta SDK, I set "Hand Tracking Support" to "Hands Only" mode, so controllers aren't supposed to be available in the build. However, controllers are still available in my app. I then tried "Controllers Only" mode, and that one works as expected: it correctly disables hand tracking in my app, and a window prompting me to use controllers shows up when I launch the app. Why doesn't "Hands Only" mode do the same for controllers?


How to Build Player Retention Systems for Social VR games | Fast Essentials
Players who finish your tutorial still need a reason to come back, and Start Mentor Tevfik has a framework for exactly that. In this session he breaks down the Retention Triangle: how quests guide players through your game, how badges build visible identity, and how free Meta Platform SDK leaderboards keep your world feeling alive.

💡 By watching this video, you will learn:
- How to design quests that double as guided tutorials
- How cosmetic badges and “OG” status markers give players a visible identity
- How to implement a working leaderboard using the free Meta Platform SDK
- How daily resets and short-term challenges support long-term retention

This session was recorded live in January 2026 as part of the Meta Horizon Start program.

🎬 CHAPTERS
👋 INTRODUCTION
🕒 00:00: Welcome and intro to the Retention Triangle
📐 DESIGNING QUESTS FOR RETENTION
🕒 02:21: Structuring Quests and the Challenge System Architecture
🏅 IDENTITY SYSTEMS
🕒 05:13: Badges and Leaderboards

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development. Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
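As a small illustration of the daily-reset idea mentioned in the session description (an editor's hedged sketch, not code from the session): the core of a daily reset is just comparing UTC dates, so the check can be engine-agnostic.

```csharp
// Hedged sketch: a minimal daily-reset helper. Names are illustrative.
using System;

public static class DailyReset
{
    // Seconds remaining until the next UTC-midnight reset.
    public static double SecondsUntilReset(DateTime nowUtc)
    {
        DateTime nextMidnight = nowUtc.Date.AddDays(1);
        return (nextMidnight - nowUtc).TotalSeconds;
    }

    // True if a new reset has occurred since the player's last completion.
    public static bool HasResetSince(DateTime lastCompletionUtc, DateTime nowUtc)
    {
        return nowUtc.Date > lastCompletionUtc.Date;
    }
}
```

A game would call HasResetSince with the stored timestamp of the player's last daily-challenge completion to decide whether to offer the challenge again.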
Build a VR Economy with Leaderboards & In-Game Currency | Fast Essentials
If VR leaderboards give your players a reason to compete, then it’s a robust in-game economy that compels them to stay. In this Start Mentor workshop, Tevfik gives you the full rundown on both systems using a live Unity project. On one side you get Meta Platform SDK leaderboards; on the other, a PlayFab backend that manages virtual currency and in-app purchases across devices.

💡 By watching this video, you will learn:
- The Meta Platform SDK includes a free leaderboard system that links directly to player profiles and tracks metrics like session joins or high scores.
- Microsoft PlayFab serves as a cloud backend for managing virtual currency and player inventory across devices.
- Entitlement checks verify app ownership and protect player data so that progress follows them to any headset.
- Meta's in-app purchase flow handles the real-money transaction while your backend manages the actual currency grant.

This session was recorded live in January 2026 as part of the Meta Horizon Start program.

🎬 CHAPTERS
👋 INTRODUCTION
🕒 00:00: Introduction to Social Systems in Baby VR
📊 LEADERBOARD SETUP
🕒 01:51: Visualizing Leaderboards and Currency in Unity
🕒 03:55: Setting Up Leaderboards in the Meta Dashboard
🕒 06:12: Coding the Leaderboard Logic
💰 ECONOMY AND PERSISTENCE
🕒 08:10: Managing Game Economy with PlayFab
🕒 09:58: Entitlements and Cross-Device Persistence
🕒 12:08: Implementing In-App Purchases

📚 RESOURCES
➡️ Developers Blog: https://developers.meta.com/resources/blog/
➡️ Meta Quest Developer Hub: https://developers.meta.com/horizon/documentation/unity/ts-mqdh/

🔗 CONNECT WITH US
➡️ Sign up to get the latest news from Meta Horizon: https://developers.meta.com/horizon/newsletter

💡 LEARN ABOUT THE META HORIZON START PROGRAM
The Meta Horizon Start program provides intermediate and advanced developers with the resources, hands-on support, and expert guidance needed to accelerate their app development.
Join a thriving community to get the tools and go-to-market guidance you need to successfully deploy and grow your app on Meta Horizon OS. Apply to Start today: https://developers.meta.com/horizon/discover/programs/start
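For reference, the leaderboard write and entitlement check covered in the session look roughly like the following with the Meta (Oculus) Platform SDK for Unity. This is a hedged sketch: the leaderboard API name "HighScore" and the component name are illustrative, and the leaderboard must first be defined in the Meta developer dashboard.

```csharp
// Hedged sketch using the Meta (Oculus) Platform SDK for Unity.
// "HighScore" is an illustrative leaderboard API name, not from the session.
using Oculus.Platform;
using UnityEngine;

public class PlatformSocial : MonoBehaviour
{
    void Start()
    {
        Core.AsyncInitialize().OnComplete(msg =>
        {
            if (msg.IsError)
            {
                Debug.LogError(msg.GetError().Message);
                return;
            }

            // Entitlement check: verify this user actually owns the app,
            // so progress and purchases are tied to a verified account.
            Entitlements.IsUserEntitledToApplication().OnComplete(ent =>
            {
                if (ent.IsError) UnityEngine.Application.Quit();
            });
        });
    }

    // Writes an entry to the dashboard-defined leaderboard for the current user.
    public void SubmitScore(long score)
    {
        Leaderboards.WriteEntry("HighScore", score);
    }
}
```

The real-money purchase flow itself would go through Meta's IAP APIs, with the backend (PlayFab in the session) granting the virtual currency after the transaction succeeds.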
I Almost Overdesigned My VR Game to Death
There’s a phase in game development that nobody really warns you about. It’s not the “I can’t code this” phase. It’s not the “I ran out of money” phase. It’s not even the “no one is playing my game” phase. It’s when your own ideas start overwhelming the game. That’s where I found myself recently. I have a Social VR game currently live on the Meta Horizon Store. And this is my story about how design — not bugs — became my biggest struggle.

The Dangerous Kind of Productivity

After publishing my game (3 months ago), the early months were manageable. There were bugs to fix. Core features to improve. Community expectations were still forming. But as time passed, growth slowed. And I felt stuck. Not because I had no ideas. Because I had too many good ones.

- New abilities
- Leveling systems
- Advanced control modes
- More immersive camera options
- Dynamic AI creatures
- Lore layers
- Progression trees
- World events

Each one is exciting. Each one is defensible. Each one “adds depth.” And each one makes the game heavier. From the outside, it looked like progress. From the inside, it felt like friction.

VR Makes It Worse

In VR, every feature multiplies complexity. A new ability isn’t just a new mechanic — it affects comfort, cognitive load, UI clarity, and social balance. A new camera mode isn’t just visual — it changes perception and can introduce motion discomfort. A new progression system isn’t just numbers — it affects motivation, fairness, and retention. Everything touches everything. And when you stack systems without tightening the core, the experience starts to blur.

The Subtle Identity Drift

The scariest question I had to ask myself was: What is this game actually about? Is it skill-based? Is it social? Is it progression-driven? Is it a sandbox? Is it competitive? Is it experimental? When you add features faster than you refine your foundation, your game slowly loses its center. Not dramatically. Just enough that every new decision becomes harder.
That uncertainty is exhausting.

The Ambition Trap

Overdesign often comes from passion. You care. You want your game to stand out. You want depth. Growth. Surprise. So you build. And build. And build. Until one day you realize you’ve created something impressive… but unclear.

Complexity Feels Like Depth — But It Isn’t

This was the lesson I had to learn: Depth comes from mastery of a strong core. Complexity comes from stacking. They are not the same thing. A single mechanic refined to excellence will carry a game further than five half-polished systems competing for attention. Especially in VR, where clarity of experience is everything.

The Turning Point

My shift wasn’t about cutting ideas. It was about asking a harder question before adding anything new: Does this strengthen the core loop? Not: “Is this cool?” Not: “Will players like this feature?” Not: “Will this make the game deeper?” But: Does this make the core experience clearer and stronger? If the answer wasn’t obvious, it didn’t belong — at least not yet.

The Real Struggle Isn’t Technical

Most developers think the hard part is engineering. In my experience, the real struggle is restraint. It’s saying no to good ideas. It’s choosing focus over ambition. It’s realizing that sometimes your game doesn’t need more mechanics. It needs a sharper identity.

The Second Mistake: Retention

Here’s something even harder to admit. After refocusing the core, I made another mistake. I didn’t give players a strong enough reason to come back. Clarity alone is not enough. Players need:

- progression
- competition
- meaningful goals
- something to improve at

A strong core gets them in. Retention systems keep them returning. Balancing simplicity and long-term motivation is the real design challenge.

If You’re Feeling Overwhelmed

If your project feels heavier every week… If every feature you add creates two new design problems… If you keep “improving” the game but feel further from clarity… You’re not alone. You’re not bad at design.
You might just be overdesigning. And that’s usually a sign you care.

What I’m Learning

Simplicity is not a lack of ambition. It’s disciplined ambition. I almost overdesigned my game to death. Now I’m learning that the strongest games aren’t built by stacking ideas. They’re built by protecting the core — and then carefully layering systems that support it. That lesson might be the most valuable part of this entire journey.

If you’re building something in VR right now: What are you struggling with the most? Clarity? Retention? Scope? Motivation? Let’s talk.


Options for PVP Multiplayer Hosting
I have been building PVP games in Horizon Worlds for the last year, and now that VR in Horizon Worlds is going away, I am looking at doing a native PVP VR game instead. I have used both Unreal Engine and Unity, and was leaning toward Unity because I feel I can program quicker in it. I have some cloud experience using AWS, mostly hosting websites. For hosting or implementing multiplayer, though, I am weighing options and looking for others' experiences. I am considering two main solutions: AWS (my go-to for cloud tasks, but one I use mostly for hosting websites) and Photon (which sounds too good to be true: it's free and you can have 100 concurrent users). AWS will likely be a pain to set up and seems like it will cost more (which makes no sense to me; it's usually the cheapest for most tasks). Photon seems super cheap and easy to implement (it sounds too good; I feel like there is something I am missing). Has anyone tried both methods, and how do they compare? Or are there better solutions?


Meta XR Simulator Broken/Version Mismatch (81 vs 85)
I haven't been able to get the Meta XR Simulator to work inside Unity3D. The latest version of the Meta XR SDK is 85.0.0, but the latest Meta XR Simulator is 81.0.0. If I use the wizard to update the package, it gives an error trying to download it. If I try to update it manually, the Package Manager shows a latest version of 81.0.0. And when I try to activate the simulator, it says it's not found. If I downgrade everything to 81.0.0, the issue persists. It seems I am not the only person having this issue, as the reviews show a lot of people with the same complaint: https://assetstore.unity.com/packages/tools/integration/meta-xr-simulator-266732#reviews Any suggestions on how to get the XR Simulator working? Or is it just broken for everyone?


Failed to initialize Insight Passthrough
For a few days, whenever I connect Link to the Unity editor and press Play, it shows this error:

    Failed to initialize Insight Passthrough. Please ensure that all prerequisites for running Passthrough over Link are met: https://developer.oculus.com/documentation/unity/unity-passthrough-gs/#prerequisites-1. Passthrough will be unavailable. Error Failure_NotInitialized.
    UnityEngine.Debug:LogError (object)
    OVRManager:InitializeInsightPassthrough () (at ./Library/PackageCache/com.meta.xr.sdk.core@e6e7a2c46b82/Scripts/OVRManager.cs:3667)
    OVRManager:InitOVRManager () (at ./Library/PackageCache/com.meta.xr.sdk.core@e6e7a2c46b82/Scripts/OVRManager.cs:2390)
    OVRManager:Awake () (at ./Library/PackageCache/com.meta.xr.sdk.core@e6e7a2c46b82/Scripts/OVRManager.cs:2524)

It just started happening. I don't use passthrough, and in OVRManager everything passthrough-related is disabled. If I disconnect the Link cable, the error disappears. The problem is that whenever I press Play, Unity starts in pause mode. If the error is harmless, I can disable the Error Pause option, but that's not a good idea for the long term; there may be other errors in the future which I don't want to miss. The Meta SDK version is 83.0.3. Thank you.