Unity WebRTC stream from iOS companion app → Quest headset connects but displays black frames
Hello everyone,
My name is Mason. I’m a graduate student at Kennesaw State University and a Research Engineer working on XR systems through my Graduate Research Assistant role.
I’m currently building a research prototype that connects a mobile companion application to a VR headset so that a VR user can view media stored on their phone inside a VR environment.
The system uses a Unity-based mobile application to stream video frames to a Unity-based VR application over WebRTC.
Environment
- Sender Device: iPhone 15
- OS: iOS 26.3
- Engine: Unity 6000.3.8f1 (Unity 6.3)
- Graphics API: Metal
- Receiver Device: Meta Quest Pro headset (Unity application)
- Streaming Technology: Unity WebRTC package
Architecture
- Mobile Unity app acts as the WebRTC sender
- Quest Unity app acts as the WebRTC receiver
- Connection established over LAN
- UDP used for discovery
- TCP used for signaling
- Video Source: Unity RenderTexture
Goal
The goal of the system is to allow a VR user to browse and view media stored on their phone inside a VR environment.
The pipeline currently works as follows:
- The mobile Unity app renders media content to a RenderTexture
- The RenderTexture is used to create a WebRTC video track
- The video track is streamed to the headset
- The Quest app receives the track and displays it on a surface inside the VR scene
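For context, the sender side of this pipeline roughly follows the com.unity.webrtc pattern below. This is an illustrative sketch, not my exact code (field and class names are placeholders); one detail worth noting is that the package's `WebRTC.Update()` coroutine must be running, since it is what pushes captured frames to the encoder each frame.

```csharp
using System.Collections;
using Unity.WebRTC;
using UnityEngine;

// Illustrative sender setup (names are placeholders, not my actual code).
public class SenderSetup : MonoBehaviour
{
    [SerializeField] RenderTexture sourceTexture; // the RenderTexture the media renders into
    RTCPeerConnection peerConnection;
    VideoStreamTrack videoTrack;

    IEnumerator Start()
    {
        // WebRTC.Update() must run continuously; without it no frames
        // reach the encoder and the receiver sees only black video.
        StartCoroutine(WebRTC.Update());

        peerConnection = new RTCPeerConnection();

        // Wrap the RenderTexture in a video track and attach it.
        videoTrack = new VideoStreamTrack(sourceTexture);
        peerConnection.AddTrack(videoTrack);

        yield break; // offer/answer exchange happens via my TCP signaling layer
    }

    void OnDestroy()
    {
        videoTrack?.Dispose();
        peerConnection?.Dispose();
    }
}
```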
Current Status
Connection setup appears to work correctly.
Observed behavior:
- Discovery between devices works
- Signaling connection succeeds
- ICE candidates exchange successfully
- PeerConnection state becomes Connected
- Video track is created and negotiated
However, the Quest application displays only black frames.
Sender (iOS) Behavior
Inside the phone application, the RenderTexture displays correctly and the scene renders normally, so the frames look correct locally. Despite this, the Quest receiver does not display them.
Receiver (Quest) Behavior
On the Quest side, the WebRTC connection establishes successfully and the video track appears active. The video texture updates, but the displayed output is completely black.
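For reference, the receiver side is wired up roughly like the sketch below (again illustrative, with placeholder names). One thing I am double-checking is that the surface material is only assigned inside `OnVideoReceived`, since `VideoStreamTrack.Texture` is not valid until the first decoded frame arrives.

```csharp
using Unity.WebRTC;
using UnityEngine;

// Illustrative receiver setup (names are placeholders, not my actual code).
public class ReceiverDisplay : MonoBehaviour
{
    [SerializeField] Renderer targetSurface; // quad/panel inside the VR scene
    RTCPeerConnection peerConnection;

    void Start()
    {
        // The Update() coroutine also drives texture updates on the receive side.
        StartCoroutine(WebRTC.Update());

        peerConnection = new RTCPeerConnection();
        peerConnection.OnTrack = e =>
        {
            if (e.Track is VideoStreamTrack videoTrack)
            {
                // The decoded texture only becomes valid once OnVideoReceived
                // fires; reading track.Texture earlier leaves the surface black.
                videoTrack.OnVideoReceived += texture =>
                {
                    targetSurface.material.mainTexture = texture;
                };
            }
        };
    }
}
```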
Expected Behavior
The frames rendered on the phone should appear in the VR scene on the Quest headset.
Actual Behavior
The WebRTC connection works, but the Quest receiver only shows black frames.
Things I Am Investigating
- Unity WebRTC compatibility with Unity 6.3
- Metal texture capture limitations on iOS
- RenderTexture pixel format compatibility
- GPU readback or synchronization issues
- Differences between desktop Unity WebRTC streaming and iOS streaming
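On the pixel-format point in particular, the com.unity.webrtc package exposes a helper that reports the RenderTexture format it expects for the current graphics API, which I plan to use to rule out a Metal-specific format mismatch. A minimal check (helper class name is my own):

```csharp
using Unity.WebRTC;
using UnityEngine;

// Creates a capture RenderTexture in the format the WebRTC package
// expects for the active graphics API (Metal on iOS, Vulkan/GLES on Quest).
public static class CaptureFormatCheck
{
    public static RenderTexture CreateCaptureTexture(int width, int height)
    {
        var format = WebRTC.GetSupportedRenderTextureFormat(SystemInfo.graphicsDeviceType);
        var rt = new RenderTexture(width, height, 0, format);
        rt.Create();
        return rt;
    }
}
```

If the RenderTexture I currently render into was created with a different format, that alone could explain black output on one platform but not the other.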
Questions
- Has anyone successfully streamed Unity RenderTextures from iOS to Quest using WebRTC?
- Are there known compatibility issues with Metal-based textures being used as WebRTC sources?
- Are there specific RenderTexture formats or texture types required for WebRTC on Quest?
- Could this behavior indicate a GPU synchronization or pixel format issue?
I can provide Unity console logs, WebRTC negotiation logs, screenshots of sender and receiver output, RenderTexture configuration, and minimal code snippets if needed.
If anyone has experience building mobile-to-Quest streaming pipelines or using WebRTC in XR applications, I would greatly appreciate any guidance.
Thank you for your time.