04-18-2025 05:56 PM
When using the Passthrough Camera API, you can access a feed from the headset's cameras via WebCamTexture or, alternatively, the Android camera API. However, I want GameObjects rendered in this view, so that it looks like the streaming view or the ADB view. If I place a default Unity Cube in front of me, I want to be able to see it through the Passthrough Camera API feed. Right now, I cannot see it when using WebCamTexture. Is this supported? Are there any hacks to rasterize that GameObject into the feed? Or is there any feature that would let me access, from Unity, the same feed that is used for streaming or recording from the headset?
04-25-2025 01:46 PM
Hello! From what I know about the Passthrough Camera API, it can only provide the raw camera feed from the Quest's RGB passthrough cameras. You could use the Android MediaProjection API to capture what is being drawn on the Quest's displays, similar to the video feed you get from casting. There is a GitHub project that demos this feature: hsratneshsci/Quest-Camera-access
The disadvantage of this method is that everything drawn on the display will be visible in the feed, and there's no way to exclude specific objects from it. If you're using the Passthrough Camera API, maybe you could use a custom shader to composite your GameObjects onto the WebCamTexture? You could also interface with the native Camera2 API from Unity to get the feeds of both RGB cameras (which is not possible with Unity's WebCamTexture) and use them to estimate stereo vision. I have also developed an open source plugin on GitHub that allows you to do so, so maybe it can help you out 😄
Uralstech/UXR.QuestCamera
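To illustrate the shader-compositing idea, here is a minimal sketch of one way it could work in Unity: render your GameObjects with a secondary camera into a RenderTexture with a transparent background, then blend that overlay on top of the passthrough WebCamTexture. All names here (the fields, the blend shader choice) are illustrative assumptions, not part of any official Meta API, and a real implementation would also need to match the overlay camera's pose and projection to the physical RGB camera's pose and intrinsics for the objects to line up with the real world.

```csharp
// Hedged sketch, not a tested implementation: composite Unity-rendered
// GameObjects over a passthrough WebCamTexture feed.
using UnityEngine;

public class PassthroughComposite : MonoBehaviour
{
    public Camera overlayCamera;          // renders only your GameObjects (e.g. via a layer mask)
    public WebCamTexture passthroughFeed; // the raw feed from the Passthrough Camera API
    public RenderTexture composited;      // final combined image you can display or encode

    RenderTexture _overlay;
    Material _blend; // material whose shader alpha-blends the overlay onto the feed

    void Start()
    {
        // Overlay target matches the feed's resolution; cleared to transparent
        // so only the rendered GameObjects are opaque.
        _overlay = new RenderTexture(passthroughFeed.width, passthroughFeed.height, 24);
        overlayCamera.targetTexture = _overlay;
        overlayCamera.clearFlags = CameraClearFlags.SolidColor;
        overlayCamera.backgroundColor = Color.clear;

        // Placeholder: any shader with SrcAlpha/OneMinusSrcAlpha blending works here.
        _blend = new Material(Shader.Find("Unlit/Transparent"));
    }

    void LateUpdate()
    {
        // Draw the camera feed first, then blend the rendered objects on top.
        Graphics.Blit(passthroughFeed, composited);
        Graphics.Blit(_overlay, composited, _blend);
    }
}
```

Note that for correct registration you'd ideally drive `overlayCamera`'s transform and projection matrix from the camera pose and intrinsics that the Passthrough Camera API exposes, rather than using the head-tracked camera directly.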