Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
dodderz
Honored Guest
11 years ago

RenderTexture question

In Unity 5.1 I'm attempting to render an object to a render texture and then overlay that object onto my main camera using ScreenOverlay (modified to accept a render texture rather than just a texture), so that the two cameras are composited. This appears to work fine in the 2D Game view in the editor, but in the Rift it's just plain weird, and I can't quite work out what's going on. To get the effect, I've attached my secondary camera as a child of the main camera, hooked it up to a render texture, and then added a ScreenOverlay to my main camera that takes the render texture as an input.
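Roughly, the setup looks like this (a minimal sketch against the Unity 5.1-era API; `RenderTextureOverlay` stands in for my modified ScreenOverlay effect, so its name and fields are illustrative, not a real Unity class):

```csharp
using UnityEngine;

public class OverlayCompositor : MonoBehaviour
{
    public Camera secondaryCamera;        // child of the main camera
    public RenderTextureOverlay overlay;  // modified ScreenOverlay on the main camera (hypothetical class)

    RenderTexture rt;

    void Start()
    {
        // Off-screen target the secondary camera renders into.
        rt = new RenderTexture(Screen.width, Screen.height, 24);
        secondaryCamera.targetTexture = rt;

        // Clear to transparent so only the rendered object shows in the overlay.
        secondaryCamera.clearFlags = CameraClearFlags.SolidColor;
        secondaryCamera.backgroundColor = Color.clear;

        // The main camera's overlay effect samples this texture.
        overlay.texture = rt;
    }

    void OnDestroy()
    {
        if (rt != null) rt.Release();
    }
}
```

The fade I'm after would then just be animating the overlay's intensity/alpha each frame.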

Since Unity splits the main camera into two, I think the same render texture is being rendered into both of my eye cameras, rather than the process being duplicated for the left and right cameras. Effectively it's a 2D image overlaid on both cameras rather than a stereoscopic one.

Is there a way to achieve what I want and composite two 3D cameras together? The intention is to be able to fade the contents of the secondary camera in and out.

1 Reply

Replies have been turned off for this discussion
  • As far as I know, it's not yet possible to composite stereo images using the Unity 5.1 VR support.