Forum Discussion

This forum is archived and read-only.
bdeschryver
Explorer
12 years ago

Rift on screen 1, normal camera on screen 2

Hey guys,

I'd like to describe a setup I have in mind and get your advice on it...

I would like to know if it is possible to have a Unity scene with an OVRPlayerController that outputs its image to the Rift (set as the main screen) AND a normal Unity camera in that same scene that outputs to a second screen (or window) shown on the PC monitor.
This way, one user can wear the Rift and enjoy a nice VR time, while another person sees on the PC screen what the first one sees or does, but as a 'normal' image (without the distortion or image processing specific to the Rift).

It is similar to a setup with duplicated displays or an HDMI splitter, but with a flat 2D image on the PC screen rather than a distorted one identical to what is sent to the Rift...

Does Unity allow a multi-window setup with one full screen for the Rift and another window for monitoring, using different cameras?
If not, maybe in a multi-user environment where one user experiences the Rift on machine A and another monitors on machine B? But that is a heavy setup ;-)

THANKS !

6 Replies

Replies have been turned off for this discussion
  • drash
    Heroic Explorer
    In this post, there's a link to a video where they describe how they set up the EVE Valkyrie monitor at CES -- they used a scaler box.

    Others have suggested linking two computers via network code, which would then give you more flexibility in what kind of a viewpoint is actually shown on the normal monitor.
  • You could technically do this by using the System.Windows.Forms namespace in a custom script (to detect the monitor setup), and then output the cameras to the individual displays. However, the game would be rendering twice, so in theory you would get roughly half the FPS, maybe a bit more (?).

    As far as code goes, I have no idea; I just know the theory behind it. But it's a starting point, right?

    I think a better solution would be to have a separate program running in the background that applies a "reverse Rift distortion" and sends that to the monitor. It could work, probably.
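    On the double-rendering cost: one way to soften it is to render the monitoring camera only every few frames rather than every frame. A minimal Unity-style sketch of that idea (the `spectatorCam` reference and the frame interval are illustrative assumptions, not from this thread):

    ```csharp
    using UnityEngine;

    // Sketch: render the spectator camera only every Nth frame, so the
    // second view costs a fraction of a full per-frame render.
    public class SpectatorRenderer : MonoBehaviour
    {
        public Camera spectatorCam;   // the flat "monitoring" camera (illustrative name)
        public int everyNthFrame = 3; // render 1 out of every 3 frames

        void Start()
        {
            // Turn off automatic rendering; we trigger it manually below.
            spectatorCam.enabled = false;
        }

        void LateUpdate()
        {
            if (Time.frameCount % everyNthFrame == 0)
                spectatorCam.Render(); // manual render into its viewport or target texture
        }
    }
    ```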
  • Thank you for your answers.
    I also believe that using a multiplayer networked system is the best option, for the following reasons:
    - it does not require double rendering, so less load on the main machine (in fact you have 2 PCs)
    - you can have a separate camera, showing a totally different image from the main Rift display
    - if PC B (for monitoring) crashes, PC A still works
    - if PC B is not used, PC A works
    - you can have an unlimited number of normal monitors or Rifts (each one having its own PC)

    The only "problem" is that it requires more PCs and some setup (software setup for multiplayer + hardware).

    I'll still give multi-window rendering a try to see what the FPS is. On a good workstation, it might be OK to provide the Rift display at 1280x800 plus a monitoring display at a lower resolution like 1024x768. It is just for monitoring...
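    As a sketch of the low-resolution monitoring idea, the second camera could render into a small RenderTexture instead of a full-size buffer (the `monitorCam` field and the resolution are illustrative assumptions):

    ```csharp
    using UnityEngine;

    // Sketch: give the monitoring camera a low-resolution render target
    // so the second view is cheaper than a full-resolution render.
    public class LowResMonitor : MonoBehaviour
    {
        public Camera monitorCam; // the flat 2D monitoring camera (illustrative name)

        void Start()
        {
            // 1024x768 color buffer with a 24-bit depth buffer.
            var rt = new RenderTexture(1024, 768, 24);
            monitorCam.targetTexture = rt;
            // The texture can then be drawn into the second window or display.
        }
    }
    ```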
  • Yeah, this is totally something we are investigating. I don't think there is an easy way to do this in Unity right now.
  • owenwp
    Expert Protege
    It's possible with a native plugin that grabs the frame buffer and copies it to a secondary window, at least on Windows. Avoiding pipeline stalls that increase latency would make the implementation finicky, though.

    Spectator clients are the best way, though, if you want high quality for recording or demos.