Forum Discussion

Marbas
Honored Guest
13 years ago

Mirroring the game output.

Has anyone come up with a creative solution yet for how to output a regular rectilinear game render to the monitor while mirroring the same (distortion-corrected) output to the Rift? For now everyone seems happy with the distorted image mirrored on the monitor, but in the long run this isn't ideal. It's not good for presentations/showcasing, and it's not good for reading text when you need to. I often see Cymatic Bruce taking off the HMD to be able to read something on the monitor when needed. Not that it helps much, since it's still mirrored and distorted/low-res.

So, can it be done somehow?

[Image: rift_output.jpg]

9 Replies

  • spyro
    Expert Protege
    Second this! A spectator mode would be great, otherwise VR will become a very isolating experience with no possibility to play together with friends.
  • Marbas
    Honored Guest
    There is one way... far from optimal, but it should do the trick if presenting some Rift game to larger audiences waiting hours in line :mrgreen:

    It involves using 2 PCs and, of course, your own game project. I'm not so sure you could make this work for existing games, though.

    You could run your game on each computer and have some code handle a basic network connection (just to transmit your camera position/rotation) from the Rift PC to the 'public' machine hooked up to a monitor, TV, or whatever.

    Ups: It works! Freedom of resolution choice, no performance cost. Fast enough.

    Downs: 2 PCs for 1 Rift system is 1 PC too many.

    May be worthwhile for presentations, but not much more than that, I'm afraid.
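The camera-sync idea above can be sketched quite cheaply. This is a hypothetical minimal version, assuming both machines run the same build of the game; the packet format (position plus an orientation quaternion) and the function names are illustrative, not from any real engine, and it uses only Python's standard-library socket and struct modules:

```python
import socket
import struct

# Hypothetical wire format: 3 floats for position + 4 floats for an
# orientation quaternion, little-endian float32 -- 28 bytes per packet.
CAMERA_FORMAT = "<3f4f"

def pack_camera(pos, quat):
    """Serialise camera position (x, y, z) and rotation (x, y, z, w)."""
    return struct.pack(CAMERA_FORMAT, *pos, *quat)

def unpack_camera(data):
    """Deserialise a packet back into (pos, quat) tuples."""
    values = struct.unpack(CAMERA_FORMAT, data)
    return values[:3], values[3:]

def send_camera(sock, addr, pos, quat):
    # UDP is a reasonable fit here: a lost packet just means one stale
    # frame on the spectator machine, and the next packet corrects it.
    sock.sendto(pack_camera(pos, quat), addr)
```

On the Rift PC you'd call send_camera once per frame; the 'public' machine would recvfrom on a matching UDP socket and overwrite its own camera transform before rendering.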
  • Well, if you've written the engine yourself, you should note the following:

    The distortion shader is a post-process run on the final rendered image.

    And, to render side-by-side stereoscopic images, you need to render the scene twice.

    So... if you have enough ability to mess with the internals of your engine (i.e., you wrote it yourself), you can take one of the rendered images (it doesn't really matter which; left eye or right would both work), show that on the monitor, and then combine the two images and run the distortion shader for output to the Rift headset.

    The downside to this is that you'd need your engine to have proper multi-monitor awareness. (Which can be done, but usually isn't; things like Eyefinity don't count, because the graphics driver makes the game think that it's dealing with a single display.)

    The other downside is that a single image rendered for one eye of the rift would have a very odd resolution and aspect ratio.

    (640 by 800 at an 8:10 aspect ratio...)

    Of course, if presentation truly mattered, you could use a third camera, and render an entirely separate view for this external display.
    (But that would have more of a performance impact - If you plan it out properly, it's still not as much as rendering a 3rd view from a completely unrelated angle, but certainly not as fast as re-using an image you're rendering anyway.)

    It's more than possible to create a game engine that supports multiple monitors properly, and can render multiple views to each one...
    It's just not something that many people have ever done.

    In fact, off the top of my head, even though I've seen the example code for it in the DirectX SDK since at least 2007, the only game I can think of that can properly recognise the existence of multiple monitors, and use more than one, is Supreme Commander 2.
    And all it does with that is render the map onto the second display.

    This is all an exercise in game engine coding though.

    "Marbas" wrote:
    There is one way... far from optimal, but it should do the trick if presenting some Rift game to larger audiences waiting hours in line :mrgreen:

    It involves using 2 PCs and, of course, your own game project. I'm not so sure you could make this work for existing games, though.

    You could run your game on each computer and have some code handle a basic network connection (just to transmit your camera position/rotation) from the Rift PC to the 'public' machine hooked up to a monitor, TV, or whatever.

    Ups: It works! Freedom of resolution choice, no performance cost. Fast enough.

    Downs: 2 PCs for 1 Rift system is 1 PC too many.

    May be worthwhile for presentations, but not much more than that, I'm afraid.


    :D - Unless you have network code in the game already (which, to be fair, a lot of games do), that's definitely not an optimal solution.
    Unfortunately, the 'correct' solution, while only requiring a single PC, takes a bit of work to code, and requires that you can make low-level changes to the game engine involved...
    (Particularly the way it detects graphics hardware, and how it configures its rendering systems.)

    Anyway... building an engine from the ground up to do this wouldn't be hard (well, no harder than writing a game engine is regardless), but modifying an existing engine to do it could range from trivial to a complete nightmare, depending on the engine.

    I guess it's something to think about for anyone trying to make VR optimised game engines though...
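The one-eye-mirror pipeline described above hinges on the distortion being a post-process over the rendered image. As a rough illustration, here is the radial ("barrel") distortion step written as a pure CPU-side function; the polynomial form is the standard one a per-pixel distortion shader would evaluate, but the coefficient values below are placeholders for illustration, not taken from any SDK:

```python
def distort(u, v, k=(1.0, 0.22, 0.24, 0.0)):
    """Radially distort a point (u, v) given in coordinates centred on
    the lens axis:  r' = r * (k0 + k1*r^2 + k2*r^4 + k3*r^6).

    The k coefficients here are illustrative placeholders.
    """
    r2 = u * u + v * v
    scale = k[0] + r2 * (k[1] + r2 * (k[2] + r2 * k[3]))
    return u * scale, v * scale
```

In a real shader this runs per fragment over the half-screen quad for each eye (sampling the eye's rendered texture at the warped coordinate), which is exactly why the undistorted per-eye image exists as an intermediate you could mirror to the monitor.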
  • KuraIthys has basically covered it. But I'll add a note: a higher-resolution backbuffer will probably be desirable for quality anyway (if the framerate can be sustained) -- a low-res distorted resampling of a low-res source is part of why we see such bad pixelation and unreadable text. A higher-res source, and sharper-than-bilinear sampling, both help.

    Anyway, my point is that we might have a higher resolution backbuffer to present, and not need to render a third view for a monitor.

    However, there are some advantages to a third render: a lower FOV suitable for monitors, and possibly a different perspective -- depending on whether you imagine the use being demoing the Rift to outside observers, or making a game pleasant to watch for an audience (a third-person perspective might be useful for this -- Alone in the Dark style, or even a chase cam).
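For the third-render idea, the main per-view change besides camera placement is the projection. A small sketch of building a standard perspective matrix from a vertical FOV, so a spectator view could use a monitor-friendly 60 degrees while the headset view stays wide (the specific FOV and clip-plane numbers below are illustrative assumptions):

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix,
    returned as a row-major 4x4 nested list."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), (2.0 * far * near) / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# A narrow spectator view for a 16:9 monitor vs. a wide per-eye view
# at the Rift's 640x800 (8:10) per-eye aspect:
monitor_proj = perspective(60.0, 16 / 9, 0.1, 1000.0)
headset_proj = perspective(110.0, 640 / 800, 0.1, 1000.0)
```

Everything else in the frame (scene traversal, draw calls) is shared, which is why a planned-out third view costs less than rendering from a completely unrelated angle but still more than reusing an existing eye image.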
  • Marbas
    Honored Guest
    Not an engine coder at all. Not sure if Unity supports running the game in a side-by-side monitor setup; otherwise it may be possible to work something out with render-to-texture, or by just copying one of the already existing renders to the "other side". Anyway, if not in Unity, then maybe in some other engine with Rift support that also works well with side-by-side monitor setups. I don't think you're stuck with low-level engine coding for such a display scheme to work. It would definitely be a great feature for Rift drivers like Vireo to have.
  • Marbas
    Honored Guest
    You could also have 2 exes running in sync using a network connection on the same PC, with the Rift & monitor in extended desktop mode. At least for the sake of a presentation, it works. As for doing this the proper way, I agree that a low-level implementation in the render pipeline would be the best solution.
  • owenwp
    Expert Protege
    My plan is to have the game show a window on the main monitor that can act as a persistent menu and viewport, somewhat analogous to the sort of display commonly seen on the bottom screen for Nintendo DS games. I will probably put a mirrored game viewport in a small portion of that window, taken from one eye and cropped for a nicer aspect ratio.

    No reason to go fullscreen here, because you get your immersion inside the Rift. And it will be useful to be able to lift up the headset and check Skype or whatever without minimizing. It will also probably be more comfortable for a lot of people to do things like change settings and browse server lists outside of the Rift, then put it on when you are in the action. Since most users will have a traditional monitor handy, we may as well use it for the point-and-click interfaces that it excels at.
  • "atavener" wrote:
    KuraIthys has basically covered it. But I'll add a note: a higher-resolution backbuffer will probably be desirable for quality anyway (if the framerate can be sustained) -- a low-res distorted resampling of a low-res source is part of why we see such bad pixelation and unreadable text. A higher-res source, and sharper-than-bilinear sampling, both help.

    Anyway, my point is that we might have a higher resolution backbuffer to present, and not need to render a third view for a monitor.

    However, there are some advantages to a third render: a lower FOV suitable for monitors, and possibly a different perspective -- depending on whether you imagine the use being demoing the Rift to outside observers, or making a game pleasant to watch for an audience (a third-person perspective might be useful for this -- Alone in the Dark style, or even a chase cam).


    The issue with it wasn't all to do with resolution, though. The main problem is actually the aspect ratio: unless you plan on showing the view from both eyes, you now have a situation where you're trying to display an image with an 8:10 ratio on a display that is probably 16:9, 16:10, or 4:3.

    That's either going to leave much of the screen blank, or result in some really extreme stretching of the image.
    Neither of which is really something you want to do...

    "Marbas" wrote:
    Not an engine coder at all. Not sure if Unity supports running the game in a side-by-side monitor setup; otherwise it may be possible to work something out with render-to-texture, or by just copying one of the already existing renders to the "other side". Anyway, if not in Unity, then maybe in some other engine with Rift support that also works well with side-by-side monitor setups. I don't think you're stuck with low-level engine coding for such a display scheme to work. It would definitely be a great feature for Rift drivers like Vireo to have.


    The problem is, most game engines won't render to multiple physical displays, or even multiple windows!
    Of course, that's not to say there wouldn't be any.
    Unity in particular may have implemented functionality for that, simply because it supports Wii U development (amongst other things), where two screens are the norm rather than something unusual.

    If the engine natively supports rendering to multiple displays and/or windows, then you can probably do it without huge low-level changes.
    If not, though, you'll have a really hard time with it, because any final image rendered wouldn't be something you can show on anything other than the 'main' display (which would be the Rift). Mirroring the Rift output wouldn't help either, because that's done at an even lower level, which the engine has no control over...
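On the aspect-ratio problem discussed above, there is a third option besides letterboxing and stretching: centre-cropping the 8:10 eye image to the monitor's ratio, as owenwp suggests. The geometry is cheap to compute; a small sketch (the function name and rounding choice are illustrative):

```python
def centre_crop(src_w, src_h, target_aspect):
    """Return (x, y, w, h) of the largest centred rectangle within a
    src_w x src_h image that matches target_aspect (width / height)."""
    src_aspect = src_w / src_h
    if src_aspect < target_aspect:
        # Source is too tall for the target: keep full width,
        # crop rows off the top and bottom.
        w = src_w
        h = round(src_w / target_aspect)
    else:
        # Source is too wide: keep full height, crop the sides.
        h = src_h
        w = round(src_h * target_aspect)
    return ((src_w - w) // 2, (src_h - h) // 2, w, h)

# One 640x800 (8:10) eye image cropped for a 16:9 monitor:
crop = centre_crop(640, 800, 16 / 9)
```

The trade-off is plain from the numbers: fitting 640x800 to 16:9 keeps only 360 of the 800 rows, so the spectator loses more than half the vertical field of view, which is part of why a dedicated third render with its own FOV remains attractive when performance allows.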