Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Syley
Honored Guest
10 years ago

Streaming/Relaying oculus information

Hi guys, new Oculus developer here, and I have an idea I've been tossing around. It requires using one computer as a server for the heavy processing, then streaming or relaying the output to a less powerful PC that displays it on the Rift.

I'm not quite sure if this is possible. I know one of the previous runtimes let you use the Rift as an extended monitor, and in that situation it might have been doable. The newer runtime, however, integrates the drivers more tightly and no longer allows the headset to be used as a monitor. This has performance benefits, but it also seems more abstracted: the lower-level work is hidden in the drivers and sent straight to the HMD.

Another consideration: I'm assuming that with a reliable, strong Wi-Fi connection the streaming should be fast enough, but I'm not sure. Latency spikes or drops here and there would cause stuttering. Does anyone have experience with this?

I'm not sure whether this is possible or whether my thinking is correct, so any comments or help would be much appreciated.

2 Replies

  • It should be possible to write a client that receives the streaming video and copies the frames onto the SDK-allocated textures for direct mode display. However, I think it's likely that the latency inherent in streaming video would make it completely unusable for VR.
  • Syley
    Honored Guest
    Thanks for the insight. So it looks like it may be technically feasible but impractical. There are a few things that still look shaky after further research. My basic idea was this: run the bulk of the game or app on a desktop PC or server, while the Rift is hooked up to an embedded PC like a Raspberry Pi. The content would then be streamed over local Wi-Fi to that unit. That way you could tuck the cables and the Pi into a small enclosure you could wear, and walk around hands-free.

    Now it seems Linux support has been dropped for the Rift, and from my research there don't appear to be any small portable computers that can run Windows (most use ARM architecture). In addition, the local Wi-Fi link would need to carry enough data for the Rift's two 1080p screens from the server, while the wearable unit relays head movement and other sensor data back. I'm not sure whether the video signal is encoded, but it SHOULD technically be possible to transmit that much data on current Wi-Fi standards, at least under ideal circumstances, without lag. Any stuttering, though, would make for a bad VR experience and possible side effects such as nausea and dizziness.

    That's basically what I've gathered from researching this idea. Does this seem like roughly the right line of thinking, or are there things I'm missing? To me it looks like the idea isn't worth pursuing at the moment, but criticism or suggestions are much appreciated.
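The latency concern raised in the first reply can be made concrete with a rough motion-to-photon budget. Every number below is an illustrative assumption, not a measurement; the point is only that the extra pipeline stages add up quickly against the ~20 ms usually cited as the comfort threshold for VR:

```python
# Rough added motion-to-photon latency for streamed VR.
# All stage times are illustrative assumptions, not measurements.
encode_ms = 10   # video encode on the server
network_ms = 5   # one-way Wi-Fi transit under ideal conditions
decode_ms = 10   # decode on a Raspberry-Pi-class client
display_ms = 13  # roughly one frame period at 75 Hz

total_ms = encode_ms + network_ms + decode_ms + display_ms
print(f"Estimated added latency: {total_ms} ms")  # prints 38 ms
```

Even with optimistic stage estimates, the streamed path adds well over the comfort budget before the game's own rendering latency is counted, which supports the reply's conclusion.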
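The Wi-Fi bandwidth question can be sketched with a back-of-envelope calculation. Assuming two 1080p panels, 24-bit colour, a 75 Hz refresh (roughly DK2-era numbers), and a hypothetical 50:1 video compression ratio:

```python
# Back-of-envelope bandwidth estimate for streaming Rift video over Wi-Fi.
# Assumptions: two 1080p panels, 24-bit colour, 75 Hz refresh, and a
# hypothetical ~50:1 compression ratio.
WIDTH, HEIGHT = 1920, 1080
SCREENS = 2
BITS_PER_PIXEL = 24
REFRESH_HZ = 75

raw_bps = WIDTH * HEIGHT * SCREENS * BITS_PER_PIXEL * REFRESH_HZ
print(f"Uncompressed: {raw_bps / 1e9:.2f} Gbit/s")      # prints 7.46 Gbit/s

compressed_mbps = raw_bps / 50 / 1e6
print(f"Compressed (~50:1): {compressed_mbps:.0f} Mbit/s")  # prints 149 Mbit/s
```

Uncompressed video is far beyond any Wi-Fi standard, so some form of encoding would be mandatory; the compressed figure is plausible on a good 802.11ac link, but only under the ideal, stutter-free conditions the post already flags as the weak point.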