Forum Discussion
mdelucasschellg
11 years ago · Explorer
Rendering different cameras
So before SDK 0.4.1, our team was working on a game that required rendering additional views alongside the Oculus's two eye views.
We were able to get this to work with the previous SDK by extending the desktop from one monitor to the Oculus, then moving the two Oculus eyes to the right side of a large viewport window while the left side rendered the other views. This was admittedly a bit of a hacky solution, and possibly not even viable: we were now rendering four views instead of two, and we couldn't guarantee users would set this up.
With the new SDK's render target setup, though, even this seems impossible, as the SDK's render targets are ALWAYS drawn on top and you can't resize the viewport window. At least in some experiments with the Tuscany demo, I found that trying to do so caused crashes and other problems.
I've been researching, and the only solution I've found -- one that doesn't involve writing our own Unity / Oculus SDK integration -- is to network the game, with the Oculus user on one machine connecting to another machine. But we're still frustrated that updates to the SDK have, again, seemingly made our idea impossible.
However, we've only had our DK2 for a day, so maybe there's something obvious that I've missed, which is why I decided to post this here.
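For reference, the kind of extra view described above would normally be set up in Unity with a second camera and a normalized viewport rect. This is a minimal sketch, assuming a hypothetical `extraCamera` assigned in the Inspector; it illustrates what the post says stops working, since the 0.4.x SDK blits its own render targets over everything regardless of camera depth:

```csharp
using UnityEngine;

public class ExtraViewCamera : MonoBehaviour
{
    // Hypothetical camera for the additional (non-HMD) view,
    // assumed to be assigned in the Inspector.
    public Camera extraCamera;

    void Start()
    {
        // Draw this camera into the left half of the window.
        extraCamera.rect = new Rect(0f, 0f, 0.5f, 1f);

        // A higher depth makes Unity render this camera last, on top of
        // lower-depth cameras -- but per the post, the 0.4.x SDK's own
        // render targets are composited on top of this anyway.
        extraCamera.depth = 10f;
    }
}
```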
Thanks
3 Replies
Replies have been turned off for this discussion
- mdelucasschellg (Explorer): Or I guess a better question is: is it possible to set "UseCameraTexture" to false and still have it work? I tried, but essentially you have to combine 0.4.1 and 0.3.0, which is rather messy. Essentially, we want the positional tracking, but the render target setup is what is setting us back.
- ryahata (Honored Guest): Have you taken a look at my forum post?
https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=11707
There are other people who had/wanted the same functionality I think you're describing. Is networking out of the question? If you're not doing complicated things, then sticking network views on all the relevant GameObjects is a pretty quick way to go. Just make sure to dynamically clear out any script that should not run on clients (e.g. OVRCameraController).
- mdelucasschellg (Explorer): Thanks for the thread link. I wouldn't say it's completely out of the question. It's not the ideal solution we were really looking for, but it will probably be the next step unless we get more information on getting around this.
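The networking workaround suggested in the replies can be sketched with Unity's legacy networking API, which was current at the time of SDK 0.4.x. This is only a sketch of the "clear out scripts on clients" idea: `OVRCameraController` comes from the Oculus integration mentioned in the thread, and the script name itself is hypothetical:

```csharp
using UnityEngine;

public class ClientScriptStripper : MonoBehaviour
{
    void Start()
    {
        // On a spectator client, disable Rift-only components so that
        // only the server machine (the one with the HMD) drives the
        // Oculus cameras. Network.isServer is Unity's legacy
        // networking check.
        if (!Network.isServer)
        {
            var ovrController = GetComponent<OVRCameraController>();
            if (ovrController != null)
            {
                ovrController.enabled = false;
            }
        }
    }
}
```

State that must stay in sync between the two machines (player transforms, etc.) would get a NetworkView component, as the reply suggests.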