Forum Discussion
geekmaster
12 years ago · Protege
Creating fisheye views with the Unity3D engine
I have been posting for some time here and at MTBS3D that we really need to stop doing the Rift pre-warp so late in the rendering pipeline (which adds more distortion artifacts of its own, that people...
geekmaster
12 years ago · Protege
"kojack" wrote:
The real Right way to do it would be with raytracing, since then you can actually render a fisheye view without needing to apply distortions to render output (the current method people are using: render the scene into a texture then apply a distortion shader. Your proposed method: render the scene into four textures forming the sides of a cube map then apply a distortion shader).
I did a similar thing back in 2009 in Ogre when I got a triple-monitor Eyefinity system running. I used multiple cameras rendering into a cubemap, then a shader to unwrap it into a 180-degree 5200 x 1050 cylindrical panorama. I used cylindrical instead of spherical because the vertical FOV was so small on a 48:10 screen, but going to spherical would be easy.
You could probably eliminate some of the scene-management overhead of a multiple-camera system by using geometry shaders to clone geometry onto multiple render targets, rather than re-rendering the same geometry for each camera. I haven't played around with that yet.
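The cubemap-to-panorama unwrap kojack describes can be sketched on the CPU as two lookups per output pixel: map the panorama pixel to a cylindrical view direction, then map that direction to a cubemap face and texture coordinate. This is a hedged reference sketch, not the original Ogre shader; the equirectangular-style face layout, the vertical extent, and all names here are my assumptions:

```python
import math

def cyl_dir(u, v, h_fov=math.pi, v_half=0.33):
    """Direction for panorama pixel (u, v), both in [0, 1].
    180-degree horizontal FOV; small vertical extent for a 48:10 screen."""
    yaw = (u - 0.5) * h_fov
    y = (v - 0.5) * 2.0 * v_half      # cylindrical: height is linear along the cylinder wall
    d = (math.sin(yaw), y, math.cos(yaw))
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)

def cubemap_lookup(d):
    """Map a direction to (face, s, t) with s, t in [0, 1].
    Face order assumed here: +X, -X, +Y, -Y, +Z, -Z (dominant-axis selection)."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face = 0 if x > 0 else 1
        s, t = (-z / ax if x > 0 else z / ax), -y / ax
    elif ay >= az:
        face = 2 if y > 0 else 3
        s, t = x / ay, (z / ay if y > 0 else -z / ay)
    else:
        face = 4 if z > 0 else 5
        s, t = (x / az if z > 0 else -x / az), -y / az
    return face, (s + 1) / 2, (t + 1) / 2
```

In a real implementation both functions would live in a fragment shader, with `cubemap_lookup` replaced by the hardware cubemap sampler; the CPU version just makes the mapping explicit (the center of the panorama looks down +Z and lands in the middle of the +Z face).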
Actually, my preferred method IS raytracing (including in one of those local projects mentioned above), just as Paul Bourke's older 2004 article recommends (it includes source code for a POV-Ray fisheye camera). You should check out the real-time Rift ray-tracer thread here; it uses an in-game fisheye lens.
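In a raytracer, an in-game fisheye lens is just a different primary-ray generator: instead of projecting through a flat image plane, the ray angle grows linearly with distance from the image center. A minimal sketch of that idea, assuming the equidistant ("f-theta") projection that fisheye lenses commonly use (the function name and parameters are mine, not from Bourke's code):

```python
import math

def fisheye_ray(u, v, fov=math.pi):
    """Primary-ray direction for an equidistant fisheye camera looking down +Z.
    u, v are image coordinates in [-1, 1]; fov is the full field of view.
    Returns None for pixels outside the fisheye image circle."""
    r = math.hypot(u, v)
    if r > 1.0:
        return None                      # outside the lens circle: no ray
    theta = r * fov / 2.0                # angle off the view axis, LINEAR in radius
    phi = math.atan2(v, u)               # direction around the image center
    sin_t = math.sin(theta)
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), math.cos(theta))
```

With `fov = math.pi` the image circle covers a full 180 degrees: the center pixel traces straight ahead along +Z, and the rim of the circle traces exactly sideways. No post-render warp pass is needed, which is the whole point being argued in this thread.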
The only thing special about the method in THIS thread is that it supports Unity3D without any hacks, mods, or DLL-injection techniques, unless I missed something (I only discovered Paul's newer post just before starting this thread). His proposed method should eliminate the pre-warp artifacts we now have. Or maybe we need to do it all from scratch, designing the games themselves to support in-game fisheye lenses (my REAL recommendation). Using existing Unity3D is just a way to get started on the road to pre-warp quality improvements.
Paul's newer method WILL be an improvement, right?
So yes, I agree with you (and I always did) that the REAL "right way" is ray tracing (or better, path tracing), but those do have some extra hardware requirements (which will become much more common before long). Too bad that multi-core processing tends to INCREASE latency, when that is exactly the direction our newer "faster" computers are heading. Perhaps we need to migrate more of our game engines into GPU shaders (like what can be seen at shadertoy.com).
So, perhaps this PARTICULAR example of Paul's is not quite what we need, but I believe his older 2004 post really does show us the way. And even modern game engines SHOULD be able to render using a fisheye lens on their virtual cameras (the REAL intended point of this thread).