VR Claire

FR3D
Explorer
another one - with riftup screens ...

https://vrunion.squarespace.com/claire/
24 REPLIES

Christophe
Honored Guest
@kojack
You're raising an interesting point. If I understand properly, what you mean is that the mathematical calculation used nowadays is adapted to flat screens.
Surely there must be a way to generate a new mathematical formula that works with non-flat screens. That would probably be more complex - and therefore slower - than the current formula but I would imagine it would not have a major hit on performance, or would it?

But still, the implication of what you said is that, with such a high fov, you need to update rendering engines used in applications and games. You can't just add (or update) an fov slider to go up to 170 degrees. You need a specific option for curved rendering.
I hadn't thought about that issue, so thanks for pointing this out! 😉

kojack
MVP
It's something that pissed me off with Eyefinity. I had triple monitors arranged as roughly 140 degrees around me, but any game that accepted that looked crap at the sides (halflife 2, etc). So in my own apps (using Ogre) I used three cameras, one for each monitor.

The first game I ever saw that handled this properly was Fisheye Quake by Wouter van Oortmerssen (creator of Cube, Sauerbraten, Amiga E, etc). It did true spherical rendering with any fov, even 1000+ degrees (multiple wraparounds). It was also vastly slower than normal quake, but it looked awesome. Here's a comparison of conventional quake and fisheye quake at different fovs: http://strlen.com/gfxengine/fisheyequake/compare.html

Here's a quick (crappy) diagram.


The top left is a regular game. We have the near and far clipping planes, and the fov forms a frustum with them. Distances are measured perpendicular to the near and far planes. That's why an in-game object hidden by distance fog when it's in the centre of the view can appear when you turn a bit to the side: the engine treats all points along the far plane as the same distance from the camera.
The top right is getting wider. As the fov widens, the width (and height) of the near and far planes has to increase. As you approach 180 degrees, the width approaches infinity (at 180 the sides will never reach the near plane, they are parallel).
Both of these are how OpenGL, DirectX, GPUs, 99% of game engines, etc. are assumed to work: a standard 4x4 projection matrix transforms camera space to screen space with depth.
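The blow-up at 180 degrees is easy to see numerically: for a standard perspective projection the near plane's half-width is near × tan(fov/2), which diverges as the fov approaches 180. A quick sketch in plain Python (not tied to any particular engine):

```python
import math

def near_plane_half_width(fov_deg, near=0.1):
    """Half-width of the near clipping plane for a standard
    perspective projection with the given horizontal FOV."""
    return near * math.tan(math.radians(fov_deg) / 2.0)

# Watch the near plane's width explode as the fov nears 180 degrees.
for fov in (90, 140, 170, 179, 179.9):
    print(f"{fov:6.1f} deg -> half-width {near_plane_half_width(fov):12.3f} m")
```

At 90 degrees the half-width equals the near distance; by 179.9 degrees it is over a thousand times larger, which is why this projection style simply can't reach 180.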

What we ideally want for ultra wide fov is the middle left, a curved projection: the near and far surfaces are spherical. This is easy in ray tracers, since they can cast each pixel's ray in any desired direction. Normal engines don't work this way though (as far as I know, a projection matrix can't do a curved projection).
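A minimal sketch of the per-pixel ray direction a ray tracer could use, assuming an equidistant (angle-proportional) mapping in the spirit of Fisheye Quake (this is an illustration, not its actual code): each pixel column gets an equal angular slice, so any fov works, even past 180.

```python
import math

def fisheye_ray(px, py, width, height, fov_deg):
    """Ray direction for pixel (px, py) under an equidistant
    fisheye mapping: screen position maps linearly to angle,
    so the FOV can be arbitrarily large (even > 180 degrees)."""
    # Map pixel position linearly to yaw/pitch angles.
    yaw = (px / width - 0.5) * math.radians(fov_deg)
    pitch = (0.5 - py / height) * math.radians(fov_deg) * height / width
    # Spherical to Cartesian (x right, y up, -z forward).
    cp = math.cos(pitch)
    return (cp * math.sin(yaw), math.sin(pitch), -cp * math.cos(yaw))
```

The centre pixel looks straight down -z; with a 340 degree fov, the leftmost column's ray actually points behind the camera, which a frustum-based projection could never express.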

What I do is the middle right and bottom left. One has two regular projection cameras with the same position but different directions; the other has three. The more cameras, the smoother the curve and the closer we get to the ideal. In my screenshot above, I used 20 cameras like this. The shader I mentioned can give the effect of curving the projection, but the fewer cameras you have, the less detail there is to work with. Curving a single camera's view isn't good because so much centre detail was already lost. Forget 4K screens, we'd need 40K+. 🙂
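Laying out such a camera fan is simple arithmetic: split the total fov into N equal slices and yaw each camera to the centre of its slice. A rough sketch (a hypothetical helper, not Ogre code):

```python
def camera_yaws(total_fov_deg, num_cams):
    """Per-camera FOV and yaw angles (degrees, relative to the
    overall view direction) for a fan of identical perspective
    cameras tiling a wide total FOV."""
    per_cam = total_fov_deg / num_cams
    # Camera i covers slice i; aim it at the slice's centre.
    yaws = [(-total_fov_deg / 2) + per_cam * (i + 0.5)
            for i in range(num_cams)]
    return per_cam, yaws
```

For the 20-camera, 170 degree example above this gives twenty 8.5 degree cameras, yawed from -80.75 to +80.75 degrees.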

The lenses in the headset could compensate for this too I guess, taking a parallel projection and distorting it to appear curved, but we still have massive detail loss in the rendering stage.

Ray tracing also gives us advantages like easy foveated rendering. Unfortunately we still aren't at the point where ray tracing is suitable for heavy game use, without crazy horsepower. Maybe on the next Xeon Phi (72 core intel cpu). 🙂
Author: Oculus Monitor,  Auto Oculus Touch,  Forum Dark Mode, Phantom Touch Remover,  X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080TI, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2

Jotokutora
Adventurer
"kojack" wrote:
Having 170 degree fov sounds cool (and it is), but there's going to be a performance hit.
I'm not talking about the fact that wider fov means more of a scene is drawn, so there's less potential frustum culling (this is definitely an extra performance hit, but it's the obvious one). The problem is that typical 3d engines that work using standard projection matrix math (the way gpus like) are unsuited to wide fov. The closer to 180 degrees they get, the more distorted the view is, with excessive resolution devoted to the periphery to the detriment of the centre. 180 degrees is impossible with this style (the camera frustum would become a flat plane instead of a pyramid).

In order to compensate for this extreme distortion, alternative rendering styles are needed. One way is to render multiple cameras per eye then use a shader to curve them. Two cameras may be enough (for example each eye has an 85 degree fov camera facing 42.5 degrees to the left and 42.5 to the right, for a total of 4 cameras. That will give you 170 degrees with much less distortion). Or we need to use techniques like ray tracing, which doesn't suffer from this at all, but is much slower.

Here is a quick scene I just made. It's a group of ninjas standing in a perfect circle around the player. The ninjas are evenly spaced and they are all exactly 5 metres from the player.
The top pic is a 170 degree fov using the standard rendering technique. This is what Ogre, UE4, Unity, etc would show if you just set the fov.
Look at the size difference between the ninja on the far left and the ninja in the very centre. The left one is 344 pixels wide (before I scaled the pic down for the forum), the middle one is 8 pixels wide! But we want more resolution in the centre and less at the sides.
The bottom pic is the exact same scene with a 170 degree fov, except this time rendered using 20 cameras, each 8.5 degrees across.
Of course 20 cameras is a bit extreme (hey, Ogre can handle it); I could have achieved a similar effect with just 2-4 cameras and a shader to unwrap their views. I just don't feel like writing the shader right now (just got home from work).

So if DK2 struggles on many people's computers with one camera per eye, imagine the performance of the Claire with two or more cameras per eye and double the horizontal resolution.
(That's assuming you don't want the image to suck).
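The quoted ninja numbers fall out of the tan mapping: both ninjas are 5 metres away, so they subtend the same angle, but a flat projection stretches an object centred θ degrees off-axis by roughly sec²θ. A quick check (the 2 degree angular size and the 83 degree placement of the leftmost ninja are guesses for illustration):

```python
import math

def projected_width(theta_deg, delta_deg=2.0):
    """On-screen width (arbitrary units) of an object subtending
    delta degrees, centred theta degrees off-axis, under a flat
    (tan-mapped) perspective projection."""
    t = math.radians(theta_deg)
    d = math.radians(delta_deg) / 2
    return math.tan(t + d) - math.tan(t - d)

centre = projected_width(0)   # ninja straight ahead
edge = projected_width(83)    # ninja near the edge of a 170 degree view
print(f"edge ninja is {edge / centre:.0f}x wider on screen than the centre one")
```

The exact ratio depends on where the leftmost ninja's centre sits, but stretch factors in the tens are unavoidable at these angles, matching the 344-vs-8 pixel observation.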




Didn't 'Road to VR' review a dual screen 210 degree HMD, and it performed well, if I recall?

kojack
MVP
"Jotokutora" wrote:
Didn't 'Road to VR' review a dual screen 210 degree HMD, and it performed well, if I recall?

Yep, the InfinitEye is 210 degrees.
And they do exactly as I described. Here's the Road To VR interview, with diagrams showing how they had to use 4 cameras (two per eye) and a distortion shader to support that fov.
http://www.roadtovr.com/infiniteye-technical-qa-high-fov-virtual-reality-work/2/
Theirs is a bit more optimal than my arrangement though: they have a 90 degree fov camera facing forwards for each eye, then a smaller fov camera facing sideways. This gives less overlap in the areas our eyes can't see. My example was just the base idea.

Jotokutora
Adventurer
"kojack" wrote:
"Jotokutora" wrote:
Didn't 'Road to VR' review a dual screen 210 degree HMD, and it performed well, if I recall?

Yep, the InfinitEye is 210 degrees.
And they do exactly as I described. Here's the Road To VR interview, with diagrams showing how they had to use 4 cameras (two per eye) and a distortion shader to support that fov.
http://www.roadtovr.com/infiniteye-technical-qa-high-fov-virtual-reality-work/2/
Theirs is a bit more optimal than my arrangement though: they have a 90 degree fov camera facing forwards for each eye, then a smaller fov camera facing sideways. This gives less overlap in the areas our eyes can't see. My example was just the base idea.



Thanks Kojack, I enjoyed reading your illustration of the subject. I personally would like to see a wide FOV HMD. Although I do not deny the professional claims made by Oculus and Google regarding the difficulties of wide FOV, I support outside independent developers pushing the boundaries further.

By now you have probably noticed that almost every new VR unit is based on the Oculus single-screen 90-110 degree FOV design, except the VRelia / Immersion Pro, which uses dual vertical displays. My issue with their unit is that they seem to waste a lot of screen real estate. They claim 130 degree FOV, which I am curious to try, and they have partnered with Tiger Direct for distribution, so I may pick one up.

I will keep an eye on this Claire unit, although I will wait for actual reviews first, and for them to lower the price.