Forum Discussion
pittsburghjoe
11 years ago · Protege
Nozon (Are they doing anything special?)
http://www.uploadvr.com/nozon-debuts-interactive-parallax-for-360-degree-videos-with-presenz/
http://nozon.com/immersive-movies-faq
Are they doing something currently unachievable?
My reading of it is this: they film a location with one of these rigs (http://www.panocam3d.com/), import the 360° stereoscopic video into a custom game engine, then add objects and animated characters to the environment.
If they are working with light fields, then I will be impressed.
12 Replies
- pittsburghjoe (Protege)
How is PresenZ different from standard 360° stereoscopic movies?
Parallax:
Standard 360° movies are taken from a single point of view, so they cannot support parallax. When the viewer shifts his head to see what’s behind an object, the perspective doesn’t change. This quickly breaks the immersion feeling and brings discomfort and even cybersickness.
Our technology, PresenZ, offers the same image quality as 360° movies but also allows for interactive parallax. It makes it possible to see an object from different angles or get closer and see its details.
But more importantly, it truly gives you the feeling of being present in the scene and avoids cybersickness.
Stereoscopic quality: IPD, roll, and poles
The stereoscopic versions of standard 360° movies have two pre-recorded images, one for the left eye and one for the right eye, thus allowing a feeling of depth.
Since PresenZ allows the viewing point to shift, we have perfect stereoscopy by design. At playback time, we just create two different viewing points corresponding to the left and right eyes of the viewer.
So with PresenZ, the interpupillary distance (IPD) can be changed at playback to match the viewer's IPD, while with standard 360° movies, this value is fixed and “baked” into the movie.
With PresenZ we also have correct stereoscopy when the viewer rolls his head. With standard 360° movies, the stereoscopy is “baked” for a perfectly horizontal head and cannot take into account any head roll.
Another problem with standard stereoscopic 360° movies is the sphere poles. When looking straight up or down, there is a pinch problem. Again, with PresenZ the stereoscopy is correct even at the poles.
> Those differences allow PresenZ to offer a much more immersive experience than standard 360° stereoscopic movies. And above all, it avoids cybersickness and makes the experience very comfortable.
- pittsburghjoe (Protege)
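The IPD-and-roll behavior the FAQ describes can be sketched in a few lines: instead of baking a fixed left/right image pair into the movie, derive the two eye positions from the head pose at playback time. This is a toy sketch, not Nozon's code; all names, the rotation convention, and the default IPD are my own assumptions.

```python
import math

def rot_x(v, a):
    # Rotate vector v by angle a (radians) around the x axis.
    x, y, z = v
    c, s = math.cos(a), math.sin(a)
    return (x, y * c - z * s, y * s + z * c)

def rot_y(v, a):
    # Rotate around the y (yaw) axis.
    x, y, z = v
    c, s = math.cos(a), math.sin(a)
    return (x * c + z * s, y, -x * s + z * c)

def rot_z(v, a):
    # Rotate around the z (roll) axis.
    x, y, z = v
    c, s = math.cos(a), math.sin(a)
    return (x * c - y * s, x * s + y * c, z)

def eye_positions(head_pos, yaw, pitch, roll, ipd=0.064):
    # Take the head-local "right" axis and rotate it by the full head
    # orientation -- roll included, so the eye baseline tilts with the head.
    right = rot_y(rot_x(rot_z((1.0, 0.0, 0.0), roll), pitch), yaw)
    h = ipd / 2.0
    left_eye  = tuple(p - h * r for p, r in zip(head_pos, right))
    right_eye = tuple(p + h * r for p, r in zip(head_pos, right))
    return left_eye, right_eye
```

With a level head the eyes sit at ±IPD/2 along x; with a 90° roll the whole baseline rotates onto the vertical axis, which two fixed pre-rendered images cannot reproduce.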
"Standard 360° movies are taken from a single point of view, so they cannot support parallax. When the viewer shifts his head to see what’s behind an object, the perspective doesn’t change. This quickly breaks the immersion feeling and brings discomfort and even cybersickness."
Well of course you can look around 3D objects you add to a VR engine. If they are importing 360 stereo backgrounds, that's not going to change very much, as it is far away from you. This part isn't anything new yet.
"Our technology, PresenZ, offers the same image quality as 360° movies but also allows for interactive parallax. It makes it possible to see an object from different angles or get closer and see its details."
Again: parallax with 3D objects you are adding to the scene, not the background video.
"The stereoscopic versions of standard 360° movies have two pre-recorded images, one for the left eye and one for the right eye, thus allowing a feeling of depth. Since PresenZ allows the viewing point to shift, we have perfect stereoscopy by design. At playback time, we just create two different viewing points corresponding to the left and right eyes of the viewer."
Okay, so they render out something exactly like Unity and Unreal handle for IPD.
"With PresenZ we also have correct stereoscopy when the viewer rolls his head. With standard 360° movies, the stereoscopy is “baked” for a perfectly horizontal head and cannot take into account any head roll."
They know we can export videos in real stereoscopic from 3DS Max, Maya, and Blender, right?
"Another problem with standard stereoscopic 360° movies is the sphere poles. When looking completely up or down, there is a pinch problem. Again with PresenZ, the stereoscopy is perfect even for the poles."
Um, congrats on the stitching software?
- ErikCaretta (Honored Guest)
This is interesting.
Two ideas:
1. They may be doing what in CGI and VFX is called "baking". Basically they pre-render the scene, but then it gets re-projected on the geometry: in this case you still need realtime rendering, but it's way faster because most of the lighting/shading calculations are already made.
2. They may be doing something more complex: rendering the scene from many points on a 3D grid inside the zone of view they speak about. Then, in the realtime viewer, they could load the right render based on the camera position (of course interpolating between the 8 nearest pre-rendered positions).
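That second idea — blending the 8 pre-rendered views at the corners of the grid cell containing the camera — is essentially trilinear interpolation. A minimal sketch of the weighting, with scalar values standing in for full rendered images (all names here are mine, not anything from Nozon or the thread):

```python
def trilinear_blend(corner_values, frac):
    """Blend the 8 values at a grid cell's corners.

    corner_values: dict mapping a corner (i, j, k) in {0,1}^3 to a value
                   (a float here; a real viewer would blend whole images).
    frac: camera position inside the cell, each component in [0, 1].
    """
    fx, fy, fz = frac
    out = 0.0
    for (i, j, k), v in corner_values.items():
        # Weight per axis: f toward the "1" corner, (1 - f) toward the "0" corner.
        w = ((fx if i else 1 - fx) *
             (fy if j else 1 - fy) *
             (fz if k else 1 - fz))
        out += w * v
    return out

# Toy cell where each corner's value is the sum of its indices:
corners = {(i, j, k): float(i + j + k)
           for i in (0, 1) for j in (0, 1) for k in (0, 1)}
print(trilinear_blend(corners, (0.5, 0.5, 0.5)))  # 1.5, the cell average here
```

A real image-based renderer would also have to handle disocclusions (corner views seeing different parts of the scene), which is where simple blending stops being enough.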
Just the first ideas I had, I may be completely wrong :)
- pittsburghjoe (Protege)
All game engines use baking. You might be on to something with your second suggestion; you are talking about light fields without knowing it.
- ErikCaretta (Honored Guest)
"pittsburghjoe" wrote:
All game engines use baking.
True. But on their website they say:
"at render time, a new 3D scene is created around the viewer for each frame of the 360°movie"
that made me think there's some geometry involved, and thus some kind of baking.
But after I watched their presentation video, I doubt it.
"pittsburghjoe" wrote:
You might be on to something with your second suggestion, you are talking about lightfields and not knowing it.
I was suspecting it :)
I've heard several times about light fields, but never had the time to read something about it. I'll need to do it.
Conceptually it is quite easy; the problem is the computational power required to generate such a massive amount of data, and to sort it in realtime as needed.
- Twitchmonkey (Explorer)
Well, anyone here near Brussels? Apparently all you have to do is call them up and they'll give you a demo, but the plane trip makes this a difficult prospect for me.
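To put a rough number on the data-volume concern raised above: even uncompressed storage for a modest grid of pre-rendered views gets large fast. The grid size and resolution below are made-up illustrative parameters, not anything Nozon has disclosed.

```python
def lightfield_bytes_per_frame(grid, width, height, bytes_per_pixel=3):
    """Uncompressed storage for one movie frame of a cubic view grid.

    grid: number of pre-rendered viewpoints per axis of the 3D grid.
    bytes_per_pixel: 3 for plain 24-bit RGB, no depth or alpha.
    """
    views = grid ** 3
    return views * width * height * bytes_per_pixel

# e.g. an 8x8x8 grid of 1024x1024 RGB renders:
per_frame = lightfield_bytes_per_frame(8, 1024, 1024)
print(per_frame / 2**30, "GiB per frame")  # prints: 1.5 GiB per frame
```

At 24 or 30 movie frames per second that is tens of GiB per second before compression, which is why naive pre-rendering on a dense grid is impractical without heavy data reduction.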
- gaelhonorez (Honored Guest)
"Twitchmonkey" wrote:
Well, anyone here near Brussels? Apparently all you have to do is call them up and they'll give you a demo, but the plane trip makes this a difficult prospect for me.
Where are you based? We are planning to go to more meet-ups around the world. We are only aware of the "important" ones (we did demos at SF uploadVR and SVVR this month).
We won't tell how we do it, but it's not geometry baked on low poly. Firstly because it's not new or revolutionary, it's quite low tech, and that would defeat the purpose of not being limited by the complexity of the scene.
- Twitchmonkey (Explorer)
I'm out here in Orange County, California. We've got a pretty solid community out here in Oculus country and I'm sure lots of folks would love to try out your demo, but I could understand if you want to get outside of California, we're pretty spoiled over here. I wasn't aware you had demoed at SVVR as I wasn't able to make it out there, I'll see if I can find any impressions. What you're doing certainly sounds intriguing, but as with everything in VR it really needs to be seen to be appreciated.
- gaelhonorez (Honored Guest)
We will probably go back near SF at some point in the future. We will hopefully be able to announce other stuff soonish, follow our Twitter account if you don't want to miss them :)
- Fredz (Explorer)
I guess Nozon uses a similar technique to the one from Frooxius: http://www.reddit.com/r/oculus/comments/2qjqr3/i_got_lightfield_synthesis_and_rendering_working/
I'd love to see this with interesting content.