Forum Discussion
pittsburghjoe
11 years ago · Protege
The Foundry just solved live action stereoscopic VR 360
8 Replies
- pittsburghjoe (Protege): Start watching the video at 3:39:00.
- mediavr (Protege): There are some innovations there, but not fundamental solutions. Tracking: it can solve the camera positions, orientations and internal parameters of a ring of GoPro cameras. This is good but not unique. Agisoft PhotoScan (Pro version), for example, if their product description is accurate, can solve fisheye array images like this (and do a scene reconstruction and create a panorama from a nodal subset of images).
http://www.agisoft.com/features/compare/
Also, in the Nuke process it seems to solve the cameras from scratch each time, whereas a more trustworthy process, and one better suited to creating reusable templates, might be to determine the lens/sensor parameters of each camera separately (GoPros vary a lot, for example in the positioning of the sensor behind the lens). You can do this calibration with an accurate indexed panorama head and PTGui or Hugin. It would be nice to be able to input this information into Nuke (or use Nuke to do this individual calibration via a panorama head), and then go from there to calibrate the positions and orientations and fine-tune the internal parameters of the individual cameras in the array.
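A rough sketch of that per-camera calibration step, using OpenCV's fisheye model as a stand-in for PTGui/Hugin (the checkerboard size, image paths and flags here are hypothetical, not anything Nuke or PTGui does):

```python
# Sketch: calibrate each GoPro's fisheye intrinsics separately from
# checkerboard captures, so the per-camera lens/sensor parameters can be
# fixed before solving the rig's positions and orientations.
# Assumes OpenCV's cv2.fisheye model; board size and paths are hypothetical.
import glob
import cv2
import numpy as np

BOARD = (9, 6)  # inner corners of the checkerboard (hypothetical)

def calibrate_fisheye(image_glob):
    # One planar 3D point set per detected board (z = 0 plane).
    objp = np.zeros((1, BOARD[0] * BOARD[1], 3), np.float64)
    objp[0, :, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

    obj_points, img_points, size = [], [], None
    for path in glob.glob(image_glob):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        size = gray.shape[::-1]
        ok, corners = cv2.findChessboardCorners(gray, BOARD)
        if ok:
            obj_points.append(objp)
            img_points.append(corners.reshape(1, -1, 2).astype(np.float64))

    K = np.zeros((3, 3))
    D = np.zeros((4, 1))
    # Solves focal length, principal point (i.e. where the sensor sits
    # behind the lens) and fisheye distortion for this one camera.
    rms, K, D, _, _ = cv2.fisheye.calibrate(
        obj_points, img_points, size, K, D,
        flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC)
    return rms, K, D

# e.g. one calibration per GoPro in the ring (paths are hypothetical):
# rms, K, D = calibrate_fisheye("gopro_03/checkerboard_*.jpg")
```

The per-camera K and D could then be held fixed while only the positions and orientations of the ring are solved.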
Then he talks about stitching. He shows how each of the stereo panoramas from the L/R pairs has geometric, i.e. parallax (and presumably sync, with this rig), stitching errors which can be corrected automatically by Nuke's warping capabilities. But the L and R stitching errors (speaking of parallax errors only) will likely be in different locations in the L and R panoramas, and there is no guarantee (at least he didn't mention that they are handling this) that the blending algorithms will act in a stereoscopically identical fashion.
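One way to check for that, outside of any particular stitcher, is to measure the vertical disparity between the finished L and R panoramas: if the seam warps and blends behaved identically in both eyes it should stay near zero, and spikes flag seams that were repaired differently per eye. A rough numpy/OpenCV sketch (the flow settings and threshold are arbitrary):

```python
# Sketch: flag regions where the left/right equirectangular panoramas
# disagree vertically, which usually means the seam fixes were not
# stereoscopically consistent. Dense optical flow stands in for a proper
# disparity solver; the parameters and threshold are arbitrary.
import cv2
import numpy as np

def vertical_disparity_map(left_path, right_path):
    L = cv2.imread(left_path, cv2.IMREAD_GRAYSCALE)
    R = cv2.imread(right_path, cv2.IMREAD_GRAYSCALE)
    flow = cv2.calcOpticalFlowFarneback(
        L, R, None, pyr_scale=0.5, levels=4, winsize=31,
        iterations=3, poly_n=7, poly_sigma=1.5, flags=0)
    # flow[..., 0] is horizontal disparity (expected: that's the stereo);
    # flow[..., 1] is vertical disparity (should stay near zero).
    return flow[..., 1]

def suspect_seam_mask(vert_disp, threshold_px=2.0):
    # Anything with more than ~2 px of vertical offset is suspicious.
    return np.abs(vert_disp) > threshold_px

# Hypothetical usage:
# v = vertical_disparity_map("pano_left.jpg", "pano_right.jpg")
# mask = suspect_seam_mask(v)
# print("suspect pixels:", mask.mean() * 100, "%")
```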
Then he talks about how correct stereo for the zenith and nadir is impossible without a textured mesh of the scene geometry (or by using depth maps to reduce nadir and zenith depth to zero, which reads as infinity in VR headset contexts). In fact, Micoy at least claims correct nadir and zenith stereo viewing is possible (they sell plugins for full-dome stereo rendering and have patents for a spherical camera array where images are stitched in spiral patterns).
http://www.micoy.com
http://www.mtbs3d.com/index.php/article ... aging-Tech
http://www.trentgrover.com/images/tech/ ... oyComp.pdf
(The Trent Grover article has swapped captions for his zenith anaglyph stitching illustrations, I think.)
Also the Iximage sector approach to "omnipolar" fisheye image stitching seems to me to have distinctly superior nadir/zenith stereo compared with normal camera-array-geometry rigs (rings or spheres of cameras).
- Anonymous: We have the single presentation posted now on Vimeo, so you don't have to go through the large live-stream file.
https://vimeo.com/125572049
> Also, in the Nuke process it seems to solve the cameras from scratch each time, whereas a more trustworthy process, and one better suited to creating reusable templates, might be to determine the lens/sensor parameters of each camera separately (GoPros vary a lot, for example in the positioning of the sensor behind the lens). You can do this calibration with an accurate indexed panorama head and PTGui or Hugin. It would be nice to be able to input this information into Nuke (or use Nuke to do this individual calibration via a panorama head), and then go from there to calibrate the positions and orientations and fine-tune the internal parameters of the individual cameras in the array.
This is just a tech preview; we will have a lot more to talk about on this in the future. You don't have to re-solve anything if you don't want to. Nuke is just a DAG with a series of nodes and metadata, and each set of nodes is very modular and re-usable. You could create the entire rig in Nuke manually, without solving, if you wanted, or import an fbx/alembic file with a camera setup from another application. Once you have solved for one part of the pipe, you could use it for all shots and not re-solve each time. Camera rigs for VR (especially GoPro ones) are very flexible, not always precise, and the geometry is often changing, so many times you do want to re-solve for it if you are removing any cameras to change out batteries, etc. It is up to you though.
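For instance, a rig whose geometry is already known could be rebuilt in the DAG with a few lines of Python instead of re-solved per shot. A minimal sketch, assuming a recent Nuke where the camera node class is Camera2 (the camera count, ring radius and focal length are hypothetical placeholders):

```python
# Sketch: recreate a fixed ring of cameras in Nuke from known calibration,
# instead of re-solving every shot. Run from Nuke's Script Editor.
# The ring radius, focal length and camera count below are hypothetical.
import math
import nuke

NUM_CAMS = 6        # cameras in the ring
RADIUS = 0.05       # distance from rig centre to each lens (scene units)
FOCAL = 2.8         # mm; a per-camera calibrated value would go here

for i in range(NUM_CAMS):
    angle = 360.0 * i / NUM_CAMS
    rad = math.radians(angle)
    cam = nuke.nodes.Camera2(name="rig_cam_%02d" % i)
    cam["translate"].setValue([RADIUS * math.sin(rad), 0.0,
                               RADIUS * math.cos(rad)])
    # +180 so the camera faces outward, away from the rig centre.
    cam["rotate"].setValue([0.0, angle + 180.0, 0.0])
    cam["focal"].setValue(FOCAL)

# Alternatively, bring in a rig exported from another package:
# geo = nuke.createNode("ReadGeo2")
# geo["file"].setValue("/path/to/rig_cameras.abc")  # hypothetical path
```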
We have also had a multi-camera tracker built into Nuke for many years now, which can solve for stills or a moving camera and can do point clouds, image-based modeling, etc.
https://vimeo.com/94369259
https://vimeo.com/album/3087027
> Then he talks about stitching. He shows how each of the stereo panoramas from the L/R pairs has geometric, i.e. parallax (and presumably sync, with this rig), stitching errors which can be corrected automatically by Nuke's warping capabilities. But the L and R stitching errors (speaking of parallax errors only) will likely be in different locations in the L and R panoramas, and there is no guarantee (at least he didn't mention that they are handling this) that the blending algorithms will act in a stereoscopically identical fashion.
We already produce tech for correcting stereo 3D images called "Ocula", which we created originally for Avatar about 7 years ago. It has had 5 iterations since then and has been used on hundreds of films. It solves the geometry of the cameras and does color matching, alignment repair, etc.
http://www.thefoundry.co.uk/products/ocula/
https://vimeo.com/album/2985109
> Then he talks about how correct stereo for the zenith and nadir is impossible without a textured mesh of the scene geometry (or by using depth maps to reduce nadir and zenith depth to zero, which reads as infinity in VR headset contexts). In fact, Micoy at least claims correct nadir and zenith stereo viewing is possible (they sell plugins for full-dome stereo rendering and have patents for a spherical camera array where images are stitched in spiral patterns).
Not impossible, but problematic, so it is easier to fade out the depth at the poles.
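That fade can be thought of as scaling the horizontal disparity by a weight that drops to zero with latitude, so the zenith and nadir end up at zero parallax (which reads as infinity in a headset). A toy sketch of one possible falloff on an equirectangular disparity map (the 60-degree start latitude and cosine ramp are arbitrary choices, not what Nuke actually does):

```python
# Sketch: fade stereo disparity to zero toward the poles of an
# equirectangular panorama. The 60-degree start latitude and the smooth
# cosine falloff are arbitrary; they are not Nuke's behaviour.
import numpy as np

def pole_fade_weights(height, start_deg=60.0):
    # Latitude of each row: +90 deg at the top row, -90 deg at the bottom.
    lat = np.linspace(90.0, -90.0, height)
    a = np.abs(lat)
    w = np.ones(height)
    fading = a > start_deg
    # Smoothly ramp 1 -> 0 between start_deg and the pole.
    t = (a[fading] - start_deg) / (90.0 - start_deg)
    w[fading] = 0.5 * (1.0 + np.cos(np.pi * t))
    return w

def fade_disparity(disparity):
    # disparity: (H, W) horizontal disparity map of the equirectangular pair.
    w = pole_fade_weights(disparity.shape[0])
    return disparity * w[:, None]

# Hypothetical usage:
# faded = fade_disparity(disparity_map)
```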
Do you have working links for those docs? The pages appear to be down and I can't find an alternate source for the paper on Google.
- mediavr (Protege):
> Camera rigs for VR (especially GoPro ones) are very flexible, not always precise, and the geometry is often changing, so many times you do want to re-solve for it if you are removing any cameras to change out batteries, etc.
Yes, this is usually the case with GoPro stereo 3D-printed 360 rigs, and it is unfortunate, but it is possible to have accurately repeatable camera-holding arrangements even with GoPros.
> Do you have working links for those docs? The pages appear to be down and I can't find an alternate source for the paper on Google.
They are working for me just now. The Micoy home page is distinctly uninformative, but their patents are easy to find.
I think their tech is currently used in the Marvel travelling dome road show.
http://www.micoy.com
http://www.mtbs3d.com/index.php/article ... aging-Tech
http://www.trentgrover.com/images/tech/ ... oyComp.pdf
http://www.iximage.com/
- Anonymous: Will Nuke take into account the stereo aspect when seaming, to minimize stereo disparity at the seams, or is it simply seaming each panorama independently?
Does Nuke address virtual stereo rigs (stereo using L/R of alternate cameras) or is it strictly for stereo paired camera systems?
Any specific tools to deal with GoPro rolling shutter and sync issues?
Does it deal with sliced or line camera systems?
Not sure why users have to rotate the image to roto and paint in a flat manner and then re-rotate, getting a quality loss in the process. Can't Nuke simply provide a viewing distortion to correct this and modify the paint and roto points accordingly? (i.e. changing it only for display, warp brush shape, etc)
Any ETA, or at least a rough quarter estimate, of when these tools will be available?
- Anonymous: Hi Scott,
Sorry I just saw this now.
One reason we put this video out is to get feedback from customers about which other workflows & tools we are missing. Any other tools you need for VR, just let us know so we can consider them for the toolset. Some of the questions I can't fully answer, as the toolset is still in flux and I don't want to make statements publicly about tools that may change.
> Does Nuke address virtual stereo rigs (stereo using L/R of alternate cameras) or is it strictly for stereo paired camera systems?
It shouldn't matter.
> Any specific tools to deal with GoPro rolling shutter and sync issues?
Not as far as I know. I need to talk to our devs about including our rolling shutter plugin; we discontinued it, but we still have the tech.
> Does it deal with sliced or line camera systems?
I will get back to you about this.
> Not sure why users have to rotate the image to roto and paint in a flat manner and then re-rotate, getting a quality loss in the process. Can't Nuke simply provide a viewing distortion to correct this and modify the paint and roto points accordingly? (i.e. changing it only for display, warp brush shape, etc)
I'm not sure what you are referring to. You do not need to do this. Jon was explaining in the video that this is what causes multiple filter hits when doing VR in Nuke with the existing out-of-the-box toolset. These new tools should alleviate this.
> Any ETA, or at least a rough quarter estimate, of when these tools will be available?
We can't say publicly at this time. You can fill out the form on this page to request access:
http://www.thefoundry.co.uk/solutions/virtual-reality/
- Anonymous: Thanks Deke.
- Anonymous: A few people have messaged me on LinkedIn recently. I forgot I made this post a while back and left my email here. I'm not at The Foundry anymore, so I removed my email from the post. You can fill out this form to register interest in The Foundry VR tools:
http://www.thefoundry.co.uk/solutions/virtual-reality/