Forum Discussion
jfilmall
12 years ago · Honored Guest
Workflow for live action Stereo 3D
Hello everyone,
I'm fairly new to the concept of 360-degree stereo 3D video, so I'm curious what the standard workflow is for everyone, most importantly the post side of things. I do have a lot of experience in regular stereo 3D capture, though.
Is VideoStitch the best for combining all the video before taking it into an editing timeline?
I've built a rig with GoPros and we're about to start shooting tests.
Any info would be greatly appreciated!!
Thanks
Josh
4 Replies
- j1vvy (Honored Guest):
I like to use PTGui for the aligning and VideoStitch for the processing.
First I would profile each camera by capturing a pano, ensuring the camera is at the NPP (no-parallax point), to calculate the exact FoV and the abcde distortion values. Save each camera as its own lens profile.
I add all images to the project.
Set each image as a separate camera and apply the calculated values. This can then be saved as a template to use as a starting point for future projects.
Join the left images with normal control points, and join the right images with normal control points.
I also join the left to right with horizontal control points.
I will add a few normal control points at the very horizon to align the two panos.
Optimize until everything lines up. Add and remove control points as needed.
Process the left, process the right. Currently this has to be done by editing the project file; hopefully that changes soon.
When I was viewing the stereo panos with anaglyph glasses, I would shift one of the images so the panos aligned near the camera instead of at the horizon. But now that I have a Rift, I find I need to shift the IPD to view my old panos, so maybe there is a difference between the way anaglyph and the Rift are displayed. Shifting IPD does not work when viewing as anaglyph.
- jfilmall (Honored Guest):
Thanks for your input, j1vvy!
I think I understand what you're saying, but I have just a few more questions to help with my enlightenment. And sorry if I sound like a noob!
When you say 'profile each camera,' do you mean take a still image for aligning? If so, can you do this with the first frames of the recorded video footage or frames with matching timecode?
Then, once you have a good pano stitched together, do you take the alignment metadata into VideoStitch for tweaking and processing the video footage?
Are you doing 3D adjustments in VideoStitch too? Would there be any advantage to using VideoStitch for 3D adjustments over, say, Final Cut or Premiere with 3D plugins?
Thanks again,
Josh
- j1vvy (Honored Guest):
By profiling I mean figuring out the exact field of view (FoV) and the distortion and offset parameters.
The distortion parameters are the abc parameters; they account for any barrel or pincushion distortion the image might have relative to a perfect lens projection.
The offset parameters are the de parameters; they account for the slight misalignment the sensor always has from the ideal lens center.
The FoV can be slightly different from one camera to the next because of slight focus differences.
By taking a full spherical panorama with lots of overlap, it is possible to optimize these parameters. Save these parameters for all future use.
Every video size will need its own settings. You will only be able to use still images, as opposed to frames out of a video, if they have the exact same FoV and pixel count.
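For anyone following along, the abcde model j1vvy describes can be sketched in a few lines of Python. This is a minimal illustration of the PanoTools-style polynomial that PTGui and Hugin use (the function names and all parameter values below are made up for the example):

```python
# Sketch of the PanoTools-style lens model (illustrative only; parameter
# names a, b, c, d, e follow PTGui's usage, but the values are made up).

def radial_correct(r, a, b, c):
    """Map a destination-image radius to the source-image radius using
    the polynomial r_src = r * (a*r^3 + b*r^2 + c*r + d), where
    d = 1 - (a + b + c) so that r = 1 (half the smaller image
    dimension) maps to itself."""
    d = 1.0 - (a + b + c)
    return r * (a * r**3 + b * r**2 + c * r + d)

def apply_offset(x, y, d_shift, e_shift):
    """The d/e parameters shift the optical centre away from the
    geometric image centre (horizontal and vertical, in pixels)."""
    return x - d_shift, y - e_shift

# A perfect lens (a = b = c = 0) leaves radii unchanged:
print(radial_correct(0.5, 0.0, 0.0, 0.0))  # 0.5
```

Optimizing a, b, c, d, e against a high-overlap pano, as described above, is what turns this generic model into a per-camera lens profile.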
Currently, all alignment of the cameras needs to be done in third-party software such as PTGui or Hugin.
Once you have a good stitched set of frames, apply those settings in VideoStitch. Use the preview to see if you notice any areas later on that have bad misalignment. If so, extract those frames and add more control points in the trouble area. Optimize again, apply the settings in VideoStitch, and repeat.
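The frame extraction mentioned above can be scripted so every camera contributes a still from the same instant. A minimal sketch using ffmpeg (assumes ffmpeg is installed and on the PATH; the file names, camera count, and timestamp are all made up):

```python
import subprocess

def build_extract_cmd(video_path, timestamp, out_png):
    """Build an ffmpeg command that grabs one frame at `timestamp`
    (e.g. "00:00:05.000") and writes it out as an image."""
    return ["ffmpeg", "-y", "-ss", timestamp, "-i", video_path,
            "-frames:v", "1", out_png]

def extract_frame(video_path, timestamp, out_png):
    subprocess.run(build_extract_cmd(video_path, timestamp, out_png),
                   check=True)

# Grab the same instant from every camera so the stills share one moment:
# for i in range(12):
#     extract_frame("cam%d.mp4" % i, "00:00:05.000", "cam%d.png" % i)
```

Because the stills come straight out of the recorded clips, they keep the exact same FoV and pixel count as the video frames, which is the condition j1vvy notes for reusing the alignment.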
VideoStitch is not 3D aware. I align the left and right at the horizon. I have figured out how many pixels I need to move the pano so the alignment sits at the stereo window instead; I translate that to degrees, and then every time I create footage at that width I can rotate one set of images by that amount.
- j1vvy (Honored Guest):
When I first started to experiment with 360° S3D I was viewing on a monitor using anaglyph glasses, and I was adjusting the stereo separation for that.
Now that I have the DK2, I have figured out that the stereo alignment for an HMD should be at infinity. This makes it easy to do the alignment in PTGui with a few normal control points on the horizon. Converting all other control points connecting the left and right panos to horizontal control points will ensure vertical alignment.
I have not updated all my videos for this, but the HMD video players for stereo content allow adjusting the stereo separation.
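The pixel-to-degrees conversion j1vvy describes follows from the fact that an equirectangular pano's full width spans 360° of yaw. A minimal sketch (the shift and pano width below are made-up example values):

```python
# In an equirectangular pano, the full image width covers 360 degrees of
# yaw, so a horizontal pixel shift maps directly to a rotation angle.

def shift_to_degrees(shift_px, pano_width_px):
    """Convert a horizontal pixel shift into a yaw rotation in degrees."""
    return shift_px * 360.0 / pano_width_px

# e.g. a 24-pixel shift on a 4096-pixel-wide pano:
print(shift_to_degrees(24, 4096))  # 2.109375
```

Once you know this angle for a given output width, you can apply it as a yaw rotation to one eye's pano every time you render at that width, which is the reusable per-width setting described above.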