Forum Discussion
muglore
11 years ago · Explorer
Building a custom ICT Lightstage rig
I've been following the advancements in stereo 360 video recording and have been impressed with the progress everyone is showing in developing realtime spatial capture. One limitation I've seen in the stereo lens systems most commonly used is that the fixed focal point and lens convergence restrict the viewer to a fixed point in space.
This creates two problems when re-creating the environment in VR. First, because the convergence is fixed, the capture only looks right as long as the viewer observes the recording from the same angle as the camera that recorded it. That undermines true presence, since a viewer in realtime VR never remains perfectly stationary. Second, it is very difficult to match multiple angles into one complete, seamless stitched image, partly because of minuscule calibration variations between cameras, and partly because stereo pairs pointed in 90-degree increments converge differently and carry normal lens distortion. These defects are visible now on the DK1 and will only become more apparent with the DK2's positional tracking.
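To give a rough sense of the scale of the first problem, here's a back-of-the-envelope sketch (the function name and numbers are my own, not from any SDK): with a fixed-convergence capture, a sideways head movement produces an angular error that geometry-based playback would re-render away, but baked stereo cannot.

```python
import math

def parallax_error_deg(depth_m: float, head_offset_m: float) -> float:
    """Approximate angular error (degrees) for a point baked at depth_m
    by a fixed-viewpoint stereo capture, when the viewer's head moves
    sideways by head_offset_m. With true geometry or a light field the
    view would be re-rendered; with fixed stereo the scene 'swims' by
    roughly this angle."""
    return math.degrees(math.atan2(head_offset_m, depth_m))

# A 5 cm head sway against scenery captured at 1 m:
err = parallax_error_deg(1.0, 0.05)  # roughly 2.9 degrees of swim
```

Even a few centimeters of sway against near scenery gives an error of a couple of degrees, which is well above what positional tracking makes noticeable.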
I was wondering if there is a way to build a rig similar to ICT's Light Stage to record realtime 3D that could then be played back in VR with more dynamic results. If you're not familiar with ICT's Light Stage process, check out this video:
http://m.youtube.com/watch?v=UUvAVjUnE8M
Does anyone have any other ideas for more dynamic methods for 3d spatial recording setups?
1 Reply
- ElectricMucus (Explorer)