
How to fix the warping and grain issue on the Quest headset


Each camera can work standalone; the time-of-flight sensor does not need to be used to calculate distance or depth. The left camera sends its raw image data to the left display and the right camera sends its raw image data to the right display. This direct mapping would work as-is if the cameras on the front of the headset had normal (rectilinear) lenses.

If the lens is a fisheye lens, then each camera's image would instead be projected onto its own semi-spherical virtual display, one per eye. (Think of the Soarin' Over California ride at the Disneyland Resort, but in VR for the passthrough.) The image data would need to be upscaled in real time before it is sent to the virtual semi-spherical display. The brain can then automatically infer distance from the two images/videos being shown side by side, the same way it does in normal stereo vision.
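To illustrate the semi-spherical display idea, here is a minimal sketch of how one fisheye pixel could be mapped to a direction on a hemisphere around the eye. It assumes an equidistant (f-theta) fisheye projection, which is just one common fisheye model; the function name and the normalized coordinate convention are mine, not anything from the Quest runtime:

```python
import math

def fisheye_to_hemisphere(u, v, fov_deg=180.0):
    """Map a normalized fisheye coordinate (u, v in [-1, 1], centered
    on the lens axis) to a 3D direction on the semi-spherical virtual
    display, assuming an equidistant (f-theta) fisheye projection."""
    r = math.hypot(u, v)            # radial distance from image center
    if r > 1.0:
        return None                 # outside the fisheye image circle
    theta = r * math.radians(fov_deg) / 2.0  # angle off the optical axis
    phi = math.atan2(v, u)                   # azimuth around the axis
    # Unit direction the pixel "looks" at; +z is straight ahead.
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# The center pixel lands straight ahead on the dome:
print(fisheye_to_hemisphere(0.0, 0.0))  # (0.0, 0.0, 1.0)
```

Rendering each eye's image as a textured dome of these directions would let the compositor show the raw fisheye data without any depth-based reprojection.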

As a result, this approach fixes two issues at once: the first is the warping, and the second is the grain in low-light conditions.