Forum Discussion

plyrek
Protege
10 years ago

Optimizing a 360 photosphere

I am trying to create a simple photo-viewing application in Unity to view 360 photos. My question is about optimizing the quality of the image texture. Does the size of the sphere used in Unity make a difference? Does it need to be sized relative to the image's resolution? Looking at the Oculus 360 Photos app, their images are so sharp and crisp. How can I get that kind of quality?

Any tips on optimal image resolution or sphere sizes would be greatly appreciated.

5 Replies

Anonymous

Oculus 360 Photos uses cube maps instead of equirectangular projection, and each face of the cube is sampled directly to a TimeWarp layer, achieving maximum quality. You can do this (on mobile) in Unity with the latest Oculus Utilities (1.7) by using OVROverlay.cs. You'll need to convert your equirectangular images to cubemaps with an external tool; luckily, there are plenty of them out there.
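
Below is a minimal sketch of the Unity side of that setup, assuming Oculus Utilities 1.7 is imported and the cubemap has already been converted. The class name PhotosphereOverlay is just illustrative, and the OVROverlay field names (currentOverlayShape, currentOverlayType, textures) are taken from that Utilities version, so they may differ in later releases.

```csharp
using UnityEngine;

// Sketch: feed a pre-converted cubemap to a TimeWarp overlay layer via OVROverlay.
public class PhotosphereOverlay : MonoBehaviour
{
    // Assign a cubemap asset (import the converted faces with Texture Shape set to Cube).
    public Cubemap photoCubemap;

    void Start()
    {
        OVROverlay overlay = gameObject.AddComponent<OVROverlay>();
        overlay.currentOverlayShape = OVROverlay.OverlayShape.Cubemap;
        overlay.currentOverlayType = OVROverlay.OverlayType.Overlay;

        // Mono photosphere: the same cubemap is shown to both eyes.
        overlay.textures = new Texture[] { photoCubemap, photoCubemap };
    }
}
```
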
Would cubemaps also be better quality / more efficient for video as well as stills?

Anonymous

Generally yes, if your pipeline supports it. It's probably easier for you to output an equirectangular video, but the drawback is inefficient pixel usage, especially towards the poles. A cubemap uses about 25% fewer pixels than an equirectangular image of comparable quality, does not distort at the poles, and has no geometric distortion within the cube faces. More info here: https://code.facebook.com/posts/1638767863078802/under-the-hood-building-360-video/
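
To see where that 25% figure comes from, here is a quick back-of-the-envelope check using a hypothetical 4096x2048 equirectangular source and the usual rule of thumb that the cube face size ends up at about a quarter of the equirect width (the class name ProjectionPixelBudget is just illustrative):

```csharp
using UnityEngine;

// Rough pixel-budget comparison: equirectangular vs. cubemap.
public class ProjectionPixelBudget : MonoBehaviour
{
    void Start()
    {
        int equirectW = 4096;
        int equirectH = 2048;
        long equirectPixels = (long)equirectW * equirectH;     // 8,388,608

        // Rule of thumb: cube face size is roughly a quarter of the equirect width.
        int faceSize = equirectW / 4;                           // 1024
        long cubemapPixels = 6L * faceSize * faceSize;          // 6,291,456

        float ratio = (float)cubemapPixels / equirectPixels;    // 0.75, i.e. 25% fewer
        Debug.Log("Cubemap uses " + (ratio * 100f) + "% of the equirectangular pixel count");
    }
}
```
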
I will try to do some resolution tests next week, but from the limited info out there for the Rift it seems you would be looking at around 4600 pixels of horizontal resolution (so divide by 4, roughly 1150 px, for a single cube face). I also found that sharpening helps, and if the viewer (e.g. Whirligig) supports supersampling, that helps a lot as well.
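
For reference, a small sketch of that face-size arithmetic plus in-engine supersampling. The class name PhotosphereQualitySettings is illustrative, and VRSettings.renderScale is the Unity 5.x-era API (renamed XRSettings.eyeTextureResolutionScale in later versions), so adjust for your Unity version:

```csharp
using UnityEngine;
using UnityEngine.VR;   // UnityEngine.XR in newer Unity versions

// Sketch: derive a per-face target size and raise the render scale for supersampling.
public class PhotosphereQualitySettings : MonoBehaviour
{
    void Start()
    {
        // ~4600 px of useful horizontal panorama resolution works out to
        // roughly 4600 / 4 = 1150 px per cube face.
        int targetHorizontal = 4600;
        int faceSize = targetHorizontal / 4;
        Debug.Log("Aim for cube faces of roughly " + faceSize + " x " + faceSize + " px");

        // Supersampling: render the eye buffers above native resolution (costs GPU time).
        // Values around 1.25-1.5 are a common starting point.
        VRSettings.renderScale = 1.25f;
    }
}
```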