Forum Discussion

Nyast
Honored Guest
12 years ago

SDK's Distortion Scale vs 1080p

Reading chapter 5.5.3 of the SDK doc about Distortion Scale:
The simplest solution is to increase the scale of the input texture, controlled by the Scale variable of the
distortion pixel shader discussed earlier. As an example, if we want to increase the perceived input texture
size by 25% we can adjust the sampling coordinate Scale by a factor of (1/1.25) = 0.8.


For the 1280 x 800 resolution of the Rift, a 25% scale increase will require
rendering a 1600 x 1000 buffer


The doc speaks about a 25% scaling factor. However, when I run the samples, StereoConfig::updateDistortionOffsetAndScale calculates a factor of 1.714 (71%). That means rendering to a 2194 x 1371 buffer, which is a pretty significant difference. Was that 25% just an example, or does it apply to an older Rift version?

Will the consumer, hopefully-1080p version require a 71% scale factor too? That would mean rendering to a roughly 3291 x 1851 buffer. The GPU would need to be a beast to render complex scenes, like those in games, at such a high resolution. That's worrying me a bit.
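For concreteness, here is a small sketch of the arithmetic being discussed (my own illustration, not SDK code): the shader's sampling-coordinate Scale is the reciprocal of the desired perceived size increase, and the render buffer grows linearly with that increase.

```python
# Illustration of the distortion-scale arithmetic (not actual SDK code).

def shader_coord_scale(size_increase):
    """Sampling-coordinate Scale for a desired perceived size increase.
    A 1.25x (25%) increase gives Scale = 1/1.25 = 0.8, as in the doc."""
    return 1.0 / size_increase

def render_buffer_size(display_w, display_h, size_increase):
    """Render target dimensions needed for a given linear scale increase."""
    return (round(display_w * size_increase), round(display_h * size_increase))

# Doc's 25% example on the 1280x800 dev kit:
#   render_buffer_size(1280, 800, 1.25)  -> (1600, 1000)
# The 1.714 factor reported by StereoConfig:
#   render_buffer_size(1280, 800, 1.714) -> (2194, 1371)
```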

1 Reply

  • tlopes
    Honored Guest
    I believe that the 25% increase was given as an example. You should be using something much closer to the Scale value given in the SDK; however, you do have the ability to trade performance for quality by going up or down from that value. For instance, I use "100%" scaling in my engine (rendering everything at 2560x1600) and get great-looking post-distortion images (of course, my problem is getting it to do that at a steady 60 FPS).
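One thing worth keeping in mind when trading quality against performance this way (my own back-of-the-envelope math, not something from the SDK): the fill-rate cost grows with the square of the linear scale factor.

```python
# Rough cost comparison for different linear scale factors (illustrative only).

def pixel_cost_ratio(linear_scale):
    """How many times more pixels get shaded vs. rendering at native resolution."""
    return linear_scale ** 2

# SDK-reported 1.714 factor: ~2.94x the native pixel count
# "100%" scaling (2.0x linear, i.e. 2560x1600 on a 1280x800 panel): 4x
```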