01-28-2024 09:58 PM - edited 01-28-2024 10:00 PM
I see no mention of this anywhere, but the Quest 3 lenses have less pincushion distortion, so in theory the rendering resolution required to achieve native panel resolution after distortion correction is much lower?
This image from the Population: One developers is pretty good at showing how the two headsets differ in how their images need to be distorted.
The reason I ask is that the Godlike setting in Virtual Desktop is just a scaling of the Quest 2's resolution requirement, which surely isn't correct? Or is it correct, and the pancake lenses simply give you more pixels per degree than you would otherwise get with Fresnel lenses?
Hard to tell, because the centre lens distortion looks similar, with the pancake lenses showing a clear difference only at the edges.
01-29-2024 03:20 AM
I need to write a native app to display the values, but using the SDK over Air Link to request the resolution needed for 1:1 screen pixels to rendered texels in the centre of vision, I get:
Rift-S: 1648 x 1776 (scale factor from native: 1.29 x 1.23)
Quest 2: 2704 x 2736 (scale factor from native: 1.48 x 1.43)
Quest 3: 2064 x 2240 (scale factor from native: 1.0 x 1.01)
Air Link could be messing with those last two; I need to look deeper. But it does make sense that the Quest 3 lenses mean we don't need a higher resolution to hit panel resolution after distortion.
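For what it's worth, those scale factors follow from dividing the reported render targets by each headset's per-eye panel resolution. A quick sketch (the panel figures below are the commonly published per-eye specs, not values queried from the SDK):

```python
# Per-eye panel resolutions (published specs, not from the SDK).
panels = {
    "Rift-S":  (1280, 1440),
    "Quest 2": (1832, 1920),
    "Quest 3": (2064, 2208),
}
# Recommended render targets reported via ovr_GetFovTextureSize above.
render = {
    "Rift-S":  (1648, 1776),
    "Quest 2": (2704, 2736),
    "Quest 3": (2064, 2240),
}

for hmd, (pw, ph) in panels.items():
    rw, rh = render[hmd]
    # Per-axis scale factor from native panel to recommended render target.
    print(f"{hmd}: {rw / pw:.3f} x {rh / ph:.3f}")
```

The ratios come out close to the 1.29 x 1.23, 1.48 x 1.43 and 1.0 x 1.01 figures above, which suggests the numbers were produced exactly this way.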
01-29-2024 08:33 AM - edited 01-29-2024 08:35 AM
How did you arrive at these numbers? The Quest 2 figure definitely matches Meta's claims, but the Quest 3 does look to have some barrel distortion in the centre in the image above.
If that's the case and it really is only 1.0 x 1.01, then the Quest 3 is easier to run than the Quest 2, and everyone is supersampling too much in Virtual Desktop's Godlike mode?
I can say that in native titles like Eleven Table Tennis, using the in-game settings, I see no difference past 150%, but the game doesn't exactly have high-resolution textures. That would be 2520 horizontal pixels per eye, if 150% is just a multiplier on the game's default rendering resolution?
The other issue with Air Link and Oculus Link is that the encode width is limited to 4032 - or is it? If I set it to 4128, does the app just display 4032 while actually encoding at the higher width?
01-29-2024 09:02 AM
For example, on the Quest 2 I saw smaller differences above 5108. Almost as if it's a logarithmic scale: as you get closer to the required value, you're already not far off, and you only get small gains.
01-29-2024 03:33 PM
@echo.sonic.x wrote:
How did you come about these numbers?
I called the ovr_GetFovTextureSize function in the Oculus SDK with a pixel density of 1.0. That makes it return the required per-eye resolution for one rendered pixel to match one pixel of the physical screen at the centre of the lens.
As I said, though, I don't trust the numbers 100%, because Air Link may be interfering with them. The Quest 2's higher resolution is believable; the Quest 3 might be cheating. I'd rather get the data natively on the headset, but that's more annoying (I already have a PC app I wrote: OculusMonitor).
The other thing I could do (but don't have code for yet) is extract the distortion mesh. That's the thing that stretches the centre pixels, so it might be useful.
01-29-2024 07:38 PM
Do you know if it's a logarithmic increase, where each step of supersampling (over native) gives only marginal gains in the centre until you hit the necessary 1:1 pixel ratio? On the Quest 2, I found that in native games 170% was more than enough sharpness, which is far below the 5408 that Oculus Link needed.
01-29-2024 08:48 PM
It depends on where (or by whom) the scale is applied.
In the Oculus SDK, the scale factor applies linearly to each axis, which means the pixel count grows with the square of the factor.
If the panel resolution were 1000x1000 and you applied 200% scaling through the Oculus SDK (pixel density 2.0), the game would render at 2000x2000 (4 times as many pixels) and then warp that to fit the panel (downscaling to 1000x1000 while bulging the centre to keep the quality there).
In SteamVR, the scale factor is linear in total pixel count, so the square root of the factor applies to each axis. 200% scaling would mean 1414x1414 (2 times as many pixels).
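The two conventions can be sketched like this (function names are mine, not from either SDK):

```python
import math

def oculus_scale(w, h, pixel_density):
    # Oculus SDK convention: the factor applies to each axis directly,
    # so total pixel count grows with the square of the factor.
    return round(w * pixel_density), round(h * pixel_density)

def steamvr_scale(w, h, percent):
    # SteamVR convention: the percentage applies to total pixel count,
    # so each axis is scaled by the square root of the factor.
    axis = math.sqrt(percent / 100)
    return round(w * axis), round(h * axis)

print(oculus_scale(1000, 1000, 2.0))   # (2000, 2000): 4x the pixels
print(steamvr_scale(1000, 1000, 200))  # (1414, 1414): 2x the pixels
```

So "200%" in one tool is not "200%" in the other, which is part of why supersampling comparisons across Virtual Desktop, Link and SteamVR get confusing.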
If it's a game's internal scaling factor, it could be anything. Games can ignore all of these and pick any resolution they want.
For Link, it's also dealing with a compressed (H.264 or H.265) video stream, with encoding quality loss compared to the direct lossless rendering of native apps, so it may need a higher resolution to look good.
01-31-2024 11:05 PM
When I was talking about logarithmic increases, I was thinking about it like this:
The Quest image is distorted to match the profile of the lens. If you render at native panel resolution, the distortion in the centre causes a rescaling that loses sharpness. But because of whatever image filtering is used, it's not a 1:1 loss of sharpness. So while you have to supersample to hit a certain effective resolution, that isn't a hard requirement: beyond a certain point you may get close enough.
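As a rough back-of-the-envelope model of that (the 1.48 stretch factor is just the Quest 2 per-axis figure quoted earlier in the thread, and this linear model ignores how the distortion actually varies across the lens):

```python
def centre_texels_per_pixel(render_scale, centre_stretch):
    # If distortion correction magnifies the image centre by
    # centre_stretch per axis, then rendering at render_scale times
    # panel resolution leaves the centre sampled at
    # render_scale / centre_stretch texels per physical pixel
    # (1.0 means the 1:1 target is met).
    return render_scale / centre_stretch

# Quest 2, centre stretch ~1.48 per axis:
print(centre_texels_per_pixel(1.0, 1.48))   # native render: under 1:1
print(centre_texels_per_pixel(1.48, 1.48))  # 1.0, i.e. 1:1 in the centre
```

Under this model, rendering at native gives roughly 0.68 texels per centre pixel, and each step of supersampling closes part of the remaining gap, which would explain the diminishing visible returns before you reach the full 1:1 value.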
Quoting the above: "compared to direct lossless rendering for native apps, so it may need higher res to make it look good."
In that case you would need to supersample the video, but Link doesn't even encode at the Quest 3's native resolution, and you get noticeable aliasing because of it.