Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨
Sporky
Explorer
13 years ago

How about doing the distortion in hardware?

If the barrel distortion were done in the Rift, the device would suddenly be compatible with all sorts of existing 3D sources.

7 Replies

  • This may work, but I could see it adding delay or lag between the controller and the display. It could also drive up the cost.
  • Indigo
    Honored Guest
    Honestly, I don't think it's worth it. As far as I know, the barrel distortion is just a matrix multiplication on the frame buffer (well, two really, since we are rendering two points of view).

    Compared to the rest of the video pipeline, it really doesn't add any significant delay.

    Also consider that your GPU is likely orders of magnitude faster at doing these two multiplications than any chip the Oculus team could add.

    Maybe someone with a better understanding of video pipeline delays and bottlenecks would care to comment?

    I'd recommend keeping the transformation in software, simply because that makes it easier to adjust for different users, eye cups, game-specific effects, after-market LCD screens, etc.

    Although... this WOULD solve some of the problems with spectating, since users observing without wearing the Rift would be able to make more sense of what's going on...

    Humm....
  • The barrel distortion is a per-pixel operation that moves the pixels in the source image to a different location in the final rendered image.

    I made a rough measurement of its effect on a really slow system (averaging about 20 fps), and it added about 10% to the rendering time, though I suspect it would be less of an issue on faster systems.

    I do have serious doubts you could create dedicated hardware to do it without spending a lot of money on it, though, which seems kind of pointless.

    Its effect on rendering is not trivial, but it's not so large that the expense of dedicated hardware would really be worthwhile.
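    The per-pixel remap described above can be sketched in plain Python. This is a toy illustration only, not the Oculus SDK's implementation: the radial polynomial form and the `k1`/`k2` coefficient values are assumptions chosen for demonstration.

    ```python
    def barrel_distort(src, k1=0.22, k2=0.24):
        """Warp a 2D grid of pixels with a radial distortion polynomial.

        For each destination pixel, compute its normalized offset (x, y)
        from the image centre, scale the offset by (1 + k1*r^2 + k2*r^4),
        and sample the source at the scaled location -- one short
        computation and one lookup per pixel.
        """
        h, w = len(src), len(src[0])
        cx, cy = (w - 1) / 2, (h - 1) / 2
        out = [[0] * w for _ in range(h)]
        for j in range(h):
            for i in range(w):
                x, y = (i - cx) / cx, (j - cy) / cy   # normalize to [-1, 1]
                r2 = x * x + y * y
                scale = 1 + k1 * r2 + k2 * r2 * r2    # radial polynomial
                sx = int(round(cx + x * scale * cx))  # source pixel to sample
                sy = int(round(cy + y * scale * cy))
                if 0 <= sx < w and 0 <= sy < h:       # outside source -> black
                    out[j][i] = src[sy][sx]
        return out
    ```

    The centre pixel is a fixed point (scale is 1 there), while pixels near the edge sample from progressively further out, which pulls the image content inward: that is the barrel warp that the lens's pincushion distortion then cancels. On a GPU the same per-pixel computation runs as a fragment shader over the frame buffer, which is why the cost measured above is so modest.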
  • Doing it in software is absolutely fine IF someone is willing to write or re-write the game or other application to accommodate it. That's going to leave out a lot of existing stereoscopic content. What about stereo telepresence rigs and the like? The Rift isn't the first stereoscopic device. There is lots of existing content and a number of existing stereoscopic formats that would be compatible with it if special code didn't have to be added to deal with the distortion. Doing it in hardware might not be much of a problem or expense. It would certainly make the Rift useful for more applications. Perhaps it's worth looking into before dismissing the idea.

    Any chance some tricky optics could eliminate the need for the distortion code?
  • Even with everything done in hardware, the game would still need to be modified to render the VR perspective, either by rendering with two cameras or by adding the SDK to produce the left and right eye views.
  • Complicated optics being too expensive is the whole reason the Rift is built the way it is... :roll:

    I recall ATI drivers used to be able to apply shader effects to full-screen images in the driver, though.
    Since it was forced in the driver, the shader was applied regardless of what a game did:
    anything that actually used the 3D hardware would have the selected shader applied.

    Overall, getting something like that implemented would be far more practical than trying to get the Rift hardware to do it.