
Cuda Pixel Shader for Distortion

RWEye
Honored Guest
Has anyone looked into using Cuda for the distortion correction / pixel shader? Would it be worth the effort?
2 REPLIES

jherico
Adventurer
"RWEye" wrote:
Has anyone looked into using Cuda for the distortion correction / pixel shader? Would it be worth the effort?


CUDA is an API for accelerating general-purpose computing tasks using the parallelism available on various hardware, usually GPUs (in fact, CUDA is specific to NVIDIA GPUs), and it's similar to an open standard called OpenCL. However, the distortion shader isn't general-purpose computing. Its sole purpose is to look up offsets in a texture that you will then render to the screen, and since it's written as a fragment shader it's already running on the GPU (at least on any hardware setup that would also support CUDA).
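
To give a sense of how little "general purpose" computing there is in that pass, here is roughly the per-pixel work involved, written as a CUDA kernel purely for comparison (the texture setup, names, and output format are my own assumptions, not the actual SDK shader):

#include <cuda_runtime.h>

// Per-pixel distortion pass, expressed as a CUDA kernel for comparison.
// lutTex holds precomputed source coordinates; eyeTex is the rendered
// eye image. Both are assumed to be normalized-coordinate, linearly
// filtered texture objects set up by the host code.
__global__ void distortionPass(cudaTextureObject_t lutTex,
                               cudaTextureObject_t eyeTex,
                               uchar4* out, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Destination pixel in normalized [0, 1] coordinates.
    float u = (x + 0.5f) / width;
    float v = (y + 0.5f) / height;

    // Look up where this output pixel should sample the eye image...
    float2 src = tex2D<float2>(lutTex, u, v);

    // ...and fetch that sample. This is the entire "computation".
    float4 c = tex2D<float4>(eyeTex, src.x, src.y);
    out[y * width + x] = make_uchar4((unsigned char)(c.x * 255.0f),
                                     (unsigned char)(c.y * 255.0f),
                                     (unsigned char)(c.z * 255.0f), 255);
}

In a real pipeline you would still have to get that output buffer onto the screen, which is exactly the step the fragment shader already handles for free.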

In theory, you could use CUDA to pre-compute lookup textures and then pass them to DirectX or OpenGL for use by a shader, but why bother? Either you'd compute the data and pull it out of video card memory, just so you could put it back into video card memory for use by the lookup shader, or you'd jump through a bunch of extra hoops (the CUDA/OpenGL interop API) to make the CUDA data available to OpenGL.
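
To make that second path concrete, here's a rough sketch of what it could look like, assuming a simple polynomial radial-distortion model and an existing GL_RG32F OpenGL texture (the kernel, function, and parameter names are made up for illustration). Note how much of it is plumbing rather than computation:

#include <cuda_runtime.h>
#include <cuda_gl_interop.h>
#include <GL/gl.h>  // platform GL header; adjust for your setup

// Fill a lookup table with distorted source coordinates using a simple
// polynomial radial model (k1, k2 are illustrative coefficients).
__global__ void computeDistortionLut(float2* lut, int width, int height,
                                     float k1, float k2)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Normalized coordinates in [-1, 1], centered on the image.
    float u = 2.0f * (x + 0.5f) / width  - 1.0f;
    float v = 2.0f * (y + 0.5f) / height - 1.0f;
    float r2 = u * u + v * v;
    float scale = 1.0f + k1 * r2 + k2 * r2 * r2;

    // Store the source coordinate back in [0, 1] texture space.
    lut[y * width + x] = make_float2(0.5f * (u * scale + 1.0f),
                                     0.5f * (v * scale + 1.0f));
}

// Compute the table in CUDA, then hand it to an existing GL_RG32F
// OpenGL texture via the CUDA/OpenGL interop API (the "extra hoops").
void uploadDistortionLut(GLuint glTexture, int width, int height,
                         float k1, float k2)
{
    float2* d_lut = nullptr;
    cudaMalloc(&d_lut, width * height * sizeof(float2));

    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);
    computeDistortionLut<<<grid, block>>>(d_lut, width, height, k1, k2);

    // Register and map the GL texture so CUDA can see its backing array.
    cudaGraphicsResource* resource = nullptr;
    cudaGraphicsGLRegisterImage(&resource, glTexture, GL_TEXTURE_2D,
                                cudaGraphicsRegisterFlagsWriteDiscard);
    cudaGraphicsMapResources(1, &resource);
    cudaArray_t array = nullptr;
    cudaGraphicsSubResourceGetMappedArray(&array, resource, 0, 0);

    // Copy the computed table into the texture (widths are in bytes).
    cudaMemcpy2DToArray(array, 0, 0, d_lut, width * sizeof(float2),
                        width * sizeof(float2), height,
                        cudaMemcpyDeviceToDevice);

    cudaGraphicsUnmapResources(1, &resource);
    cudaGraphicsUnregisterResource(resource);
    cudaFree(d_lut);
}

Everything after the kernel launch exists only to move data the GPU already has into a place OpenGL can read it from.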

Either way it's a bunch more complexity for zero benefit. There's absolutely no way a CUDA-assisted version of the distortion is going to be faster than the shader version, unless NVIDIA has done something incredibly stupid, because they're just two different ways of utilizing the same hardware.
Brad Davis - Developer for High Fidelity | Co-author of Oculus Rift in Action

RWEye
Honored Guest
OK, thanks for the information... as you may have guessed, I am not very familiar with shaders!