Normal Maps Rendered Per Eye

Honored Guest
I have a question that may seem stupid to some of you. It’s generally accepted that normal maps don’t work in VR except for minute details because we have a stereoscopic view in VR. But can’t we make a shader that calculates what a normal map does to the object’s lighting per eye to restore the illusion?
It must be that this won’t work because the solution sounds so simple that someone must have tried it in the last 10 years and it’s not common. But maybe someone could explain why it wouldn’t work to those of us with smaller brains.


In principle, everything a normal map does could be computed from real surface triangles alone, with no normal map at all. But that means enormously more geometry, which is far too expensive for real-time rendering.

Normal maps exist to add fine lighting detail in games without increasing the triangle count.
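To make that concrete, here is a minimal sketch of the standard tangent-space encoding: each texel's RGB values in [0, 1] are remapped to a unit vector with components in [-1, 1], which then replaces the geometric normal in the lighting calculation. (The helper name `decode_normal` is just for illustration.)

```python
import numpy as np

def decode_normal(rgb):
    """Decode a normal-map texel (channels in [0, 1]) into a
    tangent-space unit vector with components in [-1, 1]."""
    n = np.asarray(rgb, dtype=float) * 2.0 - 1.0
    return n / np.linalg.norm(n)

# The "flat" texel (128, 128, 255) decodes to approximately (0, 0, 1),
# i.e. a normal pointing straight out of the surface:
flat = decode_normal([128 / 255, 128 / 255, 255 / 255])
print(flat)
```

Any texel that deviates from that flat blue color tilts the shading normal and so changes the lighting, even though the underlying geometry is untouched.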

In VR games, the scene must be rendered separately for the left and right eyes, so everything costs roughly twice as much. That is why normal maps (or other shading detail) are sometimes cut back: to keep the frame rate up.


Normal maps work fine in VR, and the per-eye lighting you describe already happens: each eye has its own camera position and therefore its own view vector, so any view-dependent shading from the normal map is evaluated separately per eye. What a normal map cannot do is change the actual geometry, so there is no parallax between the eyes and no change to the silhouette. That lack of depth exists in flat-screen games too; stereo just makes it much easier to see.
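The point above can be demonstrated with a small sketch, not a real shader: Blinn-Phong shading of one surface point with a normal-map-perturbed normal, evaluated from two eye positions about 64 mm apart. The specific positions and the `shade` helper are illustrative assumptions, but they show that the specular term already differs per eye with ordinary normal mapping.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade(normal, surface_pos, eye_pos, light_dir, shininess=32.0):
    """Blinn-Phong diffuse + specular using a (normal-mapped) normal."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(eye_pos - surface_pos)   # view vector: differs per eye
    h = normalize(l + v)                   # half vector: also differs per eye
    diffuse = max(np.dot(n, l), 0.0)
    specular = max(np.dot(n, h), 0.0) ** shininess
    return diffuse + specular

# A normal tilted away from the geometric normal (0, 0, 1),
# as if fetched from a normal-map texel:
mapped_normal = np.array([0.3, 0.1, 0.95])

surface = np.array([0.0, 0.0, 0.0])
light   = np.array([0.2, 0.8, 1.0])

# Left and right eyes roughly 64 mm apart, both looking at the point:
left_eye  = np.array([-0.032, 0.0, 0.5])
right_eye = np.array([ 0.032, 0.0, 0.5])

left  = shade(mapped_normal, surface, left_eye,  light)
right = shade(mapped_normal, surface, right_eye, light)

# The two eyes get different shading automatically -- no special
# "per-eye normal map" shader is needed. What stays identical is the
# geometry itself, which is why the depth illusion is still missing.
print(left != right)  # → True
```

The diffuse term is the same for both eyes (it has no view dependence), while the specular term differs; that split is exactly why stereo shading "just works" but stereo parallax does not.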