Forum Discussion
VirtualWolf
10 years ago · Honored Guest
Client Distortion Mesh
Hi,
I'm trying to implement client rendering and I'm having some difficulty with the distortion mesh. I can obtain distortion mesh data from the SDK, and the vertices make sense, but I am having problems getting sensible UVs. The OpenGL examples I have found simply take the various "tan eye space" values, multiply them by the UV scaling factor and then add the offset, but when I do this the majority of the values are greater than 1.0 (so not valid UVs).
I've spent a few days trying to figure out the sample code, but the single enormous application supplied with the SDK doesn't really lend itself to learning because the small fragments of code I want are fairly buried and the explanations in the documentation are a bit vague.
Is anybody able to point me to (or even write) a simple, concise explanation of what I need to do with these values in order to get UVs?
Thanks
2 Replies
- skavee · Explorer
I implemented client-side distortion rendering a while ago as a demo for my thesis, using fixed-pipeline calls and display lists. Later, out of interest, I switched to a more modern approach with VBOs and shaders.
The overview of what happens is basically:
ovrHmd_CreateDistortionMesh:
The vertices you get for the left eye will lie in the left half of your window, while the right eye's vertices lie on the right side. So basically two views into one window.
The texture coordinates are in TanEyeAngle space. This means that if you multiply these values by the UV scale (UVScale) and add the horizontal and vertical offset (UVOffset), you get the final texture coordinate. This is why you need ovrHmd_GetRenderScaleAndOffset.
The conversion from TanEyeAngles to UVs is done in the vertex shader. Somewhere in the SDK there is a fairly large .h/.cpp file with different shaders you can use.
The following is the shader I used for my project (without timewarp), so it is quite simple.
/*
Shaders.h
*/
// Vertex Shader
static const char* vertexShader =
"#version 110\n"
"uniform vec2 EyeToSourceUVScale;\n"
"uniform vec2 EyeToSourceUVOffset;\n"
"attribute vec4 Position;\n"
//"attribute vec4 Color;\n"
// Vignette Fade and TimeWarpFactor encoded into Pos.z and Pos.w
"attribute vec2 TexCoord0;\n"
"attribute vec2 TexCoord1;\n"
"attribute vec2 TexCoord2;\n"
"varying vec4 oColor; \n"
"varying vec2 oTexCoord0;\n"
"varying vec2 oTexCoord1;\n"
"varying vec2 oTexCoord2;\n"
"void main() {\n"
" gl_Position.x = Position.x;\n"
" gl_Position.y = Position.y;\n"
" gl_Position.z = 0.5;\n"
" gl_Position.w = 1.0;\n"
// Vertex inputs are in TanEyeAngle space for the R,G,B channels
// (i.e. after chromatic aberration and distortion).
// Scale them into the correct [0-1],[0-1] UV lookup space (depending on eye)
" oTexCoord0 = TexCoord0 * EyeToSourceUVScale + EyeToSourceUVOffset;\n"
" oTexCoord0.y = 1.0-oTexCoord0.y;\n"
" oTexCoord1 = TexCoord1 * EyeToSourceUVScale + EyeToSourceUVOffset;\n"
" oTexCoord1.y = 1.0-oTexCoord1.y;\n"
" oTexCoord2 = TexCoord2 * EyeToSourceUVScale + EyeToSourceUVOffset;\n"
" oTexCoord2.y = 1.0-oTexCoord2.y;\n"
" oColor = vec4(Position.w, Position.w, Position.w, Position.z);\n" // Used for VignetteFade.
"}\n";
// Fragment Shader
static const char* fragmentShader =
"#version 110 \n"
"uniform sampler2D Texture;\n"
"varying vec4 oColor;\n"
"varying vec2 oTexCoord0;\n"
"varying vec2 oTexCoord1;\n"
"varying vec2 oTexCoord2;\n"
"void main() {\n"
//"gl_FragColor = oColor * texture2D(Texture, oTexCoord1);\n"
" gl_FragColor.r = oColor.r * texture2D(Texture, oTexCoord0).r;\n"
" gl_FragColor.g = oColor.g * texture2D(Texture, oTexCoord1).g;\n"
" gl_FragColor.b = oColor.b * texture2D(Texture, oTexCoord2).b;\n"
" gl_FragColor.a = 1.0;\n"
"}\n";
Here's my VBO example. It was for testing purposes, so don't expect cleanly written source (but it works), and I think it is quite straightforward to understand. The only thing I did not get to work was direct mode.
CRender.cpp
http://pastebin.com/BChxCgPR
CRender.h
http://pastebin.com/875r1M7v
Shaders.h
http://pastebin.com/hgY1TTiN
Put this in your main.cpp
oculusVr test;
test.ovrInit(-1); // debug hmd
test.ovrSetParams(640,480); // window size
test.oglInit();
test.ovrMeshVbo();
test.loadShader();
test.loadTexture();
test.oglRender();
test.oglDeinit();
test.ovrDeinit();
- obzen · Expert Protege
Wondering if anyone tried to 'reverse engineer', so to speak, that part of the process for ray casting.