Forum Discussion

rcapote · Honored Guest · 10 years ago
What is wrong with my projection?
I'm working on integrating OVR into my program, but I've run into a weird issue where my triangles are being skewed incorrectly. Here are two images to demonstrate:
http://i.imgur.com/FtpGXQ6.png
Without Oculus:
http://i.imgur.com/lq2PJf1.png
So, the cube is not exact since I'm just holding the HMD up to get a screenshot, but generally, when the HMD is horizontal (normal orientation) the cube looks rectangular, and when the HMD is rotated, the cube looks more square. Where is my setup incorrect?
Setup code:
// ...
// Set up each eye
for(ovrEyeType eye = ovrEyeType::ovrEye_Left; eye < ovrEyeType::ovrEye_Count; eye = static_cast<ovrEyeType>(eye + 1))
{
    ovrSizei eyeTextureSize = ovrHmd_GetFovTextureSize(vrHmd, eye, vrHmd->DefaultEyeFov[eye], 1.f);
    ovrTextureHeader& eyeTextureHeader = _vrContext.eyeTextures[eye].Header;
    eyeTextureHeader.TextureSize = eyeTextureSize;
    eyeTextureHeader.RenderViewport.Size = eyeTextureSize;
    eyeTextureHeader.API = ovrRenderAPI_OpenGL;
}
// ...
ovrGLConfig cfg;
cfg.OGL.Header.API = ovrRenderAPI_OpenGL;
ovrSizei rtsize;
rtsize.w = _vrContext.resWidth;
rtsize.h = _vrContext.resHeight;
cfg.OGL.Header.BackBufferSize = rtsize;
cfg.OGL.Header.Multisample = 0;
int distortionCaps = 0
    | ovrDistortionCap_Vignette
    | ovrDistortionCap_Chromatic
    | ovrDistortionCap_Overdrive
    | ovrDistortionCap_TimeWarp;
// TODO: Special cases on Linux
int configResult = ovrHmd_ConfigureRendering(vrHmd, &cfg.Config, distortionCaps, vrHmd->DefaultEyeFov, _vrContext.eyeRenderDescs);
for(ovrEyeType eye = ovrEyeType::ovrEye_Left; eye < ovrEyeType::ovrEye_Count; eye = static_cast<ovrEyeType>(eye + 1))
{
    const ovrEyeRenderDesc& eyeRenderDesc = _vrContext.eyeRenderDescs[eye];
    // Per-eye projection
    _vrContext.eyeProj[eye] = toGLM(ovrMatrix4f_Projection(vrHmd->DefaultEyeFov[eye], 0.1f, 1024.f, true));
    _vrContext.eyeOffsets[eye] = eyeRenderDesc.HmdToEyeViewOffset;
    _vrContext.eyeFbo[eye] = GL::createFrameBuffer(
        _vrContext.eyeTextures[eye].Header.TextureSize.w, _vrContext.eyeTextures[eye].Header.TextureSize.h);
    auto& eyeTextureHeader = _vrContext.eyeTextures[eye];
    ((ovrGLTexture&)eyeTextureHeader).OGL.TexId = _vrContext.eyeFbo[eye].colorBuffer;
}
View matrix:
return glm::scale(glm::inverse(VR::toGLM(VR::_vrContext.eyePoses[eye])), glm::vec3(OVR_DEFAULT_IPD));
5 Replies
- cybereality (Grand Champion): This could be an aspect ratio issue, or possibly an incorrectly set FOV.
- jherico (Adventurer): Can you show the code where you're actually applying the projection to the rendering, i.e., setting the projection uniform (or calling glLoadMatrix if you're using the older fixed-function pipeline)?
- rcapote (Honored Guest): In my OP, the clip marked "view matrix" is the body of the function VR::getViewMatrix(eye). I'm currently testing in extended mode, and I default the window size to 1920x1080. Each eye texture is { w=1182, h=1461 }.
In my loop, for each eye I call:
glm::mat4 viewMat = VR::getViewMatrix(eye);
_context.projViewMatrix = VR::_vrContext.eyeProj[eye] * viewMat;
My function for assigning the view matrix to the shader:
void setUniformMat4(const char* name, glm::mat4 matrix)
{
    GLuint location = glGetUniformLocation(RenderContext::_context.boundShader.program, name);
    glUniformMatrix4fv(location, 1, GL_FALSE, glm::value_ptr(matrix));
}
- jherico (Adventurer):
"rcapote" wrote:
glm::mat4 viewMat = VR::getViewMatrix(eye);
_context.projViewMatrix = VR::_vrContext.eyeProj[eye] * viewMat;
Granted, you can compose the view and projection like this, but most people don't. Particularly when it comes to lighting, it's very important to be able to access the modelview matrix independently of the projection matrix."rcapote" wrote:
    void setUniformMat4(const char* name, glm::mat4 matrix)
    {
        GLuint location = glGetUniformLocation(RenderContext::_context.boundShader.program, name);
        glUniformMatrix4fv(location, 1, GL_FALSE, glm::value_ptr(matrix));
    }
You might want to try adding some logging or a breakpoint to verify that the returned location isn't -1 (although this seems unlikely as the cause of your error: if it were, you wouldn't see any change in the image as you moved the Rift, since you're putting the view transform in the same matrix).
What about your vertex shader, what does that look like?
- rcapote (Honored Guest): The vertex shader is nothing special. I currently don't use the TexCoords.
#version 330
layout(location = 0) in vec4 VertexPosition;
layout(location = 1) in vec2 UV;
uniform mat4 uTransformMatrix;
out vec2 TexCoords;
out vec4 VertColor;

void main(void) {
    gl_Position = uTransformMatrix * VertexPosition;
    if(VertexPosition.x > 0) {
        VertColor = vec4(1.f, 0.f, 0.f, 1.f);
    } else {
        VertColor = vec4(0.f, 1.f, 0.f, 1.f);
    }
    TexCoords = UV;
}