Forum Discussion
jherico
12 years ago · Adventurer
Steam VR API issues
I tried posting this in the Steam VR discussion forums, but so far I've gotten no response at all. I'm wondering if anyone else has tried working with the Steam VR API in OpenGL and encountered the same issues.
Cross-posted message:
I'm attempting to write a small sample program that renders a rotating color cube using the SteamVR API, but I've noticed a number of issues, and I'm not sure which are down to my use of the API and which might be bugs.
I've got a Rift DK1 connected.
First issue: the viewport query for each eye.
pHMD->GetEyeOutputViewport(eye, vr::API_OpenGL, &vppos.x, &vppos.y, &vpsize.x, &vpsize.y);
This populates vppos.y with 800, which is the DK1's vertical resolution, and that doesn't make sense as a viewport position. For the DK1 each eye's viewport should be 640x800, positioned at (0, 0) for the left eye and (640, 0) for the right. Everything comes back correct except the Y position, which is always 800.
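A workaround, if I end up needing one, would just be to force the Y origin to zero whenever it comes back as the panel height. This is only a guess on my side, not how I expect the API is meant to behave (vppos and vpsize are glm::uvec2 in my code, matching the uint32_t* parameters):

pHMD->GetEyeOutputViewport(eye, vr::API_OpenGL, &vppos.x, &vppos.y, &vpsize.x, &vpsize.y);
// Workaround guess: 800 is the DK1's vertical resolution and can never be a valid
// viewport origin, so treat it as 0.
if (vppos.y == 800) {
    vppos.y = 0;
}
glViewport(vppos.x, vppos.y, vpsize.x, vpsize.y);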
Next up are the matrices. I'm using GLM to store all of mine, so I use the following code to convert from the SteamVR matrix types to GLM:
vr::HmdMatrix44_t matrix = pHMD->GetProjectionMatrix(eye, 0.01f, 1000.0f, vr::API_OpenGL);
memcpy(&ea.projection[ 0 ], matrix.m, sizeof(float)* 16);
ea.projection = glm::transpose(ea.projection);
matrix = pHMD->GetEyeMatrix(eye);
memcpy(&ea.modelview[ 0 ], matrix.m, sizeof(float)* 16);
ea.modelview = glm::transpose(ea.modelview);
glm::mat4 mm = glm::translate(glm::mat4(), glm::vec3(0.0317500010f, 0, 0));
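(The transposes are there because, as far as I can tell, the HmdMatrix types are laid out row-major while GLM is column-major. Pulled out into a helper just for readability; this is purely my own wrapper, not part of either API:)

// Convenience wrapper (mine, not SteamVR's): copy a row-major HmdMatrix44_t into a
// column-major glm::mat4.
glm::mat4 toGlm(const vr::HmdMatrix44_t & m) {
    glm::mat4 result;
    memcpy(&result[0], m.m, sizeof(float) * 16);
    return glm::transpose(result);
}

ea.projection = toGlm(pHMD->GetProjectionMatrix(eye, 0.01f, 1000.0f, vr::API_OpenGL));
ea.modelview  = toGlm(pHMD->GetEyeMatrix(eye));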
and then this code to apply the projection matrix and the per-eye modelview offset in my rendering:
pm.push().top() = ea.projection;
mv.push().preMultiply(ea.modelview);
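(pm and mv are just matrix-stack helpers from my own code, not anything from SteamVR; the above boils down to roughly this, assuming 'camera' holds the scene's modelview transform:)

glm::mat4 projection = ea.projection;          // per-eye projection straight from SteamVR
glm::mat4 modelview  = ea.modelview * camera;  // per-eye offset applied in eye space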
Here's an image of my output: a 6 cm cube rendered at a distance of 20 cm from the camera. It's clear both from the positioning of the cubes in the frame and from the visible area of the cube's red side in each eye that the per-eye modelview offset is inverted: I should see more of the right side of the cube in the right eye's image, not the left one. Compare with the output of an application using the Oculus SDK.
If, after setting up my matrices, I swap them like so:
std::swap(eyeArgs[ 0 ].modelview, eyeArgs[1].modelview);
Then the rendering looks correct, as far as I can tell.
Finally, as you can see from the output of the two applications, the SteamVR implementation isn't scaling up to the full screen. The Oculus SDK uses a post-distortion scale: after the warp (which shrinks the image) you apply a scale factor so that the distorted image reaches, or at least approaches, a fit point, typically the left edge of the screen on the DK1. The SteamVR API looks like it's using the same distortion values, but without the post-distortion scale, so at best much of the FOV is lost, and at worst the FOV of the rendered image doesn't match the perceived FOV, which can induce motion sickness.
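To be concrete about what I mean by a post-distortion scale (this is the idea as I understand it from the Oculus SDK, not its exact code): evaluate the distortion polynomial at the fit-point radius and use that factor to stretch the warped image back out so the fit point lands on the screen edge again.

// Sketch only. K are the DK1's four distortion coefficients, rFit the radius from the lens
// center to the chosen fit point (e.g. the left edge of the left half of the screen).
float postDistortionScale(const float K[4], float rFit) {
    float rsq = rFit * rFit;
    // The warp scales a radius r by (K0 + K1*r^2 + K2*r^4 + K3*r^6); evaluate that at the
    // fit point and scale the rendered output by the same factor to compensate.
    return K[0] + K[1] * rsq + K[2] * rsq * rsq + K[3] * rsq * rsq * rsq;
}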