
Texturing a sphere

AlexisPontin
Honored Guest
Hello,

I have a problem trying to upload a texture onto a sphere. The goal is to map an equirectangular image onto the sphere so that it rotates as the user moves their head. I have already managed to render a cube with a different color on each face to check the head-tracking movement.

I'm using Qt5 with QGLWidget and the Qt OpenGL interfaces (QGLShaderProgram, etc.).

Now I'm trying to render a sphere using my own shaders:

vertex shader
uniform mat4 mvp_matrix;

attribute vec4 a_position;
attribute vec2 a_texcoord;

varying vec2 v_texcoord;

void main()
{
    // Calculate vertex position in screen space
    gl_Position = mvp_matrix * a_position;

    // Pass texture coordinate to fragment shader
    // Value will be automatically interpolated to fragments inside polygon faces
    v_texcoord = a_texcoord;
}


fragment shader
uniform sampler2D texture;

varying vec2 v_texcoord;

void main()
{
    // Set fragment color from texture
    gl_FragColor = texture2D(texture, v_texcoord);
}


Initialization of the data in the Vertex Buffer Object:

void GeometryEngine::initSphereGeometry(float hfov, float vfov) {

    float rHFov = hfov * M_PI / 180;
    float rVFov = vfov * M_PI / 180;
    QVector<GLfloat> data;

    for (int i = 0; i <= NUM_PARALLELS; i++) {
        // (NUM_PARALLELS + 1) latitude bands covering v in [0, 1].
        // Using (i - 1) / NUM_PARALLELS here would give a negative v0 at
        // i == 0, i.e. a spurious band below the vertical FOV.
        float v0 = i / (float)(NUM_PARALLELS + 1);
        float lat0 = rVFov * (-0.5f + v0);
        float z0 = sinf(lat0);
        float zr0 = cosf(lat0);

        float v1 = (i + 1) / (float)(NUM_PARALLELS + 1);
        float lat1 = rVFov * (-0.5f + v1);
        float z1 = sinf(lat1);
        float zr1 = cosf(lat1);

        for (int j = 0; j <= NUM_MERIDIANS; j++) {
            // u in [0, 1]; (j - 1) / NUM_MERIDIANS would shift the
            // texture sideways by one column.
            float u = j / (float)NUM_MERIDIANS;
            float lng = rHFov * u;
            float x = cosf(lng);
            float y = sinf(lng);

            // Two vertices per step: one on each edge of the current band.
            data.push_back(x * zr0); // X
            data.push_back(y * zr0); // Y
            data.push_back(z0);      // Z
            data.push_back(u);       // U
            data.push_back(v0);      // V

            data.push_back(x * zr1); // X
            data.push_back(y * zr1); // Y
            data.push_back(z1);      // Z
            data.push_back(u);       // U
            data.push_back(v1);      // V
        }
    }
    vbo.bind();
    vbo.allocate(&data.front(), data.size() * sizeof(GLfloat)); // size in bytes
    vbo.release();
}


The following function is called by paintGL() from the widget.
void GeometryEngine::drawSphereGeometry(QGLShaderProgram *program) {

    vbo.bind();

    // Interleaved layout: 3 position floats then 2 texcoord floats per vertex.
    int vertexLocation = program->attributeLocation("a_position");
    program->enableAttributeArray(vertexLocation);
    glVertexAttribPointer(vertexLocation, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), 0);

    int texcoordLocation = program->attributeLocation("a_texcoord");
    program->enableAttributeArray(texcoordLocation);
    glVertexAttribPointer(texcoordLocation, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (const void*)(3 * sizeof(float)));

    // Note: drawing every latitude band as one long strip stitches the end of
    // each band to the start of the next with a few stray triangles; drawing
    // one strip per band (or inserting degenerate vertices) avoids this.
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 2 * (NUM_PARALLELS + 1) * (NUM_MERIDIANS + 1));

    vbo.release();
}


I have several questions:

  • What data should I send in "mvp_matrix" so that the sphere stays at the center of the screen? I'm quite lost with the model-view-projection matrices, but I'm trying to understand how they work. I'm confused by the fact that there are two steps: drawing the sphere, and the distortion done by the SDK.

  • How should I upload the image onto the sphere? For now, I generated 2 textures ("id = 1" for the image, and "id = 0" which is passed to the SDK for distortion rendering), and I don't know how to deal with these two textures. In paintGL I need to send the texture id to the fragment shader using
    program.setUniformValue("texture", 1);
    but I only get a black sphere.


Has anyone already done this kind of project who could help me?
Thanks in advance 🙂

vrdaveb
Oculus Staff
>>> What data should I send into the "mvp_matrix" so that the sphere stays at the center of the screen.

You should use standard perspective projection and model matrices and a stereo view matrix. This means the sphere will be a little to the right for your left eye and a little to the left for your right eye. The projection matrix should use the FOV that you get from the SDK and the view should correspond to the two eye positions, which you can also get from the SDK. Check out OculusWorldDemo and Chapter 4 of the 0.3.1 SDK documentation here: http://static.oculus.com/sdk-downloads/documents/Oculus_SDK_Overview_0.3.1_Preview.pdf.

>>> How should I upload the image onto the sphere? For now, I generated 2 textures ("id = 1" for the image, and "id = 0" which is passed to the SDK for distortion rendering).

Generally, you use glGenTextures, glBindTexture, glTexImage2D, and the glTexParameter* functions to set up textures. Since SDK 0.3.1 changes the texture binding to render distortion, you will need to call glBindTexture(GL_TEXTURE_2D, yourTextureId) again each frame before you draw.

AlexisPontin
Honored Guest
Thank you! I now have my texture mapped onto my sphere using my own shaders. But the result is not quite what I want to obtain: I would like to have the camera inside the sphere, with the texture mapped on the inside too.

I don't know how to modify the model-view / projection matrices of the sphere to get such a result 😕

I also have a problem that may be linked to this: the images are not superimposed in the Oculus. The sphere is rendered correctly, but the texture seen by each eye doesn't match.

Here is what I have so far:

unclebob
Honored Guest
Is this the type of thing you are trying to do?

viewtopic.php?f=42&t=10267

Cheers
Feel free to drop by http://www.VRCraftworks.com and contact us.