Forum Discussion
Kadinf
12 years ago · Honored Guest
Need help: How to move an object in tiny room demo directx
Hi,
I really love the idea of building something and being able to see it in "real life" with the Oculus.
That said, I'm a total beginner with DirectX and OpenGL.
To get started, I played a bit with the Tiny Room demo and wanted to move/animate some cubes.
Naively, I tried the following:
-Load a file containing the starting coordinates for the cubes
-Use the function "PopulateRoomScene" (modified, code at the end) to set up the scene in the init()
-Load a file containing new coordinates for the cubes (in the main loop which calls "ProcessAndRender()")
-Delete the scene and repopulate in "ProcessAndRender()"
void ProcessAndRender()
{
    pRoomScene->Clear();
    PopulateRoomScene(pRoomScene, pRender, pMyCube);
    ... (rest unmodified)
The animation of the moving cubes works fine this way, as long as I don't move my head; then the scene lags a lot. With this setup, even a static scene (loading the cubes' positions once and never updating them) lags when I move my head, while the unmodified Tiny Room demo doesn't lag at all. So I guess my approach is wrong.
Now my questions:
What is the correct way to move objects around in DirectX, and how can I use it in the Tiny Room demo?
- I read something about creating a vertex buffer for an object only once and applying a transformation to the stored vertices to achieve the movement, but I don't know how to apply that to the Tiny Room demo (where is the vertex buffer hidden, ...).
- I guess there has to be a way to access a single object (in my case, each individual cube), but at the moment I can only move the whole scene (scene->World.Move(Vector3f(0, 0, 10));).
Is it possible to replace the cube with a sphere in the Tiny Room demo?
void PopulateRoomScene(Scene* scene, RenderDevice* render, Cube* theCube)
{
    FillCollection fills(render);
    scene->World.Add(Ptr<Model>(*CreateModel(Vector3f(0, 0, 0), &Floor, fills)));
    scene->World.Add(Ptr<Model>(*CreateModel(Vector3f(0, 0, 0), &Furniture, fills)));

    float scale        = 30.0f;
    float halfSize     = 0.05f / scale;
    float heightOffset = 3.0f / scale;
    float centerPositionTable[] = { -0.9f, 0.8f, 0.5f };

    Cube::CubeList* actCube;
    Slab tmp;
    SlabModel myQube;
    // Note: "i <= getNumberOfCubes()" would read one past the last cube;
    // "<" is probably intended unless getNumberOfCubes() returns the last valid index.
    for (int i = 0; i < theCube->getNumberOfCubes(); i++)
    {
        actCube  = theCube->getCube(i);
        halfSize = actCube->radius / scale;
        tmp = { -halfSize, -halfSize, -halfSize, halfSize, halfSize, halfSize,
                Color(actCube->color[0] * 255, actCube->color[1] * 255, actCube->color[2] * 255) };
        myQube = { 1, &tmp };  // one slab; sizeof(Slab)/sizeof(Slab) was just a roundabout 1
        scene->World.Add(Ptr<Model>(*CreateModel(
            Vector3f(actCube->x / scale + centerPositionTable[0],
                     actCube->z / scale + centerPositionTable[1] + heightOffset,
                     actCube->y / scale + centerPositionTable[2]),
            &myQube, fills)));
    }
    scene->SetAmbient(Vector4f(0.65f, 0.65f, 0.65f, 1));
    scene->AddLight(Vector3f(-2, 4, -2), Vector4f(8, 8, 8, 1));
    scene->AddLight(Vector3f(3, 4, -3), Vector4f(2, 1, 1, 1));
    scene->AddLight(Vector3f(-4, 3, 25), Vector4f(3, 6, 3, 1));
}
(update 21.10.14)
Is no one out there willing to help me? I would appreciate any hint at all.
This is my first post, so if I did something wrong in formulating my question, comments/suggestions on how to improve it are also welcome.
I tried getting something similar to run in OpenGL based on this example: http://codelab.wordpress.com/2014/09/07/oculusvr-sdk-and-simple-oculus-rift-dk2-opengl-test-program/.
Instead of cubes I'm using spheres (glutSolidSphere). Here, creating all spheres every frame works out. At least in Direct-to-HMD mode the image is stable (no lag) for both eyes; only both images are shifted to the right, so no 3D experience is possible. Other people in this forum seem to have a similar problem when using OpenGL, so I guess it's a problem in the SDK. In Extended mode, the 3D vision is OK, but lag is visible. I guess it comes from the fact that the Rift is then limited to the 60 Hz of my main monitor.
Is there a workaround?
While researching, I found this (jherico): http://www.reddit.com/r/oculus/comments/2gqo1r/opengl_working_in_direct_hmd_mode_using_direct3d/
-> do Direct-to-HMD mode with DirectX while the normal rendering stays in OpenGL
Does someone have experience integrating this into an existing project and can give me a hint?