Forum Discussion
MOnsDaR
13 years ago · Honored Guest
DirectX 9 Integration
I'm currently diving into developing with the Rift. I've got access to a closed-source branch of the Arma 1 engine (VBS2), where it is possible to get the D3D device (Doc Link).
I've done my first steps with DirectX, but I can't wrap my head around what I need to do for rendering my scene with different viewports and applying the shaders.
Are there any open source examples or even tutorials out there, showing how to do that?
I found some topics and projects related to that, but I'm not able to get it all together.
6 Replies
- cybereality (Grand Champion): What you need to do is render the scene to an off-screen render target (render-to-texture) instead of the back buffer, so you create one of these for each eye. Then render to them with a different view matrix that is offset to the side (to simulate the spacing between the eyes). You will then want to render these onto a full-screen quad the size of the screen. Before this step you would run it through a distortion/pre-warp pixel shader. That is one way to do it.
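The per-eye view matrices mentioned above differ only by a horizontal translation of half the interpupillary distance (IPD). A minimal sketch of that offset, assuming a hypothetical `eyePosition` helper and a placeholder 0.064 m IPD (the SDK reports the real value per device):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical helper: offsets the camera position along its right
// vector by +/- half the IPD to get each eye's position. The left eye
// moves along -right, the right eye along +right, each by ipd/2.
Vec3 eyePosition(const Vec3& center, const Vec3& right, float ipd, bool leftEye)
{
    float s = (leftEye ? -0.5f : 0.5f) * ipd;
    return { center.x + s * right.x,
             center.y + s * right.y,
             center.z + s * right.z };
}
```

In D3D9 terms, you would bake this offset into the view matrix you pass to `SetTransform(D3DTS_VIEW, ...)` before rendering each eye's target.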
- Falambo (Honored Guest): I am currently trying to integrate Oculus Rift support into one of my own applications. The problem I currently have is that I am unsure how to combine the two eye render targets into one texture. Should I do that in a pixel shader?
- cybereality (Grand Champion): You can do that in a pixel shader; it's not that hard.
You can also just have two quads (one for each eye), and that can work as well.
- owenwp (Expert Protege): What you probably want is SetViewport. This lets you render into a specific rectangular area.
Also, rendering directly into a texture is not ideal in D3D9, because then you can't use antialiasing. Typically you should render to the back buffer as usual, then copy the back buffer to a texture which you can use for your shader.
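The SetViewport approach splits the back buffer into two side-by-side halves, one per eye. A sketch of computing the two rectangles, using a plain struct that mirrors the `X`/`Y`/`Width`/`Height` fields of `D3DVIEWPORT9` so it stands alone without `d3d9.h`:

```cpp
#include <cassert>

// Mirrors the layout fields of D3DVIEWPORT9; in real code you would
// fill a D3DVIEWPORT9 and call device->SetViewport(&vp) before
// drawing each eye's half of the back buffer.
struct Viewport { unsigned x, y, width, height; };

Viewport eyeViewport(unsigned screenW, unsigned screenH, bool leftEye)
{
    unsigned halfW = screenW / 2;
    return { leftEye ? 0u : halfW, 0u, halfW, screenH };
}
```

The back-buffer-to-texture copy mentioned above would then be done with `IDirect3DDevice9::StretchRect`, which also resolves a multisampled surface as part of the copy.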
I believe the OVR SDK shows how to do all of this.
- Falambo (Honored Guest): OK, thank you for your answers.
Another question I have concerns the Direct3D 10 distortion pixel shader in the SDK overview document. For a DirectX 11 shader, I would combine the scaling variables into a constant buffer and update the whole thing at once, instead of setting each variable individually as in the example, right?
- owenwp (Expert Protege): You could, but that isn't really necessary, because D3D will automatically pack all global constants into a single constant buffer. Doing it explicitly only matters when you have constants that vary at different rates, like per-frame vs. per-object.
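The distortion shader discussed above radially scales each texture coordinate by a polynomial in r². A CPU-side C++ sketch of that warp, where `k[0..3]` plays the role of the shader's distortion coefficients (the real values come from the SDK per device; the names here are assumptions, not the SDK's API):

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// Radial warp sketch: move the coordinate relative to the lens center,
// scale it by k0 + k1*r^2 + k2*r^4 + k3*r^6, and move it back.
Vec2 hmdWarp(Vec2 in, Vec2 lensCenter, const float k[4])
{
    Vec2 theta = { in.x - lensCenter.x, in.y - lensCenter.y };
    float rSq = theta.x * theta.x + theta.y * theta.y;
    float f = k[0] + rSq * (k[1] + rSq * (k[2] + rSq * k[3]));
    return { lensCenter.x + theta.x * f, lensCenter.y + theta.y * f };
}
```

With `k = {1, 0, 0, 0}` the warp is the identity, which makes a convenient sanity check when porting the shader.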