Hi, I'm trying to integrate the API into an existing game engine and am having a bit of trouble. As near as I can tell, I have everything hooked up (render-wise), but when I pass the texture to the API (endFrame), all I get is a black screen: no errors or warnings, at compile time or run time.
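For reference, here is roughly how I am handing the rendered texture to the SDK each frame. This is only a sketch: pEyeTexture, pEyeSRV, texWidth/texHeight, eyeRenderViewport, and eyeRenderPose are placeholders for my engine's render target, its shader resource view, and the viewport/pose data, and the ovrD3D10* names are my reading of OVR_CAPI_D3D.h (following the D3D11 pattern OculusRoomTiny uses), so the exact identifiers may differ by SDK version:

```cpp
// Sketch only: assumes the 0.4-era Oculus C API with the D3D10 bindings
// from OVR_CAPI_D3D.h. Double-check the struct/field names against your SDK.
#define OVR_D3D_VERSION 10
#include "OVR_CAPI_D3D.h"

ovrD3D10Texture eyeTex[2];                        // descriptors handed to the SDK
for (int eye = 0; eye < 2; ++eye)
{
    eyeTex[eye].D3D10.Header.API            = ovrRenderAPI_D3D10;
    eyeTex[eye].D3D10.Header.TextureSize.w  = texWidth;               // render target size
    eyeTex[eye].D3D10.Header.TextureSize.h  = texHeight;
    eyeTex[eye].D3D10.Header.RenderViewport = eyeRenderViewport[eye]; // half the texture per eye
    eyeTex[eye].D3D10.pTexture              = pEyeTexture;            // engine's ID3D10Texture2D
    eyeTex[eye].D3D10.pSRView               = pEyeSRV;                // matching SRV
}

ovrHmd_BeginFrame(hmd, 0);
// ... render both eye views into pEyeTexture ...
ovrHmd_EndFrame(hmd, eyeRenderPose, &eyeTex[0].Texture);
```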
Things to note:
- The engine uses DirectX 10.1.
- The game is built in VS2010.
Here are the actions I have taken so far:
- I checked that the texture was actually being rendered to by copying it into the swap chain's back buffer and then calling Present as normal (see the sketch just below this list).
- I tried to use PIX to debug the issue, but kept being stopped by a supposed unhandled exception when calling D3D10CreateDevice1. More confusingly, PIX indicated the problem was in d3d11.dll, even though our engine uses DirectX 10.1. I have since noticed that PIX apparently has not been supported in quite some time. And no, the VS Graphics Debugger is not an option, since I'm working in VS2010.
- I tried Nsight, NVIDIA's graphics debugger, only to find that "D3D10 devices are unsupported under Nsight".
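For that first check, the back-buffer test looked roughly like this (standard D3D10/DXGI calls; pDevice, pSwapChain, and pEyeRenderTexture stand in for my engine's objects, and the copy only works because the render target matches the back buffer's size and format):

```cpp
#include <d3d10_1.h>

// Sanity check: copy the eye render target straight into the swap chain's
// back buffer and present it, bypassing the Oculus SDK entirely.
ID3D10Texture2D* pBackBuffer = NULL;
pSwapChain->GetBuffer(0, __uuidof(ID3D10Texture2D), (void**)&pBackBuffer);
pDevice->CopyResource(pBackBuffer, pEyeRenderTexture);   // sizes/formats must match
pBackBuffer->Release();
pSwapChain->Present(0, 0);
```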
If anyone has any suggestions on where I can go from here, or questions to help me clarify my predicament, they would be much appreciated.
Yes, I am working on a laptop with integrated graphics and a dedicated card. I am also using the debug mode for most of my development rather than the physical HMD as I don't have access to it most of the time.
I'm not sure how the debug mode determines which adapter it is attached to, but I assume it would be the dedicated card. I have the dedicated card set as the preferred graphics processor in the NVIDIA Control Panel. When I run the OculusRoomTiny sample in debug mode, I can see from the adapter desc that it is in fact using the dedicated card, and it runs and displays correctly. My program is also using the dedicated card. I suppose I can try forcing it to use the integrated card, just for funsies.
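For what it's worth, here is roughly how I'm checking which adapter the device ended up on. These are standard DXGI calls, nothing Oculus-specific, and pDevice stands in for the engine's ID3D10Device1:

```cpp
#include <windows.h>
#include <d3d10_1.h>

// Query the DXGI adapter behind the device and dump its description,
// e.g. "NVIDIA GeForce ..." vs. "Intel(R) HD Graphics ...".
IDXGIDevice*      pDXGIDevice = NULL;
IDXGIAdapter*     pAdapter    = NULL;
DXGI_ADAPTER_DESC desc;

pDevice->QueryInterface(__uuidof(IDXGIDevice), (void**)&pDXGIDevice);
pDXGIDevice->GetAdapter(&pAdapter);
pAdapter->GetDesc(&desc);
OutputDebugStringW(desc.Description);

pAdapter->Release();
pDXGIDevice->Release();
```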
Looks like your issue is Optimus. Even if you think you're forcing the NVIDIA card, it won't work. You should try a desktop with a dedicated graphics card. Optimus is a big problem and will probably never work correctly.
Unfortunately, that is not my issue. I went ahead and loaded all of our code and assets onto a desktop with only a dedicated graphics card. Debug mode still shows only a black screen. With the Rift itself, device creation succeeds and the light on the device turns blue, but nothing is displayed. Business as usual.
Please elaborate on "in order for things to work". I'm under the impression that the only thing the eye poses would affect is your ability to control the camera with the device's motion-tracking capabilities. I don't recall reading anything that suggests the eye poses are essential to rendering in general. Am I wrong on this point?
To answer your question, I am calling ovrHmd_GetEyePoses(), though I am not currently applying the returned eye poses to the camera.
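Concretely, the relevant part of my loop looks something like the sketch below. This assumes the 0.4-era C API: eyeRenderDesc would be the per-eye descriptions returned by ovrHmd_ConfigureRendering, eyeTex is the texture descriptor pair from the earlier snippet, and the poses are fetched and handed back to ovrHmd_EndFrame while the engine's camera ignores them:

```cpp
// Assumed 0.4-era signatures; check OVR_CAPI.h in your SDK version.
ovrPosef    eyeRenderPose[2];
ovrVector3f hmdToEyeViewOffset[2];
hmdToEyeViewOffset[0] = eyeRenderDesc[0].HmdToEyeViewOffset;
hmdToEyeViewOffset[1] = eyeRenderDesc[1].HmdToEyeViewOffset;

ovrHmd_BeginFrame(hmd, 0);
ovrHmd_GetEyePoses(hmd, 0, hmdToEyeViewOffset, eyeRenderPose, NULL);

// ... both eyes are rendered with the engine's own camera, ignoring eyeRenderPose ...

ovrHmd_EndFrame(hmd, eyeRenderPose, &eyeTex[0].Texture);   // poses still passed back to the SDK
```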