Forum Discussion

lamour42
Expert Protege
11 years ago

Client Rendering Evaluation

Hi,

After my failure to get SDK rendering to work, I went back to client rendering and am quite happy with the result. I would like to share some thoughts.



My project is a pure DirectX 11 environment compiled with Visual C++ 2013. Almost all code is written by me - some parts I have taken from the excellent book 'Introduction to 3D Game Programming with DirectX 11' by Frank D. Luna.

I create my 3D objects in Blender, then export them as Collada XML files. These are then parsed by a Java-based tool and written back in a custom binary format. Finally, this binary format is read by the C++ program on startup. (If you think that sounds complicated, you are right. It took me many months and many dead ends to come up with a working environment that didn't involve insanely expensive 3D modelling software and fit my workflow.)

I started to enable my engine to render to the Oculus Rift some months ago. This was before the DK2 was out, and I didn't even have a DK1. I decided to follow the DK1 dev guide and implement a version that looked OK in my application window but could never be tested on real hardware.

After I got the DK2, I tried to implement SDK rendering for my engine and failed (see this thread: viewtopic.php?f=20&t=17288). That is when I decided to return to client rendering.

Disruptive SDK update

I will not complain about version 3.3 being very different from 3.2. This is a beta, and things like that should be expected. It took me some time to realize it's actually best to throw away all my DK1 code and start from scratch. Once the Rift is released, I would hope for more compatible changes, though :-)

Debug Device

I am doing most of my development without having a real Oculus Rift connected, or with one connected but switched off. So I am very grateful for the debug device, which works very well in general. You can always wish for more - e.g. simulated sensor input - but it's not that important. However, I discovered what I think is a small bug with the debug device:

If I use ovrHmd_GetRenderDesc as you normally would for client-rendered distortion, I get a failed assertion in LibOVR. So I have to use ovrHmd_ConfigureRendering to initialize the ovrEyeRenderDesc structures instead. This makes no sense, as that API is intended for SDK rendering.

	if (ovrRenderingDebugDevice) {
		// Workaround: on the debug device ovrHmd_GetRenderDesc asserts,
		// so initialize the eye render descriptions via ConfigureRendering.
		if (!ovrHmd_ConfigureRendering(hmd, &d3d11cfg.Config,
				ovrDistortionCap_Chromatic | ovrDistortionCap_TimeWarp | ovrDistortionCap_Overdrive,
				eyeFov, EyeRenderDesc)) {
			Error(L"Oculus Rift ConfigureRendering failed.");
		}
	} else {
		// Normal client-rendering path on real hardware.
		EyeRenderDesc[0] = ovrHmd_GetRenderDesc(hmd, ovrEyeType::ovrEye_Left, eyeFov[0]);
		EyeRenderDesc[1] = ovrHmd_GetRenderDesc(hmd, ovrEyeType::ovrEye_Right, eyeFov[1]);
	}


Exception on exit

I get an Illegal Access exception on exit. The stack trace shows no active code of mine, just some Oculus DLL.
It may be entirely my fault, but I think I did everything the documentation says to do for shutting down Rift usage.
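For reference, the teardown order I understand the documentation to describe is: release your own D3D11 resources first, then destroy the HMD handle, then shut the library down. A minimal fragment (assuming hmd is the handle returned by ovrHmd_Create):

```cpp
// Shutdown sequence as I read the SDK docs: D3D11 resources (render
// target, distortion shaders) are released before this point.
if (hmd) {
    ovrHmd_Destroy(hmd);
    hmd = nullptr;
}
ovr_Shutdown();
```

If anyone sees a step missing here that would explain the exception on exit, I would be glad to hear it.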

Up vector wrong

This is entirely on me. While I could get the conversion of the HeadPose.ThePose.Orientation quaternion into my left-handed system to work for the look-at vector, I fail to do so correctly for the up vector. I thought the same rotation matrix that works for the look-at vector should work for the up vector. Certainly my own problem, and I should give it some thinking time - but maybe someone else has had a similar issue?

DirectX Integration

I have mixed opinions here. As a DirectX programmer, I just don't like having to adapt to yet another coordinate, math, and type system. My code is entirely based on the DirectXMath library, so I would like an Oculus SDK based on DirectXMath. Well, I know I am selfish here :-) I fully understand that Oculus wants a single set of APIs and documentation rather than different versions for OpenGL and DirectX. But what is easier for Oculus is not necessarily easier for us programmers. Also, the SDK is only one side of the story: as the SDK samples show, you have to differentiate for your target environment at some point in your code anyway. IMO, a separate set of SDKs and samples for OpenGL and DirectX would help clarify how to program the Rift in both worlds. Certainly this means more work for Oculus, so I know it's easy for me to say ;-)

But now comes the good part of DirectX integration with client rendering: it just behaves like any other non-Rift-enabled DirectX app. In particular, I can use the awesome DirectX graphics debugger of Visual C++ 2013. The distortion vertex and pixel shaders are just another step in the pipeline, and I can select their draw calls like any other draw calls, debug the shader code, look at input parameters, etc. This feature alone is reason enough to go with client rendering instead of SDK rendering, which prevented graphics debugging altogether - even for the non-Rift-related shader code.

Direct HMD Access

What a great feature! After reading the documentation I wasn't even sure this mode would work with client rendering at all. It does indeed work very well. After all those countless times of fumbling around half blinded, moving windows about to get some demo or my own stuff to display on the Rift, it feels so gratifying to just start your app, see the Rift LED turn blue, and get a nice fullscreen image with no desktop artefacts on the Rift.

Overall

While I was disappointed with SDK rendering, I now fully enjoy programming the Rift in client mode. Standing in the middle of your own virtual world for the first time, everything around you rendered by your own code, was one of the greatest moments of my programming life. I sincerely thank the Oculus team for that moment.