Forum Discussion
thewhiteambit
11 years ago · Adventurer
Direct HMD Access with Client-Distortion? (Raytracing)
Is it possible to use Direct HMD Access with client distortion? I have to use client distortion since I use raytracing for image generation - this saves me from rendering everything twice to intermediate buffers, and the picture is less blurry, since every pixel matches 100% instead of being sampled from a blurry texture. It is also much faster on simple scenes.
Now I want to use Direct HMD Access for 75Hz and convenience. My idea is to write directly to one render target that is displayed as-is, without distortion. Is there any possibility to do this with SDK 0.4.1, or are there any plans for this in future releases?
PS: This could also be useful with classic scanline rendering in a deferred renderer where, e.g., texturing is done after distortion.
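For a raytracer, the post-warp pass can be folded into ray generation: instead of distorting a finished image, each primary ray's direction is bent by the same kind of radial polynomial the SDK would apply as a post-process. A minimal sketch - the k-coefficients below are placeholders, not real DK2 values (those would have to come from the SDK's distortion data):

```cpp
#include <cmath>

// Sketch: fold barrel distortion into primary-ray generation so no
// intermediate eye buffer or post-warp pass is needed. The polynomial
// mirrors the classic k0..k3 radial model; the coefficients are
// illustrative placeholders.
struct Vec3 { float x, y, z; };

// Radial scale factor for a point at squared radius r2 from the lens center.
float distortionScale(float r2) {
    const float k0 = 1.0f, k1 = 0.22f, k2 = 0.24f, k3 = 0.0f; // placeholders
    return k0 + r2 * (k1 + r2 * (k2 + r2 * k3));
}

// Map a pixel in one eye's viewport to a primary-ray direction whose
// angle already accounts for the lens warp (-z is forward).
Vec3 primaryRayDir(float px, float py, int width, int height) {
    // Normalised lens-centered coordinates in [-1, 1].
    float x = 2.0f * (px + 0.5f) / width  - 1.0f;
    float y = 2.0f * (py + 0.5f) / height - 1.0f;
    float r2 = x * x + y * y;
    float s = distortionScale(r2);       // push rays outward toward the edge
    return Vec3{ x * s, y * s, -1.0f };
}
```

Since each ray is traced at exactly the angle the lens demands, there is no resampling step and the center of the view stays pixel-exact.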
31 Replies
- ElectricMucus (Explorer): AFAIK you can do that, sorta. There are two modes in which distortion rendering is supposed to happen: one where you simply hand over the render target and let the SDK handle everything, and another where you get a pointer to the distortion mesh and handle it yourself.
Now I think using that mesh probably isn't what you'd like if you want to simply raytrace through the lens, so I'd start digging into the implementation of the distortion grid to get something like the lens focal lengths - it should be in there, in some form or another.
- thewhiteambit (Adventurer): Thanks, I will try it this way :P Btw. I raytrace through both lenses in one pass of course, since raytracing can have multiple ray origins in one pass at no extra cost - and of course there is a lot of invisible black area one can skip. I think some form of deferred raycasting will be a good choice for the bigger viewports to come. It is also nice that the picture remains crisp, since it is not sampled from 2D buffers. Currently I have even tried rendering some YUV-4:2:2-like color buffers to have exact control over the subpixels of the PenTile matrix in the center of the DK2 - you won't believe how much this improves rendering quality :shock:
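The "multiple ray origins in one pass" point can be sketched as a single dispatch over the whole panel that picks the eye origin per pixel and skips the unused black border. The half-IPD value and the border threshold below are illustrative placeholders:

```cpp
// Sketch: one raytracing pass for both eyes. A rasterizer must render
// each eye separately, but a raytracer can choose a per-pixel ray
// origin, so both viewports trace in a single dispatch.
struct Ray { float ox, oy, oz; float dx, dy, dz; };

const float kHalfIpd = 0.032f; // half interpupillary distance, placeholder

// Returns false for pixels in the unused black border, which can be skipped.
bool makeEyeRay(int px, int py, int panelW, int panelH, Ray& out) {
    bool leftEye = px < panelW / 2;
    int eyeW = panelW / 2;
    int ex = leftEye ? px : px - eyeW;       // pixel inside this eye's half
    float x = 2.0f * (ex + 0.5f) / eyeW  - 1.0f;
    float y = 2.0f * (py + 0.5f) / panelH - 1.0f;
    if (x * x + y * y > 1.3f) return false;  // crude "outside the lens" skip
    out = Ray{ leftEye ? -kHalfIpd : kHalfIpd, 0.0f, 0.0f, x, y, -1.0f };
    return true;
}
```

The eye offset only moves the ray origin; the per-pixel cost is otherwise identical to a mono render, which is where the "no extra cost" claim comes from.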
- ElectricMucus (Explorer): Very interesting, do you write it in OpenGL fragment shaders? I'm looking into doing something similar, but I'm far from actually having something to show. (The OpenGL SuperBible arrived today :D )
Personally I don't think a pure ray tracer would be performant enough for me, so I'm aiming for a hybrid approach where I raycast implicit primitives onto impostors created with geometry shaders and let the z-buffer do the visibility test.
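A minimal sketch of the two ingredients of such a hybrid: the analytic ray-sphere test an impostor's fragment shader would run, and the conversion of the hit's eye-space distance into a hardware depth value so it composites correctly with rasterized geometry. The near/far planes and the D3D-style [0, 1] depth convention are assumptions:

```cpp
#include <cmath>

// Sketch: analytic ray-sphere intersection, the kind of test a fragment
// shader on an impostor would run per pixel. Returns the nearest positive
// hit distance t, or -1 on a miss. Direction is assumed normalised.
float raySphere(float ox, float oy, float oz,
                float dx, float dy, float dz,
                float cx, float cy, float cz, float radius) {
    // Solve |o + t*d - c|^2 = r^2, a quadratic in t.
    float lx = ox - cx, ly = oy - cy, lz = oz - cz;
    float b = lx * dx + ly * dy + lz * dz;  // half-b of the quadratic
    float c = lx * lx + ly * ly + lz * lz - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return -1.0f;          // ray misses the sphere
    float t = -b - std::sqrt(disc);         // near intersection
    return (t > 0.0f) ? t : -1.0f;
}

// Sketch: map the hit's eye-space distance (positive, along the view axis)
// to a D3D-style [0, 1] depth value so the z-buffer visibility test works
// against rasterized objects. zNear/zFar are illustrative.
float eyeZToDepth(float eyeZ, float zNear, float zFar) {
    float a = zFar / (zFar - zNear);
    float b = -(zFar * zNear) / (zFar - zNear);
    return a + b / eyeZ;  // standard perspective depth mapping
}
```

Writing `eyeZToDepth(t, ...)` to the fragment depth output is what lets the implicit sphere intersect conventional geometry in the depth buffer.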
As for rendering with subpixel precision, that's my goal as well. 8-)
- thewhiteambit (Adventurer): I guess it will not help. My problem is not finding the correct distortion values, but using Direct HMD without distortion. But it seems only SDK distortion rendering can access Direct HMD mode right now :x
- ElectricMucus (Explorer): That was just what I remember from the Developer Guide, IDK, perhaps that was just for 0.3.2. (I can't use any other version anyhow, as I only have a Linux machine capable of running the Rift ATM.)
I wouldn't be as concerned about direct mode as long as we don't even know exactly what it does. Creating a window over the HMD screen coordinates independently of the SDK should work just fine. I find the notion of letting the SDK create the window annoying anyway - it's vendor lock-in bait.
- thewhiteambit (Adventurer): I guess Direct HMD exists to allow 75Hz; otherwise both displays would be synced, probably to 60Hz. If it is OK for you, I will copy our private chat here, because this could be interesting for others too. I have already completed extended-desktop rendering, that's why I started wondering about Direct HMD :D
- ElectricMucus (Explorer): No it's not.
Anyhow, I made a small illustration graphic to show what you are probably hinting at.
This should be fairly useful for achieving sub-pixel-resolution images, not only for raytracing. I think it would be awesome if that can be done in a regular rasterization engine - if it can be done. :) "thewhiteambit" wrote:
I guess it will not help. My problem is not finding the correct distortion values, but using Direct HMD without distortion. But it seems only SDK distortion rendering can access Direct HMD mode right now :x
Client-mode rendering can access direct mode. I'm currently doing that with Ogre3D. It's only working for me in DirectX 11 so far, though.
You didn't mention how you do the raytracing. Is it purely software rendering into the window, or are you using DirectX or OpenGL as the base with shaders to raytrace? Direct HMD mode only works if you are using DirectX or OpenGL: when you call ovr_Initialize(), it actually opens a bunch of DLLs (dx9, dx11, opengl, etc.) and hijacks them, so later calls by your own code go to a replacement Oculus library instead (which supports direct mode). So it appears you can't attach direct mode to just any window; it has to be one that will have DirectX or OpenGL bound to it later. "ElectricMucus" wrote:
Personally I don't think a pure ray tracer would be performant enough for me so I'm aiming for a hybrid approach where I ray cast implicit primitives onto imposters created with geometry shaders and let the z-buffer do the visibility test.
I was doing something a little like that:
The ground is conventional rendering. The green sphere is ray tracing through a cube mesh frame with a sphere-intersect equation. The white sphere is ray marching through a cube mesh frame with a distance field. Both generate valid z-buffer values, so they can intersect with other conventional objects in the scene. Distorting the cube mesh (just 12 triangles) will change the shape of the ray traced / ray marched contents too (stretch the cube and it will draw an ellipse, etc.).
- thewhiteambit (Adventurer): Pentile AB.png
With raytracing, no complicated rotation is needed. You can calculate sub-picture A (just green) with just green rays, and then an interleaved (blue/green) one as layer B. As I mentioned, it is more like RGB 4:8:4 or GBR 8:4:4, not exactly YUV 4:2:2 - since it is not chromatic etc. and has more bits per pixel - even GBR 32:16:16 is useful for floats. Chromatic aberration is done here as well. These techniques could basically also apply to scanline rendering: one can imagine a shader writing to the warped texture, doing some multisampling on the fly. This could be a good compromise between convenient scanline techniques and picture quality. You can also implement a stencil so samples are only taken where needed, and then calculate each subsample at the position it takes in the warped picture. Hey John, something for you in here? ;)
- thewhiteambit (Adventurer): Currently I only use DirectX 11 shaders with a WPF-shared DirectX 9 texture as render target :? But that sucks, as WPF is somehow limited to 60Hz. So I will move to Windows Forms soon and keep the C# code - one can render at 1000Hz directly to the HWND there. I implemented a lousy CLR wrapper for Oculus VR. The advantage of WPF is its smooth framerate with a good estimate of when the frame will be visible. I also did some experiments with C++ AMP, which is very nice for raytracing - but you can also use a shared DirectX texture to write to, so it is really easy to implement complex raytracers. But before photon mapping, I would also suggest a simple deferred raycast: basically you need depth, normals, maybe positions in a shadowmap, and UV coordinates for a megatexture. Implementing animations is also still hard with raytracing - that's why I came up with the scanline improvements (and, as always in graphics, I bet there is some guy who wrote a book on this in the 70s :D ). But a deferred scene would have many advantages with scanline rendering on the Oculus too.
For example, you can do just one exact texture lookup without two huge intermediate buffers where most pixels are not even used...
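The GBR 8:4:4 idea above can be sketched as separate sample planes: green at full resolution, red and blue at half the horizontal resolution, so a raytracer fires half as many chroma rays. The packing below is an illustration of the bookkeeping, not the actual DK2 PenTile layout:

```cpp
#include <vector>

// Sketch: "GBR 8:4:4"-style planar buffer. Green is stored per pixel;
// red and blue are shared between horizontal pixel pairs, halving the
// number of chroma rays a raytracer has to trace.
struct Gbr844 {
    int width = 0, height = 0;
    std::vector<float> g;   // width * height green samples
    std::vector<float> r;   // (width/2) * height red samples
    std::vector<float> b;   // (width/2) * height blue samples
};

Gbr844 makeBuffer(int width, int height) {
    Gbr844 buf;
    buf.width = width;
    buf.height = height;
    buf.g.resize(width * height);
    buf.r.resize((width / 2) * height);   // half horizontal chroma resolution
    buf.b.resize((width / 2) * height);
    return buf;
}

// Rays needed per row: one green per pixel, plus one red and one blue
// per pixel pair - compared with 3 * width for a full-resolution RGB row.
int raysPerRow(const Gbr844& buf) {
    return buf.width + 2 * (buf.width / 2);
}
```

For a 4-pixel row this is 8 rays instead of 12, and the same split extends naturally to GBR 32:16:16 float buffers.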