Forum Discussion
hesham
11 years ago · Protege
GearVR or DK2 HQ Rendering Layer in UE4
I was wondering whether UE4 on the Gear VR supports rendering certain materials or actors on a higher-quality layer, e.g. for UI presentation. I know Oculus does this for their HUD and menus; is there something we can do in UE4? I don't just want a quad rendered with a high-quality texture, but an entire object in the world rendered at higher quality, since the Gear VR resolution isn't showing enough detail for that surface texture.
28 Replies
- vrdaveb (Oculus Staff): That isn't implemented yet, but it should be within the next few months.
- hesham (Protege):
"vrdaveb" wrote:
That isn't implemented yet, but it should be within the next few months.
Any idea how it will be implemented? Will it be something where we can specify a material that forces an object to render on a specific layer, or will it just be an overlay quad of some sort, just for menus and UI? The former is more interesting. Also, would it be targeting 4.10 or 4.11? Thanks for the update, though!
- vrdaveb (Oculus Staff): Definitely 4.11, but maybe available on GitHub before then. We would probably expose separate cameras for high-res UI and low-res world rendering. Thoughts?
- hesham (Protege):
"vrdaveb" wrote:
Definitely 4.11, but maybe available on Github before then. We would probably expose separate cameras for high-res UI and low-res world rendering. Thoughts?
I think that works, as long as there is a way to blend the two views automatically at the depth-buffer level, so the two cameras appear as part of the same render rather than the high-res camera simply being overlaid on the low-res one. Transparency support would be important as well, but I don't believe that would be an issue. Having it on GitHub earlier would also be ideal for testing purposes.
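The depth-buffer-level blending described above amounts to a per-pixel nearest-depth merge of the two camera outputs. A minimal sketch of that idea, with illustrative buffer names and layout (not any engine's actual API), assuming both passes have already been resampled to a common resolution:

```python
def merge_by_depth(world_rgb, world_depth, ui_rgb, ui_depth):
    """Per-pixel merge of a low-res world pass and a high-res UI pass.
    Each *_rgb is a flat list of (r, g, b) tuples; each *_depth is a
    flat list of floats where a smaller value is nearer to the camera.
    For every pixel, keep the sample from the nearer surface."""
    out = []
    for wc, wd, uc, ud in zip(world_rgb, world_depth, ui_rgb, ui_depth):
        out.append(uc if ud < wd else wc)
    return out

# Tiny 2-pixel example: the UI sample wins pixel 0 (it is nearer),
# the world sample wins pixel 1 (the UI is behind it there).
merged = merge_by_depth(
    [(10, 10, 10), (20, 20, 20)], [0.5, 0.2],
    [(200, 0, 0), (0, 200, 0)],  [0.3, 0.9],
)
```

The real cost in a VR compositor, as discussed later in the thread, is that the depth buffer would also have to survive the distortion/timewarp pass for this comparison to happen post-warp.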
In terms of usage, exposing the camera is a good option for manual developer optimization, but being able to flag a UE4 material and have the engine automatically render any actors using it on the high-res camera by default would mean less manual work and simpler usage. Including the ability to flag UMG materials that way as well, if desired (but not just UMG, obviously :)).
- owenwp (Expert Protege): Ideally you would want UMG and Slate to just work with the HQ predistorted layers, and to include world-space panels using the special quad renderer, as with the Oculus Cinema screen.
- vrdaveb (Oculus Staff): You can already create world- or face-locked quads that bypass distortion and use high-quality sampling via the OVROverlay script. What we haven't done yet is let you render arbitrary meshes and shaders to separate sets of eye buffers.
- hesham (Protege):
"vrdaveb" wrote:
You can already create world- or face-locked quads that bypass distortion and use high-quality sampling via the OVROverlay script. What we haven't done yet is let you render arbitrary meshes and shaders to separate sets of eye buffers.
I can't seem to find any reference to OVROverlay in UE4's source. Is that exposed in any way right now? Would it work on both DK2 and Gear VR?
In either case, it is the second case that is more interesting. I noticed that VR Cinema and Netflix don't really have the screen obstructed in any way; if so, the overlay makes sense and is much higher quality. But what I'd like is the ability to render such a screen, curved surface, or object in the world itself, and hopefully have it seamless enough to just work based on a flag of some sort. Hopefully that is doable from an integration perspective.
- vrdaveb (Oculus Staff): Sorry, OVROverlay doesn't actually exist in the UE4 integration yet. We are currently working on the first overlay functionality in UE4 and should have more to talk about in a month or two. Arbitrary meshes aren't too hard for us to add, but it's harder to support proper occlusion. You generally have to set the alpha channel in higher layers to let lower layers show through.
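Setting the alpha channel in a higher layer so lower layers show through is standard back-to-front "over" compositing. A minimal one-pixel sketch with illustrative names, assuming straight (non-premultiplied) colors in the 0–1 range:

```python
def composite_over(top_rgba, bottom_rgb):
    """Blend one pixel of a higher compositor layer over a lower one.
    top_rgba = (r, g, b, a) with a in [0, 1]: where a == 0 the lower
    layer shows through completely; where a == 1 the top layer covers it."""
    r, g, b, a = top_rgba
    br, bg, bb = bottom_rgb
    return (r * a + br * (1 - a),
            g * a + bg * (1 - a),
            b * a + bb * (1 - a))

# A 25%-opaque red overlay over a blue background:
pixel = composite_over((1.0, 0.0, 0.0, 0.25), (0.0, 0.0, 1.0))
# -> (0.25, 0.0, 0.75)
```

This is what makes occlusion awkward for compositor layers: the layer's alpha, not a shared depth buffer, decides visibility, so anything that should occlude the overlay has to punch a hole (alpha 0) in it.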
- hesham (Protege):
"vrdaveb" wrote:
You generally have to set the alpha channel in higher layers to let lower layers show through.
That's a shame, as that's what the depth buffer is for :). I guess that would entail warping the depth buffer so that it can then be used in the post-warp pass, is that it? It would be nice to have as an option; even if it is too expensive to do now, it would probably be the right call in a few years if we are trying to be as flexible as possible.
- owenwp (Expert Protege): The thing to do would probably be to have your world-space UI panel draw into the alpha channel of the framebuffer during the main render pass (any part with visible UI is alpha 255; any area with no UI, or occluded UI, is 0). Then the HQ predistorted pass could treat the alpha channel of the main layer as a stencil by using a destination-alpha blend mode.
You could even simulate transparency on top of the panels by drawing alpha values between 0 and 255 for things like particles, if you add some shader complexity to get the right color values in the backbuffer.
This would probably require some engine changes in UE4, but you can do it yourself in Unity just by writing shaders that explicitly set the alpha value of the output color.
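The destination-alpha trick described above can be sketched numerically. In a real renderer this is just a blend-state setting (source factor = destination alpha, destination factor = one minus destination alpha); the function and names below are illustrative:

```python
def stencil_blend(hq_rgb, main_rgb, main_alpha):
    """One pixel of the HQ predistorted pass blended against the main
    layer, using the main layer's alpha channel as a stencil/mask:
    main_alpha is 1.0 where the UI panel is visible in the main pass
    and 0.0 where it is absent or occluded by world geometry.
    Equivalent to blend factors src = DstAlpha, dst = InvDstAlpha."""
    return tuple(h * main_alpha + m * (1 - main_alpha)
                 for h, m in zip(hq_rgb, main_rgb))

# Where the panel is visible (alpha 1.0), the HQ layer replaces the
# main render; where it is occluded (alpha 0.0), the main render is
# left untouched. Fractional alpha gives the simulated transparency
# for particles mentioned above.
```

Writing alpha 255 for the panel and 0 for occluders during the main pass is what bakes the occlusion result into this mask, so the HQ layer inherits correct depth behavior without its own depth buffer.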