Forum Discussion
broozar
10 years ago · Honored Guest
[0.8 PC GL] copy texture to TextureSet
hi all,
back in the 0.5 days before textureSets, one could simply set one's own textureID to eyeTex.OGL.TexId and call it a day. Could anyone please offer a code snippet/workflow description of how to do something similar with 0.8 in OpenGL?
what I tried:
- glBindTexture (my own texture (GLuint handle) with a combined left/right image)
- get ovrGLTexture* tex = (ovrGLTexture*)&pTextureSet->Textures[pTextureSet->CurrentIndex]
- glCopyTexSubImage2D (target: tex->OGL.TexId)
regards
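[For reference, one possible shape of that copy, hedged: glCopyTexSubImage2D reads from the *current read framebuffer*, not from a bound source texture, which may be why the steps above produce black. A minimal sketch against the 0.6–0.8-era GL API from memory; `srcTex`, `w`, `h` and the swap-set handling are assumptions, and field names may differ:]

```
// Assumed to exist already:
//   GLuint srcTex                  -- your combined left/right render, size w x h
//   ovrSwapTextureSet* pTextureSet -- created via ovr_CreateSwapTextureSetGL

// 1) Advance the ring buffer (the app managed CurrentIndex in 0.6-0.8).
pTextureSet->CurrentIndex =
    (pTextureSet->CurrentIndex + 1) % pTextureSet->TextureCount;

ovrGLTexture* dst =
    (ovrGLTexture*)&pTextureSet->Textures[pTextureSet->CurrentIndex];

// 2) Attach the source texture to a read framebuffer, since
//    glCopyTexSubImage2D reads from GL_READ_FRAMEBUFFER.
GLuint readFbo = 0;                       // create once, reuse per frame
glGenFramebuffers(1, &readFbo);
glBindFramebuffer(GL_READ_FRAMEBUFFER, readFbo);
glFramebufferTexture2D(GL_READ_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, srcTex, 0);

// 3) Copy into the SDK-owned texture.
glBindTexture(GL_TEXTURE_2D, dst->OGL.TexId);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, w, h);

glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
```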
30 Replies
- cybereality · Grand Champion
- Unfortunately, the docs just show that this isn't the accepted workflow anymore (it used to be, before 0.8).
- kojack
I've been struggling with the Direct3D 11 version of this problem for my wrapper for the Ogre3D engine. In previous Rift SDKs you gave the SDK the texture created by your engine and it used it. Fine. Now in 0.8 the Rift SDK must create the texture and you render into it. Not all engines like rendering into textures they don't own.
I've been unable to trick the SDK into using an existing D3D11 texture (even with identical flags, resolution, pixel format, etc.). It just displays black. There are no errors reported by the Oculus SDK (I've got all the logging stuff on, getting spammed by it, none of it useful for this issue), and of course we can't debug it because that part is now closed source (sigh). The best I've done is use D3D11's surface copy function to blit the Ogre3D texture over the top of the Oculus texture every frame. That works, but it's an extra render step that shouldn't be needed.
I need to get it working in OpenGL too, but I haven't looked at that yet (I don't know OpenGL as well).
What the SDK SHOULD do is provide the texture creation functions like it does now, but also offer the ability to pass in existing textures, with a warning that things may break if they don't meet a specific (documented!) format requirement.
- cybereality · Grand Champion
Hmm, I think I see what you're saying. Let me see if anyone here knows of a work-around.
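[The D3D11 surface-copy workaround described above could look roughly like this; a sketch against the 0.8-era API from memory, where `ctx`, `engineTex`, `pTextureSet` and the exact struct field names are assumptions. `CopyResource` requires both textures to have identical size, format and mip count:]

```
// Assumed to exist already:
//   ID3D11DeviceContext* ctx;
//   ID3D11Texture2D*     engineTex;   // e.g. the Ogre3D render target
//   ovrSwapTextureSet*   pTextureSet; // from ovr_CreateSwapTextureSetD3D11

// Advance the ring buffer (app-managed in 0.6-0.8).
pTextureSet->CurrentIndex =
    (pTextureSet->CurrentIndex + 1) % pTextureSet->TextureCount;

ovrD3D11Texture* dst =
    (ovrD3D11Texture*)&pTextureSet->Textures[pTextureSet->CurrentIndex];

// GPU-side blit into the SDK-owned texture; no CPU readback involved.
ctx->CopyResource(dst->D3D11.pTexture, engineTex);
```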
- kojack
Ok, maybe earlier than 0.7, it seems. I didn't try coding for 0.6 or 0.7. :)
But anyway, it should be possible to give external textures to the Rift SDK. (It also seems like it should be possible: we have access to the texture sets, as broozar showed. It just doesn't work, and gives no feedback on why.)
- broozar · Honored Guest
@kojack Sounds like we are in a similar boat: no errors, all black. The ID override method stopped working in 0.6. As OVR hits the stores soon, we are forced to make it work with the latest SDKs, since 0.5 is deprecated.
@cyber Yes, please. To be clear, we want to give a "flat" side-by-side texture to the SDK and then let the SDK do the distortion etc., not direct mode. Code samples in both GL and DX would be nice.
- cybereality · Grand Champion
OK, so I'm really sorry, but it seems the SDK won't work this way. I don't totally get all the specifics myself but, from what I understand, the SDK is not using normal textures. There is some special sauce in the SDK that makes these textures not shareable with a client application. It seems the only option is to manually copy the texture over every frame.
- kojack
That's a shame.
I've got some ideas of what they might be doing, but I can't be sure. I might investigate further for the hell of it, but there's not much point.
This is why I was always against the idea of sdk rendering instead of client rendering. We create the directx device (opengl context), then the oculus sdk goes off and does secret undocumented stuff to it, possibly leaving it in an unexpected state.
If oculus want to keep things like their sensor fusion, camera tracking, hardware interfacing, etc secret, fine. But they don't own the rendering device, we share it with the sdk, therefore any code that touches it should be public so we know what it's messing with.
Steam's OpenVR SDK allows users to create their own eye render textures and hand them to the SDK's submit method. It would be interesting to see how their Oculus compatibility handles that: are they doing a memory copy internally when using a Rift, or have they found a way around it?
- broozar · Honored Guest
"cybereality" wrote:
It seems the only option is to manually copy the texture over every frame.
Thanks for clarifying. Is there a "recommended"/"efficient" way to do this? If so, I'd be grateful for a code snippet :) since the TexSubImage approach did not seem to work for me.
if not, well, you know what's on my wishlist for SDK 1.1...
"kojack" wrote:
Steam's OpenVR sdk allows users to create their own eye render textures and hand them to the sdk's submit method. It would be interesting to see how their oculus compatibility handles that, are they doing a memory copy internally when using a rift? Or have they found a way around it?
how's your performance when doing SteamVR to OVR? Even though I run at 60 FPS, the lag and jitter are unbearable. Tried with the OpenVR 0.9.14 sample scene.
- thewhiteambit · Adventurer
Just copy them with framebuffer rendering in OpenGL, or with the DeviceContext in Direct3D. Drawing fullscreen textured quads also works for both. You cannot avoid this overhead in environments where the render targets can't be created from the Oculus runtime textures. It can get even more complicated if you can't hand over the engine's sRGB rendering; drawing a fullscreen quad can be more feasible in that case. :roll:
- kojack
I've made some progress on this.
From what I can tell, the textures specified in the layer given to ovr_SubmitFrame() are never used.
The usual steps to rendering with the rift:
- make a texture set per eye
- call ovr_CreateSwapTextureSetD3D11() to fill in each texture set (currently it always allocates two textures in the ring buffer)
- make a layer (such as eyefov layer), giving it the texture sets in the ColorTexture array member.
- now call ovr_SubmitFrame every frame, giving it the layer to display.
My attempts to give it an external texture to display involved modifying the texture set pointers before the layer is made. But the textures in the texture sets are never used by ovr_SubmitFrame; the Oculus SDK seems to be keeping an internal pointer to the textures made by ovr_CreateSwapTextureSetD3D11 and is always using them. I even tried replacing the texture pointers inside the layer's texture sets with nulls, and it still rendered the original textures fine.
Is this a bug? The only way to pass the eye textures to the ovr_SubmitFrame call is via the texture sets in the layer object, but it's ignoring whatever we put in there and using a previous hidden value instead.
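[The usual steps listed above, sketched against the 0.8-era D3D11 API from memory; `session`, `device`, `texDesc`, `hmdDesc` and `frameIndex` are assumed to exist, and exact signatures may differ between minor SDK versions:]

```
// One swap texture set per eye.
ovrSwapTextureSet* eyeTex[2] = {};
for (int eye = 0; eye < 2; ++eye)
    ovr_CreateSwapTextureSetD3D11(session, device, &texDesc, 0, &eyeTex[eye]);

// A single eye-fov layer referencing both sets via ColorTexture.
ovrLayerEyeFov layer = {};
layer.Header.Type = ovrLayerType_EyeFov;
for (int eye = 0; eye < 2; ++eye) {
    layer.ColorTexture[eye]       = eyeTex[eye];
    layer.Viewport[eye].Pos.x     = 0;
    layer.Viewport[eye].Pos.y     = 0;
    layer.Viewport[eye].Size.w    = (int)texDesc.Width;
    layer.Viewport[eye].Size.h    = (int)texDesc.Height;
    layer.Fov[eye]                = hmdDesc.DefaultEyeFov[eye];
}

// Every frame: advance each ring buffer, render into the texture at
// CurrentIndex, fill in layer.RenderPose, then submit the layer.
ovrLayerHeader* layers = &layer.Header;
ovr_SubmitFrame(session, frameIndex, nullptr, &layers, 1);
```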