Forum Discussion
densohax
11 years ago · Explorer
SDK and Abstraction
OK, I'm integrating SDK 0.4.0 into my cross-platform engine, and from the look of it, Oculus decided to implement everything in-house (reusing multi-platform code from the Scaleform classes).
Why is that? Isn't that the worst design pattern in an embedded cross-platform environment?
Wouldn't you prefer giving us handlers for the filesystem/network/renderer instead, so we could plug in our own device implementations?
Honestly, I'd be fine having C strings with the shader code, and the heavy lifting of matrices and the like done on the OVR side, but I'd like to do my own rendering from scratch. Is that still possible with the current SDK? Is that something you'll eventually support, or are we locked into your implementation?
I'll admit I didn't dig very deep, but it would be nice to be able to disable parts of the SDK. I'd like to strip everything GL/DX out of it and simply get life-cycle callbacks and frame-timing information from the SDK, keeping the built-in renderer as a user-friendly option for those who want it, while doing the raw rendering work on my side...
I don't know all the details of the SDK, so I might be mistaken on a couple of points; if so, please let me know the correct way to achieve this. Thanks! (With the previous SDK I was able to do it quite nicely.)
4 Replies
- densohax (Explorer): What I'm really suggesting is a different architecture that is more interface-based.
As in:
  libovr_core -> libovr_renderGL      (optional)
              -> libovr_renderDX11    (optional)
              -> libovr_renderCustom  (interface)
              -> libovr_tracker       (interface)
              -> libovr_io            (interface)
AND
  libovr_uberfriendly
A simple C dispatch table could be used, which we either implement ourselves (libovr_renderCustom) or forward to the OVR implementation (e.g. libovr_renderGL), with our own renderer trampoline in the dispatch if we like. Not only is that more flexible, it also removes useless dependencies and a bunch of ugly #ifdefs from your code.
Into the libovr_io dispatch we could plug our own filesystem / threading / networking implementations.
In my opinion this would be the most sensible way to go if you wanted to head in that direction, and it would be much easier to integrate into existing frameworks.
libovr_uberfriendly would be the full package for people who don't need fine-grained control in their apps.
- rcmaniac25 (Honored Guest): I'm still new to doing any development with the Rift, but the Tiny Room demo seems to have a define, SDK_RENDER, which lets you switch between the C++/SDK rendering (enabled by default) and the mostly C-style OVR API, where you do the rendering yourself.
It might be a good starting point. I too was a bit worried when they said they had implemented a bunch of the rendering components in the SDK; I can't imagine Epic, Valve, Unity and others saying "yeah, we don't have to do the distortion etc. anymore, the SDK takes care of it for us."
- densohax (Explorer): You're right, that's one way to do it, but I still believe they should organize their core into modules, and I have a feeling they will. I think I'll just wait for the next release; I don't want to waste my time adapting my implementation for another incoming API change.
- rcmaniac25 (Honored Guest): Especially since it's still in beta.