Hello Oculus developers! I'm David Borel. I've been in graphics for more than 10 years at NVIDIA, Havok, and zSpace, plus 4 years next month at Oculus, where I lead the Engine Integrations team. We focus on core VR and performance in popular engines like Unity and Unreal. At OC4, I gave a talk with Jian Zhang and Remi Palandri that covered some of what we do, including recent optimizations (https://www.youtube.com/watch?v=pjg309WSzlM).
I'm looking forward to your questions and will answer as many as possible. Let's do this!
Twitter: @Dave_Borel Oculus ID: vrdaveb
There may be a slight delay after you submit your questions. Please only submit once; we've received your message, and it will appear in the thread shortly!
The native Rift and Mobile SDKs have always been developed by separate teams at Oculus. In the early days, the Mobile Unity plugin was actually based on the one we wrote for Rift. However, we have unified the engine integrations (and many other aspects of our developer experience) for Rift and Mobile to help developers transfer skills and projects between the two and to be more efficient internally. One nice thing about having two lines of products is that you can use one to develop for the other.
What are some technical features or best practices you would like to see implemented in most upcoming VR applications in 2018 and beyond that we hadn't seen much of before?
I like the visual style, but it looks like you're using a lot of specular shading, post-processing, alpha blending, and parallax or displacement mapping. That will clearly have a high per-pixel cost, but you may be able to improve it with some profile-guided optimization.
The Oculus Start program may include dev kits for existing and new hardware as well as beta tools and services. Please check out our Oculus Start page to learn more: https://developer.oculus.com/oculus-start/
What are the function calls to bind with HMDLost and HMDAcquired to have a Unity3D application recover gracefully if a user unplugs/plugs back in their USB connection to the headset?
There are a lot of features and even whole SDKs we offer that many apps don't seem to know about. For example, UE4's Stereo Layers and Unity's OVROverlay give you access to powerful compositing features in our PC and mobile SDKs, allowing you to show high-quality UIs and 360 images and videos that are independent of eye buffer rendering. We also offer SDKs for avatars, audio, and social to enhance your sense of presence. All of them have Unity and UE4 integrations, which are available on the Unity Asset Store and our UE4 Github. For best practices, please see the guide: https://developer.oculus.com/design/latest/concepts/book-bp
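As a concrete illustration of the Unity side, here is a minimal sketch of putting a UI texture on a compositor layer. It assumes the Oculus Utilities for Unity package is imported (which provides the OVROverlay component) and that you supply your own UI texture; field names may vary slightly between SDK versions.

```csharp
using UnityEngine;

// Sketch: attach this to a quad in the scene. OVROverlay submits the
// texture directly to the Oculus compositor as a layer, so text and
// UI stay sharp regardless of the eye-buffer resolution.
public class OverlayUiSetup : MonoBehaviour
{
    public Texture uiTexture; // e.g. a pre-rendered UI panel (assumed asset)

    void Start()
    {
        var overlay = gameObject.AddComponent<OVROverlay>();
        // Composite this layer on top of the eye buffers.
        overlay.currentOverlayType = OVROverlay.OverlayType.Overlay;
        overlay.textures = new Texture[] { uiTexture };
    }
}
```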
Unity (and Unreal) apps should already recover gracefully when you unplug and re-plug the Rift. If you want your app to exit VR mode, you can write a function that sets UnityEngine.VR.VRSettings.enabled = false when handling HMDLost, and sets it back to true in HMDAcquired.
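That pattern can be sketched as follows. This assumes the Oculus Utilities for Unity are imported (OVRManager exposes static HMDLost/HMDAcquired events) and targets the Unity 5.x/2017-era UnityEngine.VR namespace; newer Unity versions renamed it to UnityEngine.XR.

```csharp
using UnityEngine;
using UnityEngine.VR; // UnityEngine.XR in later Unity versions

// Sketch: toggles VR mode when the Rift's USB connection is lost
// and restored, so the app keeps running on the monitor in between.
public class HmdConnectionHandler : MonoBehaviour
{
    void OnEnable()
    {
        OVRManager.HMDLost += OnHmdLost;
        OVRManager.HMDAcquired += OnHmdAcquired;
    }

    void OnDisable()
    {
        OVRManager.HMDLost -= OnHmdLost;
        OVRManager.HMDAcquired -= OnHmdAcquired;
    }

    void OnHmdLost()
    {
        // Headset unplugged: exit VR mode.
        VRSettings.enabled = false;
    }

    void OnHmdAcquired()
    {
        // Headset reconnected: re-enter VR mode.
        VRSettings.enabled = true;
    }
}
```

Unsubscribing in OnDisable matters here: static events outlive the component, so a destroyed handler would otherwise still be invoked.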
Thanks for the AMA! It's great to hear from someone who deals with these integrations daily.
I do have a question: I've noticed that a lot of APIs you can access through Unity3D are not yet available for Unreal Engine. Is there a reason for that? Is it because more people use Unity, or because Unreal is harder to integrate with?
We have been doing a lot of integrations ourselves here at VRMonkey, such as a custom IAP plugin to handle purchases, but it would be amazing to know that I can depend on Oculus APIs instead.