Hi – I’m Tom Heath. I’ve been with Oculus since 2013, and I’m very enthusiastic about VR! Those early days involved a lot of pioneering development work for VR, and I still continue with that to this day, especially exploring locomotion in VR. My role at Oculus includes a lot of time working with developers worldwide to make the best VR products possible – it’s great to work with the array of talent that exists. My background is in the games industry, where I spent 20 years in a variety of senior management, production and programming roles, and I’m also interested in educational gaming products. Really excited to be answering questions and chatting with you today!
Thank you for jumping on! I'm facing a problem with camera movement in a 3rd-person game I'm working on. I'm still incredibly confused about how something like Dispatch on the Rift can handle 1st-person movement with little to no motion sickness, while the Adventure Time 3rd-person VR camera system can still cause sickness. Here is an example of the movement I am working on: https://youtu.be/TYZTLCgrtbw?t=1m48s
Got a slew of questions here, so feel free to answer to any degree.
How has Oculus been embracing locomotion modes internally? What are most people's reactions when trying full locomotion in titles without previous experience? What are you currently experimenting with in terms of locomotion?
What are the top recommended comfort modes that Oculus has been pursuing with free locomotion games? What have you folks found in terms of what works and doesn't? People vary so wildly with their nausea / comfort thresholds so it's obviously not a silver bullet but was hoping you could chime in to give the community some details.
Hello, I'm new to this platform. I'm looking to see how to get more involved with the gaming industry and the development of new-age tech. I'm all about the teachings of VR and its forward-thinking possibilities, but I'm limited by local resources and political views regarding opportunities in my small community, and also by navigating and enrolling with the site and its features, as I don't own any VR or computer equipment that can process these graphics. I know about your new product launch, which doesn't help my situation in terms of affordability or income potential. What is the best recommended avenue for pursuing and maximizing my Oculus experiences and promotional opportunities?
1. I love that Medium allows posting 3D content to Facebook. Are you planning to open up this functionality for devs, so I could include 3D post sharing in my own app?
2. Medium currently forces users to use lighting and shadows in the workspace, and it gives bad results with photogrammetry. Is there any method to use models with shadowless settings, as there is on Sketchfab?
Hello, we're currently stuck on a Gear VR issue that is holding up our research project. We're building in Unity 2017.3, and according to the Profiler we're spending a majority of our CPU time in something called "XR.WaitForGPU()" – do you have any insight into what this means? I believe it is related to our framerate locking at 30fps and frame interpolation, but since we appear to make our 16ms frame budget apart from WaitForGPU(), we're stuck for answers. Are there any known bugs with this? Can it be disabled during development?
I'm not sure if you mean that the video is inherently out of focus, or that it is sharp but located at a particular fixed depth that is perhaps at odds with the 3D content of the scene. Assuming the latter, you can indeed decide at what depth the video is seen by translating the eye images horizontally, closer together or further apart. This at least lets you decide where in space the video is presented, although it still has no depth within it, so I guess that is a limitation. That limitation matters less the further away you present the video, because our eyes have less ability to resolve depth at greater distances.
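The geometry behind that horizontal translation can be sketched with standard stereo parallax math. This is a minimal illustration, not Oculus's implementation: the function name, the symmetric-screen-plane assumption, and the parameter choices are mine.

```python
# Hedged sketch: how far to shift each eye's image horizontally so a flat
# video, drawn on a virtual screen plane, converges at a chosen depth.
# Standard stereo geometry (similar triangles): for a viewer with
# interpupillary distance `ipd` looking at a plane at `screen_dist`, a point
# appearing at `target_dist` needs total on-plane parallax
#     p = ipd * (target_dist - screen_dist) / target_dist,
# split half-and-half between the two eye images (opposite directions).

def per_eye_shift(ipd: float, screen_dist: float, target_dist: float) -> float:
    """Horizontal shift for each eye image (same units as ipd)."""
    if ipd <= 0 or screen_dist <= 0 or target_dist <= 0:
        raise ValueError("ipd and distances must be positive")
    parallax = ipd * (target_dist - screen_dist) / target_dist
    return parallax / 2.0

# If the target depth equals the screen plane, no shift is needed:
print(per_eye_shift(0.064, 2.0, 2.0))   # 0.0
# Pushing the video toward infinity, the per-eye shift approaches ipd / 2
# (parallel gaze), which is why depth cues flatten out at a distance:
print(per_eye_shift(0.064, 2.0, 1e9))
```

A negative result (target nearer than the screen plane) just means shifting the eye images the other way, pulling the video in front of the plane.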