Tom Heath - Development Engineer

TomHeath
Level 4
Hi – I'm Tom Heath. I've been with Oculus since 2013, and I'm very enthusiastic about VR! Those early days involved a lot of pioneering VR development, and I continue that work to this day, especially exploring locomotion in VR. My role at Oculus includes a lot of time working with developers worldwide to make the best VR products possible – it's great to work with the array of talent out there. My background is in the games industry, where I spent 20 years in a variety of senior management, production, and programming roles; I'm also interested in educational gaming products. Really excited to be answering questions and chatting with you today!
67 REPLIES

TomHeath
Level 4
We haven't announced those details yet, but stay tuned to our website for more info as it becomes available!

Hey @TomHeath

Thank you for jumping on!
I'm facing a problem with camera movement in a third-person game I'm working on.
I'm still quite confused about how something like Dispatch for the Rift can handle first-person movement with little to no motion sickness, while the Adventure Time third-person VR camera system can still cause sickness.
Here is an example of the movement I am working on:
https://youtu.be/TYZTLCgrtbw?t=1m48s

Any thoughts on this Tom?

Thanks!

DG-Bryan
Level 2
Hi Tom!

Got a slew of questions here, so feel free to answer to any degree.

How has Oculus been embracing locomotion modes internally? What are most people's reactions when trying out full locomotion on titles without previous experience? What are you currently experimenting with in terms of locomotion?

What are the top recommended comfort modes that Oculus has been pursuing for free-locomotion games? What have you folks found in terms of what works and what doesn't? People vary so wildly in their nausea/comfort thresholds that there's obviously no silver bullet, but I was hoping you could chime in and give the community some details.

Thanks!

howardslaw
Level 2
Hello, I'm new to this platform. I'm looking to see how to get more involved with the gaming industry and the development of new-age tech. I'm all about the teachings of VR and its forward-thinking possibilities. I'm limited by local resources and political views when it comes to opportunities in my "small" community, and also in navigating and enrolling with the site and its features, as I do not own any VR or computer equipment that can process these graphics. I know about your new product launch, which doesn't help my situation in terms of affordability/income potential. What is the best recommended "avenue" for pursuing/maximizing my "Oculus" experiences and promotional opportunities?

nigel_budden
Level 2
Hi Tom, thanks for doing this AMA. I'm curious about when deep linking will be available within the Rift ecosystem. I see it's been made available for Gear VR, and I'm hoping Rift won't be far behind.

Adam123321
Level 3
Questions about Medium and 3D posts on Facebook:
1. I love that Medium allows posting 3D content to Facebook. Are you planning to open up this functionality for devs, so I could include 3D post sharing in my own app?

2. Medium currently forces users to use lighting and shadows in the workspace, which gives bad results with photogrammetry.
Is there any way to use models with shadowless settings, as on Sketchfab?

Cheers !

justiceinc
Level 4
Hello, we're currently stuck on a Gear VR issue that is holding up our research project. The app is built in Unity 2017.3, and according to the Profiler we're spending a majority of our CPU time in something called "XR.WaitForGPU()" – do you have any insight into what this means? I believe it's related to our frame rate locking at 30 fps and frame interpolation, but since we appear to hit our 16 ms frame time apart from WaitForGPU(), we're stuck for answers. Are there any known bugs around this? Can it be disabled during development?

Adam123321
Level 3
And one more question: in the presentation at OC4 we saw 3D posts with some interactive elements, like the car doors in this example: pic.twitter.com/kqOXHp93zE

Is this feature already available in Medium? If it is, where can I find info about it, and if it's not, is there any way to achieve this effect right now?




TomHeath
Level 4
Hi - alas not one of my areas of expertise, but I've noted your issues, and I'll flag them with my colleagues to look into them further. The support team will be in touch.

TomHeath
Level 4
I'm not sure if you mean that the video is inherently out of focus, or if it is located at a particular fixed depth (but sharp) that is perhaps at odds with the 3D content of the scene. Assuming the latter: yes, you can decide at what depth the video is seen by translating the two eye images horizontally, closer together or further apart. This lets you decide where in space the video plane sits, though the video itself still has no depth within it, so that remains a limitation. The limitation matters less the further away you present the video, because our eyes are less able to resolve depth at greater distances.
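The horizontal-translation trick Tom describes can be sketched numerically. This is a toy Python calculation, not anything from the Oculus SDK: it assumes a simple pinhole model with parallel per-eye projections, and the IPD and focal-length numbers are hypothetical placeholders.

```python
# Toy model of placing a flat stereo video quad at a chosen depth by
# translating the two eye images horizontally (hypothetical numbers).

IPD = 0.064  # interpupillary distance in metres (a typical adult value)

def eye_image_offset(depth_m, focal_m=0.04):
    """Horizontal shift (in metres on the virtual image plane) to apply
    to EACH eye image, in opposite directions, so the quad converges at
    `depth_m`. Uses the pinhole relation: disparity = focal * IPD / depth."""
    disparity = focal_m * IPD / depth_m  # total horizontal disparity
    return disparity / 2.0               # split evenly between the eyes

# The required shift shrinks linearly as the target depth grows, which
# illustrates Tom's point: distant content needs almost no disparity,
# so depth errors matter less the further away the video is presented.
near = eye_image_offset(1.0)   # quad placed 1 m away
far  = eye_image_offset(10.0)  # quad placed 10 m away -> 10x smaller shift
print(near, far)
```

A real implementation would apply this offset when rendering the video quad into each eye buffer (or, in Unity with the Oculus integration, by adjusting the per-eye placement of an overlay quad), but the depth relationship is the same.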