Tom Heath - Development Engineer

TomHeath
Protege
Hi – I'm Tom Heath. I've been with Oculus since 2013, and am very enthusiastic about VR! Those early days saw a lot of developing and pioneering for VR, and I still continue with that to this day, especially exploring locomotion in VR. My role at Oculus includes a lot of time working with developers worldwide to make the best VR products possible – it's great to work with the array of talent that exists. My background is in the games industry, where I spent 20 years in a variety of senior management, production and programming roles, and I'm also interested in gaming educational products. Really excited to be answering questions and chatting with you today!
67 REPLIES

TomHeath
Protege
I will look further. Would raw accelerometer etc. data help, in your opinion? Also, I am wondering whether your Oculus sensors exist within the motion of the system, or whether they are static and separate – I would recommend the latter.

TomHeath
Protege
Thanks very much, everyone, for joining me. It's been really fun and interesting. Hopefully I'll see many of you at GDC next month – I'll be giving a talk at the Oculus booth. Carry on with all your great work!! Best wishes, Tom.

luke_sigtrap
Explorer
Hi Tom! I'm lead programmer at a (tiny) studio called Sigtrap, and we made a zero-g 6DoF shooter called Sublevel Zero. We did huuuuge amounts of R&D on sim-sickness and mitigating it, and found that "tunnelling", amongst other things, was a really effective solution. It was great reading your blog post on locomotion research, and good to see that other people are coming to similar conclusions to us.

It was also really interesting reading the sections on "window into the moving world" and "portal into the static world" - they're really useful extrapolations of the tunnelling concept. As such, we're actually about to release a Unity plugin called VR Tunnelling Pro, which draws on both our own research and yours, implementing a raft of different tunnelling methods which we hope will really help devs make comfort features - and player customisation of such - a standard part of VR games.
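For anyone unfamiliar with the technique, here's a minimal sketch of the core tunnelling idea (this is not our plugin's actual API – the component name, `ApplyVignette` hook and tuning values are purely illustrative): drive a vignette's strength from the rig's angular and linear speed each frame.

```csharp
using UnityEngine;

// Minimal tunnelling sketch: fade a vignette in as the rig moves or turns,
// narrowing the effective FOV during artificial locomotion.
public class TunnellingSketch : MonoBehaviour
{
    public Transform rig;                    // the tracked camera rig
    public float angularSensitivity = 0.02f; // vignette strength per deg/sec
    public float linearSensitivity = 0.2f;   // vignette strength per m/sec
    public float fadeSpeed = 4f;             // how fast the effect responds

    Vector3 lastPos;
    Quaternion lastRot;
    float strength;

    void LateUpdate()
    {
        float dt = Mathf.Max(Time.deltaTime, 1e-5f);
        float angSpeed = Quaternion.Angle(lastRot, rig.rotation) / dt; // deg/s
        float linSpeed = (rig.position - lastPos).magnitude / dt;      // m/s
        lastPos = rig.position;
        lastRot = rig.rotation;

        float target = Mathf.Clamp01(angSpeed * angularSensitivity +
                                     linSpeed * linearSensitivity);
        strength = Mathf.MoveTowards(strength, target, fadeSpeed * dt);
        ApplyVignette(strength); // hook this up to your own vignette effect
    }

    void ApplyVignette(float s) { /* e.g. set a shader/material property */ }
}
```

In practice you'd want to smooth the speed estimates and expose the sensitivities as per-player comfort settings.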

So, many thanks for sharing the research! More info on the plugin (and a trailer) here if you're interested:
www.sigtrapgames.com/vtrp/

beep2bleep
Protege
I recently built a very simple Unity game using the SteamVR camera and interaction system. I want to have 6DoF tracking of the headset and to show controllers. I found it worked correctly with Vive/Rift/WMR, but only through SteamVR. Even though it's a simple game with very minimal requirements, it looks like I'll have to build significantly separate code/prefabs to support native Rift and WMR.

Do you (or anyone else in the thread) have any suggestions for a Unity setup that can support native Rift and SteamVR, and possibly WMR, with a single set of prefabs and code base? At a minimum I'd be happy with just having 6DoF tracking of the headset and controllers. I'm happy to do custom handling of the controller inputs.

Everywhere I look has 3 completely different paths for handling the 3 systems.
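One route worth considering – assuming a Unity version that ships the built-in `UnityEngine.XR` layer, and noting that controller buttons/axes would still need per-platform handling – is to read head and controller poses through Unity's device-agnostic tracking API, which reports the same data whether the Oculus, OpenVR or WMR backend is active:

```csharp
using UnityEngine;
using UnityEngine.XR;

// One set of pose queries that works across the native Oculus, OpenVR
// and WMR backends, via Unity's built-in XR tracking API.
public class CrossPlatformTracking : MonoBehaviour
{
    public Transform head, leftHand, rightHand; // visual stand-ins for the devices

    void Update()
    {
        Apply(head, XRNode.Head);
        Apply(leftHand, XRNode.LeftHand);
        Apply(rightHand, XRNode.RightHand);
    }

    void Apply(Transform t, XRNode node)
    {
        // Poses are reported in tracking space, so these transforms
        // should be children of the camera rig's root.
        t.localPosition = InputTracking.GetLocalPosition(node);
        t.localRotation = InputTracking.GetLocalRotation(node);
    }
}
```

This covers the 6DoF tracking part with one code path; controller models and input mapping are where the per-platform differences remain.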

anscarlett
Explorer
Hi Tom,

I have worked in the virtual reality field for the last 15 years, mostly on engineering design but also on training systems. I work with all kinds of HMD devices, both professional and consumer, as well as projection systems ranging from single-screen tracked display walls to full multi-projector, five-sided cube systems.

I am curious as to why, in the new Oculus Home, locomotion is limited to teleporting, when, with Oculus Touch having two analog sticks, it would be trivial to provide a means of walking using two axes, with a third for orientation.

A flying technique can also be simply created by utilising the angular vector returned by one of the controllers to provide a directional component, and an analogue stick for the velocity component.
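The flying scheme just described can be sketched roughly as follows (poses fetched via Unity's built-in XR API; the speed scale and the assumption that the stick's vertical axis is mapped to "Vertical" in the Input Manager are mine, not a prescribed setup):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Pointer-directed flying: the right controller's forward vector gives
// the direction, the thumbstick's vertical axis gives the velocity.
public class FlyingLocomotion : MonoBehaviour
{
    public Transform rig;       // the camera rig root to move
    public float maxSpeed = 3f; // metres per second (arbitrary choice)

    void Update()
    {
        // Direction: where the right-hand controller is pointing,
        // rotated into world space via the rig's orientation.
        Quaternion aim = InputTracking.GetLocalRotation(XRNode.RightHand);
        Vector3 dir = rig.rotation * (aim * Vector3.forward);

        // Velocity: thumbstick vertical axis, signed so pulling back flies
        // backwards along the pointed direction.
        float v = Input.GetAxis("Vertical");

        rig.position += dir * (v * maxSpeed * Time.deltaTime);
    }
}
```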

From my personal experience, I find that most users will stay primarily in one place in their environment, utilising the head's 6DoF tracking as a means to look around, but using virtual locomotion to reposition themselves into a comfortable position and direction.

Do you find things work differently when used in an entertainment system compared to my view from an engineering standpoint?

Thanks,

Adrian.
