Forum Discussion

CptPilot · Honored Guest · 9 years ago

New developer: Tracking only if all in sight

I'm currently building a VR game in Unity, but I'm having an issue with tracking. I'm testing some basic interactions in quick code-test-code bursts, so I haven't been putting the headset on for each test. However, this has led to some weird issues.

Neither controller tracks position or registers input unless the game can see both controllers and the headset. For example, the headset is on my desk while I hold a controller in front of me (where the sensors can see it) so I can test picking up an item.

However, unless the sensors can see the headset, the controller's position isn't tracked (rotation is mapped, but position is not), and unless both Touch controllers can be seen, button presses aren't registered (using the test code Debug.Log(OVRInput.Get(OVRInput.Button.SecondaryHandTrigger));). This seems really odd, as these should all be independent pieces of data (particularly button input).
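For reference, here's roughly the diagnostic I'm running each frame (a minimal sketch; OVRInput.IsControllerConnected and OVRInput.GetControllerPositionTracked come from the Oculus Utilities for Unity, and the component name is just illustrative):

    using UnityEngine;

    // Logs connection, position tracking, and button state for the right
    // Touch controller every frame. Assumes an OVRManager in the scene so
    // OVRInput.Update() runs each frame.
    public class TrackingDiagnostic : MonoBehaviour
    {
        void Update()
        {
            bool connected  = OVRInput.IsControllerConnected(OVRInput.Controller.RTouch);
            bool posTracked = OVRInput.GetControllerPositionTracked(OVRInput.Controller.RTouch);
            bool grip       = OVRInput.Get(OVRInput.Button.SecondaryHandTrigger);

            Debug.Log("RTouch connected: " + connected
                      + ", position tracked: " + posTracked
                      + ", grip: " + grip);
        }
    }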

Given that these are early days for Oculus development, I wanted to check whether this is expected behavior.

Thanks!

1 Reply

  • That is as expected. You'll also likely notice that haptics won't work unless both controllers are active.

    If I don't want to put the headset on when doing a quick test, I cover the proximity sensor inside the headset with my thumb.
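    If you want to verify the haptics behavior, a quick test like this works (a sketch; OVRInput.SetControllerVibration is part of the Oculus Utilities for Unity, and the component name is just illustrative):

        using UnityEngine;

        // Buzzes the right Touch controller while the hand trigger is held.
        // As noted above, the vibration call has no effect unless both
        // controllers are active.
        public class HapticsTest : MonoBehaviour
        {
            void Update()
            {
                if (OVRInput.Get(OVRInput.Button.SecondaryHandTrigger))
                    OVRInput.SetControllerVibration(1.0f, 0.5f, OVRInput.Controller.RTouch);
                else
                    OVRInput.SetControllerVibration(0.0f, 0.0f, OVRInput.Controller.RTouch);
            }
        }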