I'm one of the scores of people excited for the release of the Quest, and I wanted to start working on a game idea as early as possible. I've seen that Oculus recommends developing for the Quest using a Rift and making sure your game is optimized for mobile hardware. The problem is that my home computer is a MacBook Pro, so it couldn't really drive a Rift even if I owned one.
Is there any way I could pick up a set of Touch controllers and a sensor or two and use those within the Unity editor for developing and testing until the Quest launches? Obviously without an HMD I wouldn't be able to get the full effect, but it's the lack of 6DoF input that's holding me back right now. Does it even work that way, or do the Touch controllers actually communicate with the HMD directly, with the sensors just acting as corrective input or something?
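For reference, the kind of 6DoF polling I'd want to prototype in the editor is roughly this (a minimal sketch using Unity's legacy UnityEngine.XR.InputTracking API; whether it reports controller poses in the editor without an HMD attached is exactly what I'm unsure about):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs the 6DoF pose of the left Touch controller each frame.
// Assumption: InputTracking reports controller poses in the editor
// even with no HMD connected -- that's the part I'm asking about.
public class TouchPoseLogger : MonoBehaviour
{
    void Update()
    {
        // Position and rotation relative to the tracking-space origin.
        Vector3 pos = InputTracking.GetLocalPosition(XRNode.LeftHand);
        Quaternion rot = InputTracking.GetLocalRotation(XRNode.LeftHand);

        Debug.Log("Left Touch position: " + pos + "  rotation: " + rot.eulerAngles);
    }
}
```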
Any insight into this would be greatly appreciated.