Forum Discussion
NervousEnergy
Explorer · 6 years ago
Quest development on macOS
Short version: Is anyone doing Quest development on macOS with Unity or Unreal, and if so, can you provide basic guidance on getting your dev environment set up?
Long version: My Quest is in the mail and I'd like to start exploring development options. I'm getting started in Oculus development and would like to avoid buying a Windows machine or relying exclusively on Parallels (or a Windows VM). While I've found some bare-bones guidance on working with Unity and Oculus on a Mac, it's vague as to whether this will be applicable for the Quest.
If this is a simple RTFM [and here’s a link to it] then that would be appreciated too! A few solid directions should be all I need (famous last words).
4 Replies
- Schneider21 (Expert Protege)
I've done Go development from my Mac, and Quest will function identically (I'll be doing Quest dev as well).
In short: it's a bit of a pain in the ass. But it's doable.
The setup is simple enough, and there are a number of guides available to follow to get things configured properly. I use Unity, so I'll be speaking in terms of that, but Unreal would be similar, just... a bit different.
You'll need to have the Android SDK, NDK, and a JDK (Java Development Kit) installed. If you're using Unity 2019, I believe the option is now provided to install and manage them through Unity itself, which is nice. Since you're posting here, I assume you already have the developer account. When you pair the headset with the phone app, there'll be an option to enable developer mode, which you'll want to turn on.
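Once those pieces are installed, a quick terminal check confirms they're visible and that the headset is reachable. This is just a sanity-check sketch; the version numbers these commands print will vary with your install:

```shell
# Confirm the toolchain pieces are on your PATH:
java -version        # the JDK used by the Android build tools
adb version          # adb ships with the Android platform-tools

# With developer mode enabled and the Quest connected over USB,
# it should appear in the device list:
adb devices
# A device listed as "unauthorized" means you still need to put the
# headset on and accept the "Allow USB debugging" prompt.
```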
There are a number of settings you'll want to configure to ensure the best performance on the mobile headset: things like ASTC texture compression, single-pass (multiview) stereo rendering, and the mobile-oriented quality settings from Oculus's guides. Do all of those.
Honestly, the worst part of this setup is that you can't preview inside the editor, meaning you have to build to the device each time you want to test anything. This gets old quickly when you make one small change to try to figure out why your camera rig is a meter higher than it should be, only to realize the setting you changed didn't fix it. So you pull the headset off, mess with your settings, build again, wait a few minutes for it to finish, pop the headset on, and check again. It's tedious and much slower than it'd be if you had a Rift working on your development machine.
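You can shave a little off that loop by reinstalling and relaunching from the terminal instead of hunting for the app in the headset's library each time. A rough sketch, assuming adb is installed; the APK path and package name below are placeholders for your own project's values:

```shell
# Placeholders -- substitute your project's actual APK path and package name.
APK="Builds/MyQuestApp.apk"
PKG="com.example.myquestapp"

# Reinstall the fresh build over the old one (-r), then relaunch it:
adb install -r "$APK"
adb shell am start -n "$PKG/com.unity3d.player.UnityPlayerActivity"
```

`com.unity3d.player.UnityPlayerActivity` is the default activity name for a Unity Android build; if you've customized your manifest, substitute your own activity.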
The second worst part is not being able to live-debug things. When you're running in the editor, you can peek behind the curtain to see why stuff isn't working. That's a hell of a lot harder to do when you're forced to run on the actual device.
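You do still get some visibility into a running build through the device log. A small sketch, assuming adb is on your PATH: Unity's Android player tags its log output (including `Debug.Log` messages) with `Unity`, so you can filter on that:

```shell
# Clear out the old log, then stream only Unity's messages from the
# headset while your build runs:
adb logcat -c
adb logcat -s Unity
```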
This alone is making me crunch the numbers to see if I can get a gaming laptop and a Rift S. I want to make something awesome for the Quest, but I don't have the patience to see a big project through the whole way doing this back-and-forth, with all the guesswork involved in debugging. That said, if it's your only option, it's certainly doable; just be prepared to have your patience tested.
- NervousEnergy (Explorer)
Thank you! This is exactly what I needed to find my footing. I appreciate the additional advice, and those links. My guess is I'll see how far I get with the back-and-forth and how much of a pain point that becomes versus how much I'm capable of building.
I may take the opportunity to document my efforts.
- Schneider21 (Expert Protege)
For what it's worth, Matt Conte indicated in his recent AMA (in response to yours truly) that, in the near future, developing for mobile standalone without having a Rift may become less painful:
...improving iteration time natively on-device is one of our top priorities this year (once we get the hardware out the door!). I don't have anything I can talk about right at this moment, but we are heavily focused on making this more palatable during development. There's a lot of tips and tricks we'd like to share once things go public, and the forums are a great resource for things like working around long shader variant compiles. We're working on it!
This doesn't give much of an indication of exactly what they're working on, but my sincere hope (which I mentioned in the question) was a remote-streaming capability that would let you stream play mode to the headset, albeit at reduced resolution and framerate and with some added latency. This would be similar to how the Unity Remote app works on Android, so it seems doable from my limited viewpoint.
To be clear, this isn't what he promised, and is just what I'm hoping for. But keep your eyes peeled for whatever it is they end up doing.
Good luck, and enjoy your Quest when it arrives! Make something cool for it!
- Jattier (Protege)
I have had (almost) no trouble making a project and deploying to the Quest using Unity and a MacBook Pro.
I have not been able to figure out how to do the same thing with Unreal. It doesn't seem like it's possible to enable the Oculus plugin.