I've been experimenting with VR development since getting a Quest 2, and the frustration is growing... I've tried Unity and Unreal (I've worked with, and am familiar with, both) and each has presented a lot of challenges. On balance, though, Unity seems to be by far the more popular toolset (judging by the volume of posts on the Unity- and Unreal-specific forums here), so I've been focusing on that.
The trouble is, Unity doesn't really work for VR on Mac: running in the editor (Play Mode) fails due to missing DLLs and, according to Unity, is unsupported. That removes one of Unity's biggest pluses over Unreal, in my opinion... I've stuck with it anyway, as the Oculus Integration for Unity and related SDKs seem to have much broader support than the Unreal integration. But that's meant switching to Windows, and the only Windows machine I have is my gaming laptop, which is a few years old now.
Even on Windows, though, Unity's in-editor Play Mode only seems to work when sticking to open standards: OpenXR and XR Interaction Toolkit functionality works, but APIs from the Oculus Integration package mostly don't. That means I still end up having to build and deploy to the Quest rather than using Play Mode (making the switch to Windows much less worthwhile). Iteration times are 15-20 minutes for every little change, which is crippling when learning, exploring and prototyping.
I've begun to wonder if the apparent lack of feature parity in the Unreal integration compared to the Unity integration for Oculus is because on Unreal you're working in C++ and, maybe, can interface with the native APIs directly, meaning there's less need to provide UE-specific API wrappers?
What have your experiences been getting started in VR with Oculus?