Meta XR Simulator Standalone Help
I'm an educator teaching Unity & XR development using Quest 3 and Meta Building Blocks, but I've been really struggling with how inconsistent the learning materials are: Unity's own documentation and content-creator tutorials from even just 3 months ago (let alone a year) no longer match the current tooling.
The most pressing issue in my class is the lack of updated documentation and examples for the new standalone version of the Meta XR Simulator. Half the documents in the official Meta XR Simulator Overview are from 2024 and show the old interface (which had WAY more features and customization options). I have a bunch of students relying on mouse-and-keyboard controls to test behaviors like the Locomotion building block, and they simply don't work.
Current issues I would love suggestions or hints on (all from simply importing Building Blocks into a Unity Core 3D scene, with nothing customized yet):
- I see duplicate controller models and ghosting (only in the simulator, and only visible when moving)
- I sometimes get weird graphical glitches that look like snow or static (only in the Unity Game view and the simulator window while the simulator is running)
- I cannot get rays or aiming reticles to originate from the controllers, no matter where the controllers or my mouse are pointing, even with point-and-click enabled (they work fine in the headset)
- Do the movement inputs (WASD and the arrow keys by default) simulate the left and right joysticks, or do they override/bypass those inputs entirely?
- Some teleport control options involve aiming and then pushing up on the joystick, and I'm not sure how to test that gesture in the simulator
- Is there a way to add simulated input options that actually trigger the controller's inputs, the way the Unity-package version of the simulator used to allow?
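For context on the joystick question: this is roughly the kind of minimal debug script I've been attaching to a GameObject to check whether the simulator's keyboard input ever reaches the thumbstick axes. It's just a sketch assuming the Meta XR Core SDK's `OVRInput` API (the class name `ThumbstickDebugLogger` is my own); if anyone has a better way to inspect simulated input, I'm all ears.

```csharp
using UnityEngine;

// Sketch of a debug helper: logs both thumbstick axes each frame so we can
// see whether the simulator's WASD / arrow-key input reaches OVRInput at all.
public class ThumbstickDebugLogger : MonoBehaviour
{
    void Update()
    {
        // OVRInput.Axis2D reads the left (Primary) and right (Secondary) thumbsticks.
        Vector2 left = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector2 right = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);

        if (left != Vector2.zero || right != Vector2.zero)
            Debug.Log($"Left stick: {left}  Right stick: {right}");
    }
}
```

In my tests so far the console stays silent while WASD moves the camera, which is what makes me suspect the standalone simulator moves the rig directly rather than feeding the thumbstick inputs, but I'd love confirmation either way.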
I would also appreciate any general advice or resources on recent best practices, customization options, and debugging tips for the Building Blocks and the Interaction SDK.