KovalevBoris
2 months ago · Explorer
Binding objects in reality using spatial anchors
Hi, I'm new to development, so maybe my problem is too simple. But still, thanks in advance for any recommendations!
I'm trying to create a very simple AR app for Meta Quest 3, following the official instructions on Meta's website. However, the result so far leaves much to be desired.
My goal is to anchor an object from a prefab onto a table in the real room. Later I also plan to add UI buttons to the object to enable interactions, but that comes later.
I'm using Unity version 6000.1.6f1. I’ve installed the Meta All-in-One SDK (v77) and decided to start by using the provided Building Blocks. I also installed the OpenXR plugin, as recommended.
In Player Settings, under Active Input Handling, I selected Input System Package.
In XR Plug-in Management, I selected OpenXR > Meta XR Feature Group.
Under OpenXR settings, in Enabled Interaction Profiles, I selected: Oculus Touch Controller Profile, Meta Quest Touch Pro Controller Profile, and Meta Quest Touch Plus Controller Profile.
In OpenXR Feature Groups, I enabled:
- Meta XR Feature,
- Meta XR Foveation,
- Meta Subsampled Layout,
- Meta Quest Anchors,
- Boundary Visibility,
- Bounding Boxes,
- Camera (Passthrough),
- Display Utilities,
- Meshing,
- Occlusion,
- Planes,
- Raycast,
- Session.
In the scene, I added the following Building Blocks: Camera Rig, Passthrough, MR Utility Kit, Find Spawn Positions, Controller Tracking Left, and Controller Buttons Mapper.
I added my prefab object (currently without UI buttons) and assigned it to Find Spawn Positions. I also configured surface labels such as “TABLE” and the other placement-surface settings.
When I run the app, the room scan loads and the object appears as if it’s placed on a table — but the position is different every time, and sometimes the object ends up on the floor instead of the table.
I also tried writing a simple script: when the B button on the controller is pressed, a ray should be cast, and an anchor placed where the ray hits a surface (with my prefab object attached to the anchor). But nothing happened when I tried it, so I abandoned the idea and looked for something ready-made in the Building Blocks.
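For reference, my script looked roughly like the sketch below. I reconstructed it from memory, so the class name and serialized fields are mine, and I'm not sure the anchor usage is correct for SDK v77. It assumes the scene mesh has colliders (e.g. via the Effect Mesh building block) so that a plain Physics.Raycast can hit real surfaces:

```csharp
using UnityEngine;

// Sketch: on B-button press, raycast from the right controller and
// place the prefab where the ray hits, then attach a spatial anchor.
// Assumes the room/scene mesh has colliders for Physics.Raycast to hit.
public class AnchorOnRaycast : MonoBehaviour
{
    [SerializeField] private GameObject prefab;   // object to place
    [SerializeField] private Transform rayOrigin; // right controller anchor
    [SerializeField] private float maxDistance = 5f;

    private void Update()
    {
        // Button.Two maps to B on the right Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.Two))
        {
            var ray = new Ray(rayOrigin.position, rayOrigin.forward);
            if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
            {
                // Spawn aligned to the surface normal.
                var go = Instantiate(prefab, hit.point,
                    Quaternion.FromToRotation(Vector3.up, hit.normal));

                // Adding OVRSpatialAnchor should lock the object
                // to the real-world position (my assumption).
                go.AddComponent<OVRSpatialAnchor>();
            }
        }
    }
}
```

I attached this to an empty GameObject and assigned the right-controller anchor as the ray origin, but nothing was placed when I pressed B.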
Eventually, I found the Instant Content Placement Tool. I added all three recommended objects to the scene: Environment Raycast Manager, Cube, Raycast Visualizer.
However, when I launched the scene, controller-based placement didn’t work: the cube simply appeared at the player’s origin and stayed on the floor, with no movement or interactivity.
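This is how I imagined the Instant Content Placement pieces fitting together. I'm guessing at the API here (I believe EnvironmentRaycastManager exposes a Raycast method against the depth-based environment, but the namespace and hit type may differ in v77):

```csharp
using Meta.XR;      // EnvironmentRaycastManager (assumed namespace)
using UnityEngine;

// Sketch of the behaviour I expected: each frame, raycast from the
// controller against the real environment and move the cube to the hit.
public class PlaceWithEnvironmentRaycast : MonoBehaviour
{
    [SerializeField] private EnvironmentRaycastManager raycastManager;
    [SerializeField] private Transform rayOrigin; // right controller
    [SerializeField] private Transform cube;      // the placed object

    private void Update()
    {
        var ray = new Ray(rayOrigin.position, rayOrigin.forward);
        if (raycastManager.Raycast(ray, out EnvironmentRaycastHit hit))
        {
            cube.position = hit.point;   // follow the surface hit point
            cube.up = hit.normal;        // align to the surface
        }
    }
}
```

If the Building Block is supposed to do this out of the box, I'd love to know what I'm missing in its configuration.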
Of course, I was also hoping to make the placed object stick to real-world surfaces and persist across sessions by saving its anchored position in the room scan — but I understand that this might still be far off.
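For the persistence part, this is the flow I was hoping to implement eventually: save the anchor's UUID after placement, then reload and localize it on the next launch. I know the spatial-anchor API has changed between SDK versions, so method names like SaveAnchorAsync and LoadUnboundAnchorsAsync below are my best guess for a recent SDK, not verified against v77:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Sketch of spatial-anchor persistence: save the anchor UUID locally,
// then on the next session load, localize, and rebind it to a new prefab.
public class AnchorPersistence : MonoBehaviour
{
    [SerializeField] private GameObject prefab;
    private const string UuidKey = "saved_anchor_uuid";

    // Call after placing the object and adding its OVRSpatialAnchor.
    public async void SaveAnchor(OVRSpatialAnchor anchor)
    {
        var result = await anchor.SaveAnchorAsync(); // API name assumed
        if (result.Success)
        {
            PlayerPrefs.SetString(UuidKey, anchor.Uuid.ToString());
            PlayerPrefs.Save();
        }
    }

    // Call on startup to restore the object from the saved anchor.
    public async void LoadAnchor()
    {
        if (!PlayerPrefs.HasKey(UuidKey)) return;
        var uuid = Guid.Parse(PlayerPrefs.GetString(UuidKey));

        var unbound = new List<OVRSpatialAnchor.UnboundAnchor>();
        var result = await OVRSpatialAnchor.LoadUnboundAnchorsAsync(
            new[] { uuid }, unbound); // API name assumed
        if (!result.Success || unbound.Count == 0) return;

        // Localize the anchor, then bind it to a freshly spawned prefab.
        unbound[0].LocalizeAsync().ContinueWith(success =>
        {
            if (!success) return;
            var go = Instantiate(prefab);
            var anchor = go.AddComponent<OVRSpatialAnchor>();
            unbound[0].BindTo(anchor);
        });
    }
}
```

If someone can confirm whether this is the right overall shape for v77, that alone would help a lot.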
Could you please suggest a clear algorithm or sequence of steps for achieving this?
It seemed like placing and saving an object in a room should be a simple task. But after a whole week spent trying to understand the Building Blocks and how everything connects, all I managed was to place the object, and even that inconsistently: its position shifts slightly every time.