02-20-2025 04:09 AM - edited 02-20-2025 04:10 AM
Hi everyone,
I’m developing a mixed-reality application in Unity 6 that demands highly precise tracking. The goal is to position virtual objects at specific world locations and have them remain fixed reliably. My current approach is to create spatial anchors with Meta's spatial anchor system (three of them, using their midpoint as the content origin) and position the virtual objects relative to them.
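A stripped-down sketch of the setup (not my exact code; saving/loading of persisted anchors is left out, and the field names are just illustrative):

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Three OVRSpatialAnchor components placed around the play space,
// with the content origin at their midpoint.
public class AnchoredOrigin : MonoBehaviour
{
    [SerializeField] private Transform[] anchorPoints; // three reference poses in the room
    [SerializeField] private Transform contentRoot;    // all virtual objects live under this

    private readonly List<OVRSpatialAnchor> anchors = new List<OVRSpatialAnchor>();

    private IEnumerator Start()
    {
        // Create one spatial anchor at each reference point.
        foreach (var point in anchorPoints)
        {
            var go = new GameObject("Anchor");
            go.transform.SetPositionAndRotation(point.position, point.rotation);
            var anchor = go.AddComponent<OVRSpatialAnchor>();
            anchors.Add(anchor);

            // Wait until the runtime has actually created the anchor.
            yield return new WaitUntil(() => anchor.Created);
        }

        // Use the midpoint of the three anchors as the origin for all content.
        var midpoint = Vector3.zero;
        foreach (var a in anchors) midpoint += a.transform.position;
        contentRoot.position = midpoint / anchors.Count;
    }
}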
Despite several months of testing, I've noticed a drift of a few centimeters, particularly when moving further away from the anchors and then returning. I'm looking for suggestions on how to address this.
One option I have looked into is an external tracker, specifically the VIVE Ultimate Trackers, but every tracker I've found so far is only supported for PC VR body tracking, not for Quest 3.
Any insights, experiences, or recommendations would be greatly appreciated. Thanks in advance for your help!
02-20-2025 12:12 PM
Sorry I can't help, but I have a project I want to start that requires stable world anchors, so I'm very interested in this issue as well.
I'm guessing your 3-anchor idea was to average out the drift between the three anchors for a more stable combined anchor. It seems to me that would only help if the drift were somewhat random, and it would not be viable if the drift accumulates over time. So I'm very curious: did it actually help noticeably compared to just using one anchor, and how does it hold up over time?
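Just to make sure I'm picturing the same thing, is the averaging roughly like this? (Untested sketch, yaw-only rotation; the field names are made up.)

using UnityEngine;

// Re-derive the content origin from the live anchor transforms every frame,
// so random, uncorrelated jitter in the individual anchors partially cancels out.
// If all three anchors drift together, this won't help.
public class AveragedAnchorOrigin : MonoBehaviour
{
    [SerializeField] private Transform[] anchors;   // the three OVRSpatialAnchor transforms
    [SerializeField] private Transform contentRoot; // virtual content parented here

    private void LateUpdate()
    {
        if (anchors == null || anchors.Length == 0) return;

        // Average the positions.
        var position = Vector3.zero;
        foreach (var a in anchors) position += a.position;
        position /= anchors.Length;

        // Average the headings by averaging forward vectors and rebuilding a yaw-only rotation.
        var forward = Vector3.zero;
        foreach (var a in anchors) forward += a.forward;
        forward.y = 0f;
        var rotation = forward.sqrMagnitude > 0.0001f
            ? Quaternion.LookRotation(forward.normalized, Vector3.up)
            : contentRoot.rotation;

        contentRoot.SetPositionAndRotation(position, rotation);
    }
}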
Meta did say they are releasing passthrough camera access to developers sometime "early" this year, so there may be a way to use fiducial markers or image feature recognition to help keep the origin stable, though we have no idea how accurate that would be.
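If that pans out, the nice part is that the correction step itself is simple once you can get a marker pose from any source; the detection is the open question. Hypothetical sketch of the realignment math (how you obtain detectedMarkerWorldPose is exactly what we don't have yet):

using UnityEngine;

// Realign a drifted content root against one detected fiducial marker:
// rotate and translate the root so the marker's expected pose lands on its detected pose.
public static class MarkerRealign
{
    public static void Realign(Transform contentRoot, Pose expectedMarkerLocalPose, Pose detectedMarkerWorldPose)
    {
        // Where the marker currently sits according to the (possibly drifted) content root.
        var expectedWorld = new Pose(
            contentRoot.TransformPoint(expectedMarkerLocalPose.position),
            contentRoot.rotation * expectedMarkerLocalPose.rotation);

        // Rotation that takes the expected marker orientation onto the detected one.
        var delta = detectedMarkerWorldPose.rotation * Quaternion.Inverse(expectedWorld.rotation);

        // Rotate the content root about the marker, then translate so the marker lines up exactly.
        contentRoot.rotation = delta * contentRoot.rotation;
        contentRoot.position = detectedMarkerWorldPose.position
            + delta * (contentRoot.position - expectedWorld.position);
    }
}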
I will point out that even before the Q3 launched more than a year ago, Meta had been advertising Augments, which were supposed to go live shortly after launch and were basically just virtual objects that stay anchored in a passthrough home environment. However, shortly after launch they were publicly delayed, then delayed again, and it has now been months since I last heard anyone mention them. Given how trivial they would be to implement if world anchors worked correctly, I think it's fair to assume Meta has had technical problems keeping world anchors stationary with their own technology, which doesn't bode well for our chances of solving it. But it also means they are probably actively working on the issue and will hopefully find a solution eventually.
02-20-2025 02:23 PM
I did run a couple of tests, and it seemed that a single anchor drifts more consistently than three anchors with the midpoint used as the starting location. To be honest, there is a lot of randomness in the anchors.
Laser Limbo managed to build some sort of stable tracking for their game. They even have their own depth system, which suggests the tracking is stable enough to support that. When I asked them how they did it, the response was: "Clearing the boundary cache of the Quest can help in general, but it won't solve the problem. However Metas anchors system is just not really good for larger spaces. So we created an own method to realign the space once drift happened. It helps but also is a bit cumbersome."
I was curious about what method they used, but they didn't respond. If you have any ideas, please share them!
Also, could you point me to where you read that passthrough camera access will be released this year? If that's true, it would be really great for my application.
02-20-2025 02:54 PM
It sounds like Laser Limbo's response was just referring to longer-term drift. I have experimented a lot with room scanning, and sometimes the room will stay put for days, then suddenly one day you turn on the headset and your living room scan is 20 meters away, or it's almost correct but rotated 20 degrees off. In that case I could imagine a manual realignment process, like a feature that uses the controllers to translate/rotate it back into place at the start of the play session, being a cumbersome but viable solution. That is also an issue Meta has previously recommended clearing the boundary cache to resolve, so that might be all they are referring to here, but it's hard to say given how vague their response was.
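For what it's worth, the math for that kind of manual realignment is simple. Rough sketch of what I mean (untested; refALocal/refBLocal are two known reference points in the room root's local space, and the touched positions would come from sampling a controller's world position):

using UnityEngine;

// The player touches two known reference points in the room with a controller,
// and we compute the yaw + translation that moves the (drifted) room root back onto them.
public static class ManualRealign
{
    public static void Realign(Transform roomRoot,
                               Vector3 refALocal, Vector3 refBLocal,
                               Vector3 touchedAWorld, Vector3 touchedBWorld)
    {
        // Where the reference points currently are in world space (possibly drifted).
        var currentA = roomRoot.TransformPoint(refALocal);
        var currentB = roomRoot.TransformPoint(refBLocal);

        // Yaw-only rotation aligning the current A->B direction with the touched A->B direction.
        var yaw = Quaternion.FromToRotation(Flatten(currentB - currentA),
                                            Flatten(touchedBWorld - touchedAWorld));

        // Rotate the room root about point A, then translate so A lands on the touched point.
        roomRoot.rotation = yaw * roomRoot.rotation;
        roomRoot.position = touchedAWorld + yaw * (roomRoot.position - currentA);
    }

    private static Vector3 Flatten(Vector3 v)
    {
        v.y = 0f;
        return v.sqrMagnitude > 0.0001f ? v.normalized : Vector3.forward;
    }
}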
The room-scan jumping sounds like a different issue to me than the drift you're describing. Or maybe they're related, but a start-of-session realignment obviously isn't viable for drift that shows up mid-session.
The passthrough dev access was announced at the last Meta Connect event, and they made a brief mention of it in this Meta Connect dev recap blog post (search for "Passthrough Camera Access API"): https://developers.meta.com/horizon/blog/unlock-new-possibilities-in-mixed-reality