Hi
So let me explain in detail what I'm trying to do.
I'm developing an AR app for Meta Quest. The main gameplay is like this: spawn some static meshes in the scene, save the transform data of each spawned mesh on a server, and then load them back in at the exact locations where they were spawned.
Here is the issue: when I load the static meshes, there is a big offset and nothing is in place anymore. Only the scanned environment (the scene anchors) is correctly placed. No matter where the new boundary is, the anchors always end up in the right spot thanks to Meta's sorcery, and I have no idea what logic is behind that. Anyway, as the next step I decided to use one of those anchors as a reference and store the transforms of the static meshes relative to that persistent anchor.
That's why I came up with using the "inverse transform" node in Unreal. Here is the Blueprint logic:
On save: (static mesh transform) * (inverse of floor anchor transform) ==>> I saved this relative transform to a slot.
And on load I did this:
(saved relative transform) * (floor anchor transform)
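In case it helps to see the same math in text form, here is a minimal C++ sketch of what I'm doing (the function and variable names are just placeholders, not my actual Blueprint variables):

// A minimal sketch, assuming MeshWorldTransform and FloorAnchorTransform are the
// world-space transforms of the spawned static mesh and the scene (floor) anchor.
// (Placeholder names, not actual variables from my Blueprint.)

#include "CoreMinimal.h"

// On save: express the mesh transform relative to the floor anchor.
// GetRelativeTransform(Other) is effectively This * Inverse(Other),
// i.e. the same math as the Blueprint above.
FTransform SaveRelativeTransform(const FTransform& MeshWorldTransform,
                                 const FTransform& FloorAnchorTransform)
{
    return MeshWorldTransform.GetRelativeTransform(FloorAnchorTransform);
}

// On load: compose the saved relative transform with the anchor's current
// world transform to get the mesh back into world space.
// Note that FTransform composition is order-sensitive: A * B applies A first.
FTransform RestoreWorldTransform(const FTransform& SavedRelativeTransform,
                                 const FTransform& FloorAnchorTransform)
{
    return SavedRelativeTransform * FloorAnchorTransform;
}

The composition order matters here; swapping the operands on either side would produce a different world transform.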
Well, it worked... kind of. Let me explain: when I create a new boundary at a different height, say on a staircase, I get an offset on the Z axis that is exactly the height of the stair step I launched the new boundary from. The X and Y locations are fine, though.
And I know I should probably use the spatial anchor system for this, but I'd rather use the existing scanned environment and scene anchors, which are already there in the headset.
I'm pretty sure this can be done: just get the relative transform between the static mesh and the floor anchor and that's it. I think the Blueprint logic I used is wrong somehow.
Any ideas will be greatly appreciated.