How to geometrically align a 3D model to real furniture?
For my video-see-through AR application I want a model to be placed automatically on the largest detected table surface. I have achieved the first step: the model is correctly positioned at the table center. However, the orientation is not correct. I want the model to "lie down" on the table, meaning resting on its back, facing up, with the longest side of the model oriented along the longest side of the table — see the following pictures. After a very long time trying, I could not figure out how to align the model correctly. If you have any idea, hint, or clue I could try, please let me know.

Asset used for testing: Low Poly Human. I added an Empty (called ModelPlacer), attached the script below to it, and dragged this asset into the modelPrefab field.

Meta XR Building Blocks used:
- Camera Rig
- Passthrough
- MR Utility Kit
- Scene Debugger
- Effect Mesh
- Hand Tracking
- Controller Tracking

Technical specifications:
- Unity 2022.3.50f1
- Headset: Meta Quest 3
- Meta MR Utility Kit

Code:

```csharp
using Meta.XR.MRUtilityKit;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ModelPlacer : MonoBehaviour
{
    public GameObject modelPrefab;
    private GameObject modelInstance;

    void Start()
    {
        MRUK.Instance?.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        SpawnModel();
        AlignModelWithSurface();
    }

    public void SpawnModel()
    {
        Vector3 spawnPosition = new Vector3(0.0f, 1.0f, -1.0f);
        modelInstance = Instantiate(modelPrefab, spawnPosition, Quaternion.identity);
    }

    public void AlignModelWithSurface()
    {
        var largestSurface = MRUK.Instance?.GetCurrentRoom()?.FindLargestSurface(MRUKAnchor.SceneLabels.TABLE);
        if (modelInstance != null)
        {
            if (largestSurface != null)
            {
                modelInstance.transform.SetParent(largestSurface.transform);
                Renderer modelRenderer = modelInstance.GetComponent<Renderer>();
                modelInstance.transform.rotation = Quaternion.Euler(-90, 0, 0);
                Vector3 modelCenter = modelRenderer.bounds.center;
                Vector3 surfaceCenter = largestSurface.transform.position;
                Vector3 positionOffset = surfaceCenter - modelCenter;
                Vector3 adjustedPosition = modelInstance.transform.position + positionOffset;
                modelInstance.transform.position = adjustedPosition;
            }
            else
            {
                Debug.LogWarning("No surface found.");
            }
        }
        else
        {
            Debug.LogWarning("modelInstance is null.");
        }
    }
}
```
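One way to approach the alignment question above — a sketch, not a tested MRUK recipe: instead of the hard-coded `Quaternion.Euler(-90, 0, 0)` (which rotates in world space and ignores the table's orientation), build the rotation from the surface's own axes, then snap the model onto the tabletop. The helper below assumes the model's local +Z is its front (so `Quaternion.LookRotation` with the surface normal as forward lays it on its back, face up), that its local +Y is its longest axis (head-to-toe for a humanoid), that `surface.up` is the table normal, and that `longSideDir` is a world-space direction along the table's longer edge (derive it from the anchor's plane extents; how those are exposed depends on your MRUK version):

```csharp
using UnityEngine;

public static class SurfaceAlignment
{
    // Lays 'model' flat on 'surface', face up, with its assumed longest
    // local axis (+Y) running along 'longSideDir' (a unit vector along the
    // table's long edge, perpendicular to the surface normal).
    public static void LayOnSurface(GameObject model, Transform surface, Vector3 longSideDir)
    {
        // Map the model's local +Z (its front) onto the surface normal and
        // its local +Y onto the table's long side.
        model.transform.rotation = Quaternion.LookRotation(surface.up, longSideDir);

        // Center the model's rendered bounds on the surface center.
        Renderer r = model.GetComponentInChildren<Renderer>();
        model.transform.position += surface.position - r.bounds.center;

        // Lift it by half its thickness along the normal so it rests on top
        // of the table instead of intersecting it. The extent of a world-space
        // AABB along an arbitrary unit normal n is |n.x|*e.x + |n.y|*e.y + |n.z|*e.z.
        Vector3 n = surface.up;
        Vector3 e = r.bounds.extents;
        float half = Mathf.Abs(n.x) * e.x + Mathf.Abs(n.y) * e.y + Mathf.Abs(n.z) * e.z;
        model.transform.position += n * half;
    }
}
```

If the model's longest axis turns out to be X rather than Y, rotate it a further 90° around `surface.up` afterwards; comparing the mesh's local bounds tells you which axis is actually longest.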
Expanding Scene Actors

Hi there, I'd like to expand the use of Scene actors so that Scene elements can be replaced with any Blueprint instead of being limited to static meshes (e.g. I want to replace the user's table with a table BP that I can control to open drawers, place things on it, etc.). I'm hitting a wall: the provided OculusXRSceneActor is not extendable (nothing in it is virtual), and I can't duplicate it because it uses API entry points that are in the private section of the plugin (specifically the OculusXRAnchorManager set of static functions). At this point I feel I would have to switch to building from source to get what I want (which I'd rather avoid at this stage), because nothing is possible beyond the (quite restricted) options showcased in the demo projects. Another thing I could do is spawn the anchors as they are by default (with an empty static mesh), then collect them and spawn my own BPs based on their location/scale, but that is quite dirty to begin with, and since there are no callbacks to hook into to know when the scene is done populating, I would have to check on tick, which makes it dirtier and less performant. Any help or insight for me? Would it be possible to edit the plugin to make some things accessible without compiling the whole source? Thanks!
Presence Platform Scene Manager SceneModelLoadedSuccessfully not being called

Hello, I'm trying to get Scene data, so following the tutorials I wrote a script that waits for the scene to be loaded and then iterates over the scene anchors. It worked in one project. I took this C# script, moved it to another project, duplicated the bouncing-ball demo scene, and added my script there. My script points to the OVRSceneManager that already exists in the bouncing-ball scene. In the logs, with verbose on, I can see the scene being loaded and finishing successfully via the OVRSceneManager object, but SceneModelLoadedSuccessfully is never called on my object. This is my code:

```csharp
if (_sceneManager == null)
{
    _sceneManager = FindObjectOfType<OVRSceneManager>();
}
VRDebugger.Log("Scene Manager " + _sceneManager);
_sceneManager.SceneModelLoadedSuccessfully += SceneModelLoaded;
```

What am I doing wrong?
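A common cause worth ruling out here is a subscription race: if the script above runs in Start (or later) and OVRSceneManager has already fired its load event, the handler never runs. Subscribing as early as possible, in Awake, and unsubscribing on destroy is the usual pattern — a sketch, assuming the stock OVRSceneManager component with its SceneModelLoadedSuccessfully event:

```csharp
using UnityEngine;

public class SceneLoadListener : MonoBehaviour
{
    [SerializeField] private OVRSceneManager _sceneManager;

    private void Awake()
    {
        // Resolve and subscribe in Awake, before OVRSceneManager's own
        // startup logic can raise the event.
        if (_sceneManager == null)
        {
            _sceneManager = FindObjectOfType<OVRSceneManager>();
        }
        _sceneManager.SceneModelLoadedSuccessfully += OnSceneModelLoaded;
    }

    private void OnDestroy()
    {
        if (_sceneManager != null)
        {
            _sceneManager.SceneModelLoadedSuccessfully -= OnSceneModelLoaded;
        }
    }

    private void OnSceneModelLoaded()
    {
        Debug.Log("Scene model loaded; safe to iterate OVRSceneAnchor objects now.");
    }
}
```

Also worth checking: that the listener's GameObject is active in the scene, and that the duplicated scene doesn't contain a second OVRSceneManager — `FindObjectOfType` may return a different instance than the one actually loading the scene.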
Hide OVRScene object guardian outlines

I have an AR app where users interact with surfaces, but distracting outlines of the OVRSceneAnchor objects get in the way. I don't want to have to ask users to enable developer mode and disable their Guardian. Is there some way I can turn these outlines off? The user already knows something is there because passthrough is enabled.
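If the outlines come from renderers on the spawned scene-anchor prefabs (rather than from the system Guardian itself, which an app cannot disable), a blunt but workable approach is to switch those renderers off once the scene model has loaded — a sketch under that assumption:

```csharp
using UnityEngine;

public class SceneOutlineHider : MonoBehaviour
{
    // Call this after the scene model has finished loading (e.g. from a
    // SceneModelLoadedSuccessfully handler).
    public void HideAnchorVisuals()
    {
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            // Disable every renderer under the anchor: mesh outlines,
            // line renderers, debug quads, and so on.
            foreach (var r in anchor.GetComponentsInChildren<Renderer>())
            {
                r.enabled = false;
            }
        }
    }
}
```

Only the visuals are touched — colliders on the anchors stay enabled, so surface interaction keeps working.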
MixedReality recentered pose problem

Hi, we are developing a mixed reality experience that uses the Scene Model to set up the user's room. We use the virtual model that the Scene Model instantiates to calculate the distance between the player and the room walls, and we need the player to be in a position that satisfies specific distance rules. If the player is in an incorrect position, we want them to recenter. However, when the user uses the recenter option, our virtual model is updated and the virtual walls no longer match the positions of the real walls. For instance, in the image above, the left side shows the player's initial position, represented by the circle with the arrow (the arrow is the direction the user is looking). The black rectangle represents the real room walls and the blue one the virtual room walls. As the image shows, when the user moves in the room and recenters, the virtual room walls update their transform, generating an undesirable situation. We want to know if there is a configuration in the OVRSceneManager component, the OVRSceneAnchor component, or any other Oculus Unity SDK component to avoid this behaviour.
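One option to explore — a sketch, not an official recipe: the Oculus Integration exposes a recenter notification via `OVRManager.display.RecenteredPose`. Handling it lets the app re-align its virtual room immediately after the tracking space shifts, e.g. by re-querying the scene anchors (which stay locked to the physical room) and snapping the virtual wall model back onto them:

```csharp
using UnityEngine;

public class RecenterHandler : MonoBehaviour
{
    private void OnEnable()
    {
        // RecenteredPose fires after the user recenters the tracking space
        // (system menu or long press on the Oculus button).
        OVRManager.display.RecenteredPose += OnRecentered;
    }

    private void OnDisable()
    {
        OVRManager.display.RecenteredPose -= OnRecentered;
    }

    private void OnRecentered()
    {
        // Re-align the virtual room here: re-resolve the scene anchors and
        // move the virtual walls back onto their (unchanged) real positions.
        Debug.Log("Tracking space was recentered; re-aligning virtual room.");
    }
}
```

The re-alignment step itself depends on how the room model was built; the key point is that the spatial anchors remain attached to the physical room, so they are the reference to restore from.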
Mixed reality only works in the first scene

When a mixed reality scene element (OVRSceneAnchor) is destroyed, it gets ignored for the whole game session. Opening mixed reality a second time in another Unity scene is impossible, or very laborious. This is caused by recent changes to OVRSceneAnchor.cs in the Unity Integration SDK. The problem is reproducible on v48 and v49, but not on v43. As a workaround I removed the line `DestroyedSceneAnchors.Add(this.Uuid);` at OVRSceneAnchor.cs line 209. But maybe there is something I'm missing?
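An alternative to patching the SDK file, sketched under the assumption that the destroyed-anchor bookkeeping only runs when the anchor objects are actually destroyed: keep the anchors alive across Unity scene loads instead of letting the engine destroy them, and merely deactivate them while they are not needed. Note that `DontDestroyOnLoad` only works on root objects, hence the `transform.root` below; whether this fits depends on how your anchor hierarchy is organized.

```csharp
using UnityEngine;

public static class SceneAnchorPreserver
{
    // Call before loading the next Unity scene: hides the scene anchors and
    // keeps them (and their root hierarchy) alive across the scene load, so
    // OVRSceneAnchor's static destroyed-anchor list never records their UUIDs.
    public static void PreserveAnchorsAcrossScenes()
    {
        foreach (var anchor in Object.FindObjectsOfType<OVRSceneAnchor>())
        {
            anchor.gameObject.SetActive(false);
            Object.DontDestroyOnLoad(anchor.transform.root.gameObject);
        }
    }
}
```

Re-activate the anchors (or destroy the preserved hierarchy deliberately) once the new scene has set up its own mixed reality content.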