How to geometrically align a 3D model to real furniture?
For my video-see-through AR application I want a model to be placed automatically on the largest detected table surface. I have achieved the first step: the model is correctly positioned at the table center. However, the orientation is not correct. I want the model to "lie down" on the table, meaning it rests on its back, facing up, with its longest side aligned with the longest side of the table - see the following pictures:

After trying for a very long time, I could not figure out how to align the model correctly. If you have any idea, hint, or clue I could try, please let me know.

Used asset for testing: Low Poly Human. I added an Empty (called ModelPlacer), attached the script below to it, and dragged this asset into the modelPrefab field.

Used Meta XR Building Blocks:
- Camera Rig
- Passthrough
- MR Utility Kit
- Scene Debugger
- Effect Mesh
- Hand Tracking
- Controller Tracking

Technical specifications:
- Unity 2022.3.50f1
- VR glasses: Meta Quest 3
- Meta MR Utility Kit

Code:

```csharp
using Meta.XR.MRUtilityKit;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ModelPlacer : MonoBehaviour
{
    public GameObject modelPrefab;
    private GameObject modelInstance;

    void Start()
    {
        MRUK.Instance?.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        SpawnModel();
        AlignModelWithSurface();
    }

    public void SpawnModel()
    {
        Vector3 spawnPosition = new Vector3(0.0f, 1.0f, -1.0f);
        modelInstance = Instantiate(modelPrefab, spawnPosition, Quaternion.identity);
    }

    public void AlignModelWithSurface()
    {
        var largestSurface = MRUK.Instance?.GetCurrentRoom()?.FindLargestSurface(MRUKAnchor.SceneLabels.TABLE);
        if (modelInstance != null)
        {
            if (largestSurface != null)
            {
                modelInstance.transform.SetParent(largestSurface.transform);
                Renderer modelRenderer = modelInstance.GetComponent<Renderer>();
                modelInstance.transform.rotation = Quaternion.Euler(-90, 0, 0);
                Vector3 modelCenter = modelRenderer.bounds.center;
                Vector3 surfaceCenter = largestSurface.transform.position;
                Vector3 positionOffset = surfaceCenter - modelCenter;
                Vector3 adjustedPosition = modelInstance.transform.position + positionOffset;
                modelInstance.transform.position = adjustedPosition;
            }
            else
            {
                Debug.LogWarning("No surface found.");
            }
        }
        else
        {
            Debug.LogWarning("modelInstance is null.");
        }
    }
}
```
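One direction worth trying (a sketch, not from the original post): express the "lie on the back" rotation relative to the table anchor instead of world space, then yaw the model around the table normal until its longest bounds axis lines up with the table's longest side. `AlignmentHelper` and its parameters are hypothetical; whether the anchor's `up` really is the surface normal, and where `tableSize` comes from (e.g. an MRUK plane rect, if your version exposes one), are assumptions to verify against your SDK version.

```csharp
using UnityEngine;

public static class AlignmentHelper
{
    // Sketch: lay the model on its back on the table, then yaw it so its
    // longest axis matches the table's longest side.
    // Assumptions: table.up is the surface normal; tableSize holds the
    // surface's 2D extents along table.right (x) and table.forward (y).
    public static void AlignLongestSides(Transform model, Renderer modelRenderer,
                                         Transform table, Vector2 tableSize)
    {
        // 1. Same "-90 on X" idea as the question, but relative to the table,
        //    so the model lies flat even if the table anchor is rotated.
        model.rotation = table.rotation * Quaternion.Euler(-90f, 0f, 0f);

        // 2. Table's long axis in world space.
        Vector3 tableLongAxis = tableSize.x >= tableSize.y ? table.right : table.forward;

        // 3. Model's longest local bounds axis, mapped to world space
        //    (localBounds is in the renderer's own transform space).
        Vector3 size = modelRenderer.localBounds.size;
        Vector3 localLong = Vector3.right;
        if (size.y >= size.x && size.y >= size.z) localLong = Vector3.up;
        else if (size.z >= size.x && size.z >= size.y) localLong = Vector3.forward;
        Vector3 modelLongAxis = modelRenderer.transform.TransformDirection(localLong);

        // 4. Project both axes onto the table plane, then yaw around the normal.
        Vector3 up = table.up;
        Vector3 a = Vector3.ProjectOnPlane(modelLongAxis, up).normalized;
        Vector3 b = Vector3.ProjectOnPlane(tableLongAxis, up).normalized;
        model.Rotate(up, Vector3.SignedAngle(a, b, up), Space.World);
    }
}
```

Because both candidate orientations (long axis pointing either way along the table) are equivalent for alignment, the signed angle may rotate the model 180° from what you expect; flipping `tableLongAxis` handles that case.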
Presence Platform Scene Manager SceneModelLoadedSuccessfully not being called

Hello, I'm trying to get Scene data, so following the tutorials I wrote a script that waits for the scene to be loaded and then iterates over the SceneAnchors. It worked in one project. I took this C# script, moved it to another project, duplicated the bouncing ball demo scene, and added my script there. My script points to the OVRSceneManager that already exists in the bouncing ball scene. In the logs, with verbose on, I can see the scene being loaded and finishing successfully via the OVRSceneManager object, but SceneModelLoadedSuccessfully is not being called on my object. This is in my code:

```csharp
if (_sceneManager == null)
{
    _sceneManager = FindObjectOfType<OVRSceneManager>();
}
VRDebugger.Log(" Scene Manager" + _sceneManager);
_sceneManager.SceneModelLoadedSuccessfully += SceneModelLoaded;
```

What am I doing wrong?
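A common cause of this symptom (a guess, not a confirmed diagnosis) is a race: if the scene model finishes loading before the subscription line runs, the event has already fired. Subscribing as early as possible, e.g. in `Awake`, and unsubscribing on teardown is one thing to try. `SceneAnchorLister` is a hypothetical name; the event and component names mirror the question's own usage:

```csharp
using UnityEngine;

public class SceneAnchorLister : MonoBehaviour
{
    private OVRSceneManager _sceneManager;

    // Subscribing in Awake (before any Start method runs) reduces the chance
    // that OVRSceneManager finishes loading before this handler is attached.
    private void Awake()
    {
        _sceneManager = FindObjectOfType<OVRSceneManager>();
        if (_sceneManager != null)
        {
            _sceneManager.SceneModelLoadedSuccessfully += SceneModelLoaded;
        }
    }

    private void OnDestroy()
    {
        if (_sceneManager != null)
        {
            _sceneManager.SceneModelLoadedSuccessfully -= SceneModelLoaded;
        }
    }

    private void SceneModelLoaded()
    {
        // Iterate over the scene anchors once the model is available.
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            Debug.Log("Anchor: " + anchor.Uuid);
        }
    }
}
```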
Hide OVRScene object guardian outlines

I have an AR app where users interact with surfaces, but the outlines of OVRSceneAnchor objects are getting in the way. I don't want to have to ask users to enable developer mode and disable their guardians. Is there some way I can turn these guardian outlines off? The user already knows something is there because passthrough is enabled.
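If the outlines come from renderers on the instantiated scene-anchor prefabs (an assumption - the system Guardian boundary itself is controlled by the headset, not by app code), one thing to try is disabling those renderers after the scene loads. `HideAnchorVisuals` is a hypothetical helper, sketched for illustration:

```csharp
using UnityEngine;

public class OutlineHider : MonoBehaviour
{
    // Disables every renderer under the loaded scene anchors so only
    // passthrough remains visible. Call after the scene model has loaded;
    // colliders on the anchors stay active, so surface interaction still works.
    public void HideAnchorVisuals()
    {
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            foreach (var r in anchor.GetComponentsInChildren<Renderer>())
            {
                r.enabled = false;
            }
        }
    }
}
```

If the outlines persist with all renderers off, they are likely drawn by the system layer rather than by the app, in which case they cannot be removed from app code.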
MixedReality recentered pose problem

Hi, we are developing a mixed reality experience using the Scene Model to set up the user's room. We use the virtual model that the Scene Model instantiates to calculate the distance between the player and the room walls, and the player must stand in a position that satisfies specific distance rules. If the player is in an incorrect position, we want them to recenter. However, when the user uses the recenter option, our virtual model is updated and the virtual walls no longer match the position of the real walls. For instance, in the image above, the left side shows the player's initial position, represented by the circle with the arrow (the arrow is the direction the user is looking). The black rectangle represents the real room walls and the blue one the virtual room walls. As the image shows, when the user moves around the room and recenters, the virtual room walls update their transform, creating an undesirable situation. We want to know if there is a special configuration in the OVRSceneManager component, the OVRSceneAnchor component, or any other Oculus Unity SDK component to avoid this behaviour.
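One direction to explore (a sketch, not a confirmed configuration): detect when the tracking origin is recentered via `OVRManager.display.RecenteredPose` and react at that moment, e.g. re-run the distance checks or re-snap the virtual room to the tracked anchors. Whether scene anchors re-localize automatically after a recenter depends on the SDK version, so the handler body below is app-specific and hypothetical:

```csharp
using UnityEngine;

public class RecenterGuard : MonoBehaviour
{
    private void OnEnable()
    {
        if (OVRManager.display != null)
        {
            OVRManager.display.RecenteredPose += OnRecentered;
        }
    }

    private void OnDisable()
    {
        if (OVRManager.display != null)
        {
            OVRManager.display.RecenteredPose -= OnRecentered;
        }
    }

    private void OnRecentered()
    {
        // App-specific: re-validate the player's position against the distance
        // rules, or re-align the virtual room with the real walls here.
        Debug.Log("Tracking origin was recentered.");
    }
}
```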
Mixed reality only works in the first scene

When a mixed reality scene element (OVRSceneAnchor) is destroyed, it gets ignored for the whole game session. Opening mixed reality a second time, in another Unity scene, is impossible or very laborious. This is caused by recent changes to OVRSceneAnchor.cs in the Unity Integration SDK. The problem is reproducible on v48 and v49, but not on v43. As a workaround I removed this line (OVRSceneAnchor.cs, line 209):

```csharp
DestroyedSceneAnchors.Add(this.Uuid);
```

But maybe there is something I'm missing?
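If the behavior described above is accurate, an alternative that avoids patching SDK code (a sketch, not a confirmed fix) is to never destroy the anchors in the first place: deactivating the GameObject hides it without triggering the destroyed-anchor bookkeeping that blocks re-loading. `SceneAnchorCleanup` is a hypothetical helper:

```csharp
using UnityEngine;

public static class SceneAnchorCleanup
{
    // Deactivating instead of destroying keeps the anchor's UUID out of the
    // SDK's destroyed-anchor list, so the scene can load again in a later
    // Unity scene. The inactive objects persist until the scene unloads.
    public static void HideInsteadOfDestroy(OVRSceneAnchor anchor)
    {
        anchor.gameObject.SetActive(false);
    }
}
```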