Object placed above MRUK furniture "jumps" up/down when pushing right thumbstick
Context: Unity + Meta XR Building Blocks. I'm building an AR app (Passthrough + MR Utility Kit). I show a world-space UI dialog above an MRUK table (Canvas in world space, RectTransform placed at the table center plus a slight lift so it sits on the surface, all via script).

Symptom: Whenever I push the right controller thumbstick downward, the dialog appears to "jump" ~0.5 m up, and pushing again makes it jump back down. This happened both on the device and in the Simulator.

What it actually is: It's not the dialog moving. Logging showed Camera.main.transform.position.y toggling between two values (1.047 <-> 1.547), while the dialog's world Y stayed constant.
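Since the log shows the camera's Y toggling while the dialog stays put, the thumbstick is most likely driving a height-snap or locomotion feature that teleports the camera rig root, so the whole world appears to jump. The clean fix is to disable that input in the locomotion component the Building Blocks added; as a quick way to confirm and neutralize it, a hypothetical workaround script can pin the rig root's height (assumption: the jump moves the rig root, not the tracked eye anchors, so head tracking is unaffected):

```csharp
using UnityEngine;

// Hypothetical workaround: pins the camera rig root's Y so a thumbstick
// "height snap" can't shift the view vertically. Attach to the
// OVRCameraRig root. Head tracking moves the eye anchors underneath the
// rig, so it is unaffected by resetting the root.
public class LockRigHeight : MonoBehaviour
{
    private float initialY;

    void Start()
    {
        initialY = transform.position.y;
    }

    void LateUpdate()
    {
        Vector3 p = transform.position;
        if (!Mathf.Approximately(p.y, initialY))
        {
            // Reset only the rig root's Y; leave X/Z locomotion alone.
            transform.position = new Vector3(p.x, initialY, p.z);
        }
    }
}
```

If the Y stops toggling with this attached, that confirms the rig root is being moved by input rather than by tracking, and you can hunt down the responsible locomotion setting instead of keeping the workaround.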
FPS not getting more than 30

How can we increase the FPS of an Oculus Quest app? We are using the XR Interaction Toolkit. In an empty scene with just a camera I am getting 55 fps max, but in any real project I am getting just 30 fps. I have also applied the usual URP settings (you can see them in the attached photos) and enabled occlusion culling.
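Beyond URP settings, a few Meta XR SDK knobs are usually the first things to try: request a higher display refresh rate, enable fixed foveated rendering, and drop the eye-buffer resolution slightly. The exact property names below vary between SDK versions, so treat them as assumptions to verify against your installed OVRManager:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Starting-point sketch for Quest performance settings (Meta XR SDK;
// property names differ across SDK versions - verify before use).
public class QuestPerfSettings : MonoBehaviour
{
    void Start()
    {
        // Request a 72 Hz display refresh (Quest supports 72/80/90/120
        // depending on model and system settings).
        if (OVRManager.display != null)
            OVRManager.display.displayFrequency = 72f;

        // Fixed foveated rendering: shade the periphery at lower resolution.
        OVRManager.fixedFoveatedRenderingLevel = OVRManager.FixedFoveatedRenderingLevel.High;

        // Slightly lower eye-buffer resolution; often a large GPU win.
        XRSettings.eyeTextureResolutionScale = 0.9f;
    }
}
```

Also profile with OVR Metrics Tool (or the Profiler over ADB) to find out whether you are CPU- or GPU-bound before tuning; a flat 30 fps often means the compositor has halved your frame rate because you are just missing the 72 Hz deadline.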
How to geometrically align a 3D model to real furniture?

For my video-see-through AR application I want a model to be placed automatically on the biggest table surface found. I achieved the first step: the model is correctly positioned at the table center. However, the orientation is not correct. I want the model to "lie down" on the table, i.e. lying on its back, facing up, with the longest side of the model oriented along the longest side of the table - see the following pictures. After a very long time trying, I could not figure out how to align the model correctly. If you have any idea/hint/clue I could try, please let me know.

Used asset for testing: Low Poly Human. I added an Empty (called ModelPlacer) where I added the script (see below) and dragged this asset into the modelPrefab field.

Used Meta XR Building Blocks: Camera Rig, Passthrough, MR Utility Kit, Scene Debugger, Effect Mesh, Hand Tracking, Controller Tracking

Technical specifications: Unity 2022.3.50f1, VR glasses: Meta Quest 3, Meta MR Utility Kit

Code:

using Meta.XR.MRUtilityKit;
using UnityEngine;

public class ModelPlacer : MonoBehaviour
{
    public GameObject modelPrefab;
    private GameObject modelInstance;

    void Start()
    {
        MRUK.Instance?.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        SpawnModel();
        AlignModelWithSurface();
    }

    public void SpawnModel()
    {
        Vector3 spawnPosition = new Vector3(0.0f, 1.0f, -1.0f);
        modelInstance = Instantiate(modelPrefab, spawnPosition, Quaternion.identity);
    }

    public void AlignModelWithSurface()
    {
        var largestSurface = MRUK.Instance?.GetCurrentRoom()?.FindLargestSurface(MRUKAnchor.SceneLabels.TABLE);
        if (modelInstance == null)
        {
            Debug.LogWarning("modelInstance is null.");
            return;
        }
        if (largestSurface == null)
        {
            Debug.LogWarning("No surface found.");
            return;
        }

        modelInstance.transform.SetParent(largestSurface.transform);
        // Lay the model on its back, facing up.
        modelInstance.transform.rotation = Quaternion.Euler(-90, 0, 0);

        // Shift the model so its renderer-bounds center sits on the surface center.
        Renderer modelRenderer = modelInstance.GetComponent<Renderer>();
        Vector3 positionOffset = largestSurface.transform.position - modelRenderer.bounds.center;
        modelInstance.transform.position += positionOffset;
    }
}
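One way to get the longest-side alignment: compare the table plane's 2D extents with the model's footprint along the table's in-plane axes, and add a quarter-turn around the plane normal when they disagree. This sketch assumes MRUKAnchor.PlaneRect reports the surface extents in anchor-local X/Y with the plane normal along anchor-local Z (worth verifying against your MRUK version):

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch of a longest-side alignment step, meant to run after the model
// has been positioned on the table anchor.
public static class SurfaceAligner
{
    public static void AlignLongestSides(Transform model, MRUKAnchor surface)
    {
        if (!surface.PlaneRect.HasValue) return;
        Rect rect = surface.PlaneRect.Value;
        bool tableLongAlongX = rect.width >= rect.height;

        // Measure the model's extent along the table's two in-plane axes.
        Bounds b = model.GetComponentInChildren<Renderer>().bounds;
        float alongX = AbsExtent(b, surface.transform.right);
        float alongY = AbsExtent(b, surface.transform.up);
        bool modelLongAlongX = alongX >= alongY;

        if (modelLongAlongX != tableLongAlongX)
        {
            // Quarter-turn around the plane normal to swap the axes.
            model.RotateAround(b.center, surface.transform.forward, 90f);
        }
    }

    // Projects the world-axis-aligned bounds onto an arbitrary axis.
    static float AbsExtent(Bounds b, Vector3 axis)
    {
        Vector3 e = b.extents;
        return 2f * (Mathf.Abs(axis.x) * e.x + Mathf.Abs(axis.y) * e.y + Mathf.Abs(axis.z) * e.z);
    }
}
```

Note that renderer bounds are world-axis-aligned, so measure them after the "lie on back" rotation has been applied; otherwise the footprint comparison uses the wrong axes.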
Reload Scene Mesh after Application Start

Hello everyone, my problem is that the room mesh doesn't update. I want the room mesh to be refreshed every time my application starts, but the mesh loaded the first time is always reused, even when I update the application.

How I update: I build an AR app (.apk file) with Unity on my laptop and deploy it to my Meta Quest 3 headset, which overwrites the previous .apk.

What I see: a flickering mesh all over the place of the "original room"; even when I go to a neighbouring room, it shows the mesh of the first room (it's like looking through a wall).

I followed the setup instructions, and my "Hello World" and first tries worked properly. For my current version I am using the following Building Blocks: Camera Rig, Passthrough, Scene Mesh, MR Utility Kit, Scene Debugger.

As you can guess, I am quite new to Meta Horizon development. I've been stuck on this problem for quite a while now. Any tips/advice/resources to resolve this are highly appreciated.
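Two things are worth checking here. First, if the headset's own scene capture is stale, no app-side code helps: rerun Space Setup in the device settings for the room you are in. Second, make sure the app queries fresh scene data on each launch instead of a cached or prefab room. The method names below exist in recent MRUK releases but vary between versions, so treat them as assumptions:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch: force a fresh scene load on each launch. Also check the MRUK
// component's scene settings in the Inspector - if it is configured to
// load a prefab/JSON room instead of device data, you will always see
// the same room regardless of where you are.
public class SceneReloader : MonoBehaviour
{
    async void Start()
    {
        if (MRUK.Instance == null) return;

        MRUK.Instance.ClearScene();                              // drop any cached room
        var result = await MRUK.Instance.LoadSceneFromDevice();  // query the OS for current scene data
        Debug.Log($"Scene load result: {result}");
    }
}
```

The "looking through a wall into the first room" symptom usually means the anchors from the old capture are still being world-locked into the new space, which points at stale device-side scene data rather than a Unity build problem.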
Using the Quest 3 passthrough for Augmented Reality on an architectural project

Hi, we're on the verge of buying a Quest 3 for a specific VR/AR need. We've designed and completed a physical two-storey hotel suite, but the spiral staircase connecting the two floors has been delayed. We have a fully modelled and textured 3D model of the staircase and were hoping to use the Quest 3's passthrough to drop the staircase into position in the existing hotel room, letting potential customers see it while wearing the headset. The problem is there's little to no support on where to even begin with this, so can anyone advise on where to start? Is it possible? Many thanks
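This is very doable. The usual starting point in Unity is the Meta XR SDK: an OVRCameraRig with a passthrough layer enabled (done in the scene setup, not in code), your staircase as a prefab, and an OVRSpatialAnchor so the model keeps its real-world position between sessions. A minimal placement sketch, assuming the controller anchor is assigned from the rig:

```csharp
using UnityEngine;

// Minimal placement sketch (Meta XR Core SDK assumed): instantiate the
// staircase model at the right controller's position on trigger press,
// then pin it with a spatial anchor so it stays world-locked.
public class StaircasePlacer : MonoBehaviour
{
    public GameObject staircasePrefab;
    public Transform rightControllerAnchor; // e.g. OVRCameraRig's RightHandAnchor

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            var staircase = Instantiate(staircasePrefab,
                rightControllerAnchor.position, Quaternion.identity);
            staircase.AddComponent<OVRSpatialAnchor>(); // world-lock it
        }
    }
}
```

For a client demo you would likely add a grab-and-rotate adjustment step and save/load the anchor so the staircase reappears in the same spot on the next visit.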
Camera motion in passthrough

Hi! I am working on an app that rotates or translates the field of view during passthrough at different rates. I was wondering whether there is a way to have the cameras on the Meta Quest 2, 3, or Pro track a virtual object that moves back and forth. I want the cameras to move back and forth independently of the user's head motion. I can clip some peripheral vision to make this possible as well. Does anybody have a recommendation on how to do this, or think it is even possible with the current plugins? Thanks!
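The physical passthrough cameras cannot be moved or re-aimed: they are fixed to the headset and their feed is composited head-locked. What you can move is the rendered viewpoint of virtual content, by offsetting the OVRCameraRig's tracking space. A sketch of that approach:

```csharp
using UnityEngine;

// Sketch: oscillate the viewpoint of virtual content independently of
// head motion by moving the camera rig's tracking space. Note this
// shifts rendered (virtual) content only - the passthrough image itself
// stays locked to the real cameras on the headset.
public class ViewOscillator : MonoBehaviour
{
    public Transform trackingSpace; // assign the OVRCameraRig's TrackingSpace child
    public float amplitude = 0.2f;  // metres
    public float frequency = 0.5f;  // Hz

    private Vector3 basePosition;

    void Start() => basePosition = trackingSpace.localPosition;

    void Update()
    {
        float offset = amplitude * Mathf.Sin(2f * Mathf.PI * frequency * Time.time);
        trackingSpace.localPosition = basePosition + new Vector3(offset, 0f, 0f);
    }
}
```

Be aware that view motion decoupled from head motion is a classic trigger for simulator sickness, so keep amplitudes and rates small and consider the peripheral clipping you mentioned.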
How to develop an AR app for Meta Quest for object recognition

Hello, I am developing an AR app for Meta Quest Pro and Meta Quest 3. I already have an app that works on an Android smartphone, but the experience is not so good. In that app I am using TensorFlow on Android, recognising objects at runtime and overlaying icons based on the recognised object. I want to know: (1) Is there anything similar for Meta Quest, like object recognition with icon overlays? (2) Do I have to develop in Android only? (3) Is there any Unity framework to do this? Looking for some good developer advice. Regards.
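On the framework side, Unity can run ONNX models on-device with Barracuda (its successor is Sentis), so you do not need to drop down to raw Android. The catch on Quest has historically been getting camera pixels at all: raw camera access was not exposed to apps, so this only becomes useful once you have frames (e.g. via the newer Passthrough Camera API on recent OS versions, or an external image source). A sketch, with the caveat that Barracuda/Sentis API names vary across versions:

```csharp
using Unity.Barracuda;
using UnityEngine;

// Sketch: run an ONNX classifier on a texture inside Unity with
// Barracuda. Assumes you already have image pixels as a Texture2D.
public class ObjectClassifier : MonoBehaviour
{
    public NNModel modelAsset;  // ONNX model imported into Unity
    private IWorker worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, model);
    }

    public int Classify(Texture2D image)
    {
        using var input = new Tensor(image, channels: 3);
        worker.Execute(input);
        var output = worker.PeekOutput();  // owned by the worker; do not dispose
        return output.ArgMax()[0];         // index of the top class
    }

    void OnDestroy() => worker?.Dispose();
}
```

For the icon overlay, the Quest side is actually easier than the phone: place a world-space quad or Canvas at the recognised object's position instead of compositing over a camera preview.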
Any future for mixed reality applications on Quest Pro that track real-world objects?

After purchasing a Quest Pro because I wanted to explore mixed reality applications that track real-world objects, I found out that it is not possible yet, because applications are not allowed to use camera data due to 'privacy concerns'. You can't even track AR tags. Since this is a mixed reality headset, this would seem to be essential functionality and a priority. People use camera data all the time on phones to create mixed and augmented reality applications. Is this something they will eventually release an SDK for (like AR Foundation), or should I just return it? Attaching a separate webcam to my headset is not an option.
Disabling Oculus tracking/rotation

Hey, I'm aware that this question has been asked quite a lot (https://forums.oculus.com/community/discussion/40288/is-it-possible-to-disable-all-tracking, https://forums.oculus.com/community/discussion/comment/242286), but everything I've found has been for Unity, and I'm working with the C++ API (using a CV1, newest SDK). Ideally I'd like some function (like an ovr_DisableTracking) that I could call on initialization to stop the Oculus from responding to any gyroscope, accelerometer, etc. readings. Basically I want to turn the Oculus into a head-mounted dumb screen. I've tried a separate thread that calls ovr_SubmitFrame as frequently as possible, but the occasional frame drop or context switch makes the image occasionally get out of sync with the actual HMD orientation, leading to black bars. Plus I'd prefer to drop an unnecessary thread. The reason I want this is that I'll have a camera mounted directly on the front of the Oculus piping into the screen, making any orientation readings redundant, since the camera moves with the user's head anyway. Any recommendations or code samples would be fantastic. Thanks.
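Rather than disabling tracking, the supported route in LibOVR is to submit a head-locked layer: with ovrLayerFlag_HeadLocked set, the layer's pose is interpreted relative to the HMD, so the compositor stops reprojecting it against head motion. The image rides with the head like a fixed screen, and the timewarp-induced black bars disappear without any extra resubmission thread. A sketch (quad size and distance are arbitrary illustration values):

```cpp
#include <OVR_CAPI.h>

// Sketch: submit the camera image as a head-locked quad so it stays
// fixed on the display regardless of HMD orientation.
void SubmitHeadLockedFrame(ovrSession session,
                           ovrTextureSwapChain chain,
                           ovrSizei size,
                           long long frameIndex)
{
    ovrLayerQuad layer = {};
    layer.Header.Type  = ovrLayerType_Quad;
    layer.Header.Flags = ovrLayerFlag_HeadLocked;   // pose is head-relative
    layer.ColorTexture = chain;
    layer.Viewport     = { { 0, 0 }, size };
    // Place the quad 1 m in front of the eyes, 1.6 m x 0.9 m.
    layer.QuadPoseCenter.Position    = { 0.0f, 0.0f, -1.0f };
    layer.QuadPoseCenter.Orientation = { 0.0f, 0.0f, 0.0f, 1.0f };
    layer.QuadSize = { 1.6f, 0.9f };

    ovrLayerHeader* layers = &layer.Header;
    ovr_SubmitFrame(session, frameIndex, nullptr, &layers, 1);
}
```

A quad layer gives you mono output; if you need per-eye camera images you would submit two quads (or an eye-fov layer with the same flag), but the head-locked flag is the key piece either way.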
Reducing delays: AR use case

I've seen a number of posts related to using the Rift for augmented-reality projects, and I have an issue in this area as well. I'm concerned about end-to-end latency: from the moment a "photon" hits the video camera until it is presented to the user in the Rift. In my measurements I see delays of 100-120 ms, which is far longer than acceptable. When I present the same camera image on a monitor, I measure ~50 ms delay (this makes sense for the time it takes to grab an image, transfer it to the PC over USB, and process it), but I'm trying to reduce the extra 50-70 milliseconds being added by Oculus. Any ideas? Is there a way to reduce the length of the swap chain created by ovr_CreateTextureSwapChainDX to 1 instead of the default 3? Maybe that could help here.
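The swap-chain length is not configurable through ovrTextureSwapChainDesc (you can only query it with ovr_GetTextureSwapChainLength), but chain length itself is not the main cost: what matters is how old the camera frame is at the moment you commit it. Uploading the newest frame as late as possible before submit, instead of early in the frame loop, can recover a large share of the added latency. A sketch of that late-latch pattern, with a hypothetical ~2 ms budget for the copy and commit:

```cpp
#include <OVR_CAPI.h>

// Sketch: commit the newest camera frame as close to the compositor's
// deadline as possible, instead of at the start of the frame.
void LateLatchCameraFrame(ovrSession session,
                          ovrTextureSwapChain chain,
                          long long frameIndex)
{
    int length = 0;
    ovr_GetTextureSwapChainLength(session, chain, &length); // typically 3

    // Wait until shortly before the predicted display time.
    double displayTime = ovr_GetPredictedDisplayTime(session, frameIndex);
    while (displayTime - ovr_GetTimeInSeconds() > 0.002) { /* spin or yield */ }

    // ... copy the latest camera frame into the chain's current texture ...
    ovr_CommitTextureSwapChain(session, chain);
    // ... then build layers and ovr_SubmitFrame(...) as usual ...
}
```

It is also worth measuring whether your ~50 ms baseline already includes a full camera exposure + USB transfer of a stale buffered frame; dropping queued frames on the capture side (always grabbing the newest, discarding the rest) often stacks with the late-latch approach.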