Using the Quest 3 passthrough for Augmented Reality on an architectural project
Hi, we're on the verge of buying a Quest 3 for a specific VR/AR need. We've designed and completed a physical two-storey hotel suite, but the spiral staircase connecting the two floors has been delayed. We have a fully modelled and textured 3D model of the staircase, and we were hoping to use the passthrough capability of the Quest 3 to drop the staircase into position in the existing hotel room, so potential customers can see it while wearing the headset. The problem is there's little to no guidance on where to even begin, so can anyone advise? Is it possible? Many thanks.
Quest 3: how to match virtual and physical room orientation properly (AR/XR)?

Hello, I currently use a Quest 3 with UE5 via SteamVR (new to Quest development in UE), using Steam Link to stream the picture to the headset. I have never packaged an APK; I simply start the game from the UE editor in VR mode. I noticed that every time I start/stop the game in the editor, the orientation (yaw) of my virtual space changes somewhat randomly, probably depending on the initial headset position when the game starts. I want to place a virtual object in my 3D scene and have it correspond to a fixed real-world location permanently, even after shutting down Unreal and the headset and restarting. Think of an AR way to place a virtual object at a specific position in your room. I found AR Pins but couldn't get them to run (at least not when launching from the UE editor in VR mode; they seem overpowered for my case anyway). In general, I wonder why it is so hard to match virtual orientation to real-world orientation. The Guardian/chaperone boundary always matches the room perfectly, even after turning the headset off, so the headset must be aware of the physical room's positions and orientation. Why is it such a hassle to match it in Unreal? I'd be glad if someone could shed some light. Thank you 🙂
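The core of the yaw problem above is engine-independent: if you can store the direction of one fixed physical reference (e.g. a wall normal or a controller pose saved during a calibration step), each new session only needs a single yaw correction to re-align the whole virtual space. A minimal, engine-agnostic TypeScript sketch of that math (all names are invented for illustration; this is not UE5 or Meta SDK code):

```typescript
// Illustrative only: the math behind re-aligning a virtual scene's yaw
// to a fixed physical reference direction, independent of any engine API.
type Vec2 = { x: number; z: number };

// Yaw (radians) of a horizontal forward vector, measured around the up axis.
function yawOf(v: Vec2): number {
  return Math.atan2(v.x, v.z);
}

// Correction that rotates the virtual space so its stored reference
// direction lines up with the direction measured in the current session.
function yawCorrection(storedRef: Vec2, measuredRef: Vec2): number {
  return yawOf(measuredRef) - yawOf(storedRef);
}

// Rotate a point around the origin by the correction angle.
function applyYaw(p: Vec2, yaw: number): Vec2 {
  const c = Math.cos(yaw), s = Math.sin(yaw);
  return { x: c * p.x + s * p.z, z: -s * p.x + c * p.z };
}
```

In practice the stored reference would come from a persisted anchor or a one-time calibration gesture; the correction is then applied once to the scene root at startup.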
[Feature Request] Visual acuity filter for Meta Quest 3?

Hello everyone, and especially the people in charge of the technical development of the Quest 3. We're developing a non-commercial application for visually impaired people. The goal is to make the image shown in the Quest 3 match the real image a visually impaired person sees. Through this application, that person can show their employer, and let them experience, what their real visual limitations are. The app is almost done, but there is one big hurdle we can't cross: we can't figure out how to create a visual acuity filter for the Quest 3, one that is adjustable to the degree of impairment of the person in question. We don't have access to camera software, nor the possibility to do it in post-processing. We desperately need some help and advice. This application has a noble purpose; your help will benefit thousands of visually impaired people around the world! Many thanks in advance, Bart. 🙏
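Conceptually, reduced visual acuity is usually simulated as a blur whose radius is tuned per person. A minimal TypeScript sketch of the idea, using a 1D box blur as a stand-in for the full-screen 2D post-processing effect (this is illustrative only, not the Quest rendering pipeline):

```typescript
// Illustrative sketch: simulate reduced acuity by blurring a signal.
// `radius` is the adjustable "severity" parameter; radius 0 = normal vision.
// A real implementation would apply a 2D blur in a fragment shader.
function boxBlur1D(pixels: number[], radius: number): number[] {
  const out: number[] = [];
  for (let i = 0; i < pixels.length; i++) {
    let sum = 0, count = 0;
    // Average each pixel with its neighbours within `radius`.
    for (let j = i - radius; j <= i + radius; j++) {
      if (j >= 0 && j < pixels.length) { sum += pixels[j]; count++; }
    }
    out.push(sum / count);
  }
  return out;
}
```

On-device, the same per-person parameter would drive the kernel size of a shader-based blur applied to the eye buffers, since the passthrough camera feed itself is not accessible.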
WebXR problem: Oculus Quest 2 detects VR project as AR project

I am using WebXR to develop a VR experience that lets the user walk into a room and interact with it. However, the headset detects the application as if it were augmented reality, so the Quest 2 switches to AR mode and the application becomes unusable. The project is developed in Unity 2021.3.2, using the Simple-WebXR package for web deployment along with the Mixed Reality Toolkit (MRTK). It seems to be a Quest 2 problem, because it also affects similar WebXR projects I developed earlier: older projects that worked correctly now break too. (Screenshots attached: how it should look in VR, and the AR mode error.) My Quest 2 version is 47421700667400000. Is there a VR/AR mode switch I'm missing? Is it related to the most recent update, or to the headset version I have? I need urgent help with this.
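In WebXR, whether the headset runs a page in VR or AR is decided by the `XRSessionMode` string the page requests: `'immersive-vr'` renders a fully virtual scene, while `'immersive-ar'` composites over passthrough. A sketch of the mode-selection logic as a plain, testable helper (the function name is invented; in a real page `supported` would be filled from `navigator.xr.isSessionSupported()` checks before calling `navigator.xr.requestSession(mode)`):

```typescript
// Illustrative helper: choose which XRSessionMode to request.
// A VR app should explicitly prefer 'immersive-vr'; if the wrapper library
// requests 'immersive-ar' instead, Quest headsets switch to passthrough/AR.
function pickSessionMode(supported: string[], preferVR = true): string | null {
  const order = preferVR
    ? ["immersive-vr", "immersive-ar", "inline"]
    : ["immersive-ar", "immersive-vr", "inline"];
  for (const mode of order) {
    if (supported.includes(mode)) return mode;
  }
  return null; // no XR session available
}
```

If a wrapper like Simple-WebXR chooses the mode for you, it is worth checking which mode it actually passes to `requestSession`; a library default of `'immersive-ar'` would produce exactly the symptom described above.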
I am having limitations with the Meta Depth API shader for AR

If I want to use any shader effect on an object in AR, I can't use the Depth API occlusion shader at the same time, and then you see the objects through my arm. What is the best way to combine two custom shaders? I want to use another shader that creates an invisible material that still receives shadows, and I will probably need to combine multiple shaders with the occlusion shader in the long run. Here is the link to the depth shader: https://github.com/oculus-samples/Unity-DepthAPI/tree/main
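At its core, the occlusion shader performs one extra step per pixel that can be folded into any other shader: compare the virtual fragment's depth against the environment depth from the sensor and hide (or fade) the fragment when the real world is closer. A CPU-side TypeScript model of that test (illustrative only; the real thing runs in a fragment shader, and the function name is invented):

```typescript
// Illustrative per-pixel occlusion test, modelled on what a Depth API
// occlusion shader does: return the fragment's alpha given its depth and
// the real-world depth at the same pixel (both in metres).
// `softness` > 0 fades across the boundary instead of hard-clipping.
function occlusionAlpha(virtualDepth: number, envDepth: number, softness = 0.0): number {
  if (softness <= 0) {
    return virtualDepth <= envDepth ? 1 : 0; // hard occlusion
  }
  // Linear fade over `softness` metres around the depth crossing.
  const t = (envDepth - virtualDepth) / softness + 0.5;
  return Math.min(1, Math.max(0, t));
}
```

Combining this with another effect (e.g. an invisible shadow-receiving material) then amounts to multiplying that material's output alpha by this occlusion factor in the same shader pass, rather than trying to run two separate shaders on one object.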
"Space Setup" issues with v57 Quest, developing with Unity OpenXR AR Foundation

Hi! I'm going to outline my issue with the new Space Setup in my AR project. I noticed that neither the v57 Meta Quest update blog post nor the release notes mentioned that "Room Setup" was moved from being an experimental feature to a supported one called "Space Setup". Dilmer's video (posted July 13th, so pre-v56) shows Room Setup as an experimental feature; the v56 notes likewise say nothing about the feature leaving experimental status, yet it has changed since then. My Unity AR passthrough project uses AR Foundation's plane detection and the data from Space Setup to place objects on top of the generated planes, and I have an example video of it previously working. After the v57 update, plane detection in my project stopped working, I assume because the app no longer receives Space Setup data due to blocked permissions or an improper setup. I reconfigured my Space Setup, cleared it and configured it again, and restarted my Quest Pro and computer, to no avail. In Settings I was able to grant Spatial Data permission to a previous Unity OpenXR project, but there was no option for my current one, which I suspect is the crux of the issue: previously I didn't have to explicitly grant my Unity build permission to use the data. My current project isn't showing up as an option to select (so I'm a bit stuck there), and I don't see a record of this change or of others with this issue, so I'm unsure how to solve it. Thank you for reading my long-winded essay, haha. I would appreciate any help. Let me know if I have overlooked something, maybe another permissions setting, etc.
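For reference, recent Meta SDK versions gate Scene/Space Setup data behind an Android runtime permission, and an app only appears in the Spatial Data permission list if its manifest declares that permission. Assuming that is what changed here (worth verifying against the current Meta documentation), the manifest entry looks roughly like this:

```xml
<!-- Declares use of Scene/Space Setup data; without this the app never
     appears in Settings > Privacy > Spatial Data. Permission name as used
     by recent Meta SDKs - confirm against the current docs. -->
<uses-permission android:name="com.oculus.permission.USE_SCENE" />
```

The app must then also request the permission at runtime (in Unity, via `Permission.RequestUserPermission` with the same string) before querying scene data.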
Occlude virtual objects with the real world (Meta Quest Pro depth sensor)

Hello, is it possible in Unity with the Meta Quest Pro to occlude virtual objects with the real world, using passthrough? I'm using the Scene Model feature of the Quest Pro, but I'm having an issue: I place a plane over every wall, yet virtual objects are not occluded by the real world, so I can't see the real world and I lose all the immersion. Is it possible to occlude a virtual object placed behind a real object, so that my virtual objects always appear behind it? I want to virtually reconstruct the walls of my room, all with passthrough active.
Announcing the XR Accessibility Project on GitHub

Hi Oculus developers. The XR Access Initiative and XR Association have created a new resource for XR developers: the XR Accessibility Project! This GitHub repository is intended as a place for developers to find helpful guidelines and code snippets for creating accessible XR applications. We're hoping that the XR Accessibility Project will grow over time as more developers use it and submit the resources they rely on. We encourage you to take a look, and if you think of something that would be useful for other developers, submit it using the contribution form. You can also submit other comments by creating an issue on GitHub. We know it's hard enough to develop for XR platforms, let alone to do so accessibly; but we hope this resource will help make sure everyone, regardless of ability, can enjoy the great applications you're making. Best, Dylan Fox, Coordination and Engagement Team Lead, XR Access
Oculus Integration v31.2 passthrough works, broken when I upgrade to v33?

Hi! My program, Custom Home Mapper, uses passthrough features. I released an update when v31.2 was available and integrated AR features into a dozen minigames, which really opened up the usability and functionality of my app. I'm worried because passthrough ONLY works with Unity Oculus Integration v31.2. I had to jump through a lot of hoops to get it working, but I did it (yay?). Only now, with the rollout of firmware v34, nobody can use the passthrough features I wrote against v31.2 (boo). I know the SDK for v34 isn't out yet (tomorrow, right?), but does anyone know what I need to change to make passthrough work on v32 or v33? We still require the OpenXR backend and IL2CPP, correct? I don't think there are many devs doing big passthrough projects like mine, so I'm kinda stuck trying to find resources and info about how this SDK works. It makes me nervous that I can't run things on the newer integration packages: no errors, the passthrough just doesn't show. Curious to hear from devs running passthrough on Oculus Integration v32 or v33.
AR Passthrough sample scenes in Unity

Hey, the new sample scenes with passthrough AR do not seem to work on my Quest 2. I've built them from a clean project on Unity 2020.3.4f1, but I just can't get the passthrough overlay/underlay in my build. Any help appreciated. Did y'all manage to run this on your Quest 2? There's not much info on the web right now on this topic, because the update with the AR passthrough API came out two days ago and is available on the Unity Asset Store through the Oculus Integration pack.