Quest 3: how to match virtual and physical room orientation properly (AR/XR)?
Hello, I use a Quest 3 with UE5 via SteamVR (new to Quest development in UE), using Steam Link to stream the picture to the Quest 3. I have never packaged an APK; I simply start the game from the UE editor in VR mode. I noticed that every time I start/stop the game in the editor, the orientation (yaw) of my virtual space in Unreal changes more or less randomly... probably depending on the initial headset position when I start the game. I want to place a virtual object in my 3D scene and have it correspond to the same real-world location permanently - even after I shut down Unreal and the Quest headset and restart. Think of an AR way to place a virtual object in your room at a specific position. I already found ARPins, but couldn't get them to run (at least not when starting the game from the UE editor in VR mode - they seem to be overpowered for my case anyway). Generally I wonder why it is so hard to match virtual orientation to real-world orientation. The Guardian/Chaperone always matches the room perfectly - even after turning the headset off - so the headset must be aware of the physical room's positions and orientation. Why is it such a hassle to match it in Unreal? Would be glad if someone could shed some light 🙂 thank you 🙂
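One low-tech workaround, sketched below, sidesteps ARPins entirely: record the HMD's world pose once while the headset rests at a fixed physical landmark (say, a shelf corner), then on each launch put the headset back on that landmark and rotate/translate the VR pawn until the poses coincide again. This is a minimal UE C++ sketch, not an official API; `AMyVRPawn` and the `VRCalibration` config keys are made-up names, and it assumes the pawn's root component is the VR tracking origin.

```cpp
#include "GameFramework/Pawn.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Misc/ConfigCacheIni.h"

void AMyVRPawn::SaveLandmarkPose()
{
    // Call once while the headset rests at the physical landmark, after the
    // scene has been lined up with the room by hand.
    FRotator HmdRot; FVector HmdPos;
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HmdRot, HmdPos);

    // HMD pose in world space = tracking-space pose composed with the pawn.
    const FTransform HmdWorld = FTransform(HmdRot, HmdPos) * GetActorTransform();
    GConfig->SetString(TEXT("VRCalibration"), TEXT("LandmarkPose"),
                       *HmdWorld.ToString(), GGameIni);
    GConfig->Flush(false, GGameIni);
}

void AMyVRPawn::RestoreLandmarkPose()
{
    // Call on BeginPlay while the headset sits at the same landmark again.
    FString Saved;
    if (!GConfig->GetString(TEXT("VRCalibration"), TEXT("LandmarkPose"), Saved, GGameIni))
    {
        return; // nothing saved yet
    }
    FTransform SavedHmdWorld;
    SavedHmdWorld.InitFromString(Saved);

    FRotator HmdRot; FVector HmdPos;
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HmdRot, HmdPos);
    const FTransform HmdLocal(HmdRot, HmdPos);

    // Yaw-only correction: rotate the pawn so the current HMD pose lines up
    // with the saved one, then fix the remaining horizontal offset.
    const FTransform HmdWorld = HmdLocal * GetActorTransform();
    const float DeltaYaw = SavedHmdWorld.Rotator().Yaw - HmdWorld.Rotator().Yaw;
    AddActorWorldRotation(FRotator(0.f, DeltaYaw, 0.f));

    FVector DeltaPos = SavedHmdWorld.GetLocation()
                     - (HmdLocal * GetActorTransform()).GetLocation();
    DeltaPos.Z = 0.f; // keep the floor height set by the tracking origin
    AddActorWorldOffset(DeltaPos);
}
```

The yaw-only correction is deliberate: Quest tracking is gravity-aligned, so pitch and roll stay correct between sessions - only the heading and horizontal offset drift.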
[Feature Request] Visual acuity filter for Meta Quest 3?

Hello everyone, and especially the people in charge of the technical development of the Quest 3. We're developing a non-commercial application for visually impaired people. The goal is to make the image shown in the Quest 3 match what the visually impaired person actually sees, so that through this application the person can show their employer, and let them experience, what their real visual limitations are. The app is almost done, but there is one big hurdle we can't cross: we can't figure out how to create a visual acuity filter for the Quest 3, one that is adjustable to the degree of impairment of the person in question. We don't have access to the camera software, nor the possibility to do it in post-processing. We desperately need some help and advice. This application has a noble purpose; your help will benefit thousands of visually impaired people around the world!!! Many thanks in advance, Bart. 🙏
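If the app renders its own 3D content (the poster notes the passthrough camera feed isn't accessible), the acuity filter itself reduces to a Gaussian blur whose width tracks the simulated acuity. Below is a rough sketch of that mapping in C++ - the 0.5 tuning factor and the function names are assumptions for illustration, not any Quest API:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Convert a Snellen acuity fraction to a blur sigma in pixels,
// e.g. AcuityToSigma(20.0 / 200.0, pixelsPerDegree) for 20/200 vision.
double AcuityToSigma(double snellenFraction, double pixelsPerDegree)
{
    // 20/20 vision resolves roughly 1 arcminute; worse acuity scales linearly.
    const double arcMinutes = 1.0 / snellenFraction;  // 20/200 -> 10 arcmin
    const double degrees    = arcMinutes / 60.0;
    return degrees * pixelsPerDegree * 0.5;           // 0.5: assumed tuning factor
}

// 1-D kernel for a separable blur (apply horizontally, then vertically).
std::vector<double> GaussianKernel(double sigma)
{
    sigma = std::max(sigma, 0.1);                     // guard against zero width
    const int radius = static_cast<int>(std::ceil(3.0 * sigma));
    std::vector<double> kernel(2 * radius + 1);
    double sum = 0.0;
    for (int i = -radius; i <= radius; ++i)
    {
        kernel[i + radius] = std::exp(-(i * i) / (2.0 * sigma * sigma));
        sum += kernel[i + radius];
    }
    for (double& w : kernel)
        w /= sum;                                     // preserve overall brightness
    return kernel;
}
```

Wiring an in-app slider to `snellenFraction` would give the per-person adjustability the post asks for.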
Oculus Integration 31.2 passthrough works, broken when I upgrade to v33?

Hi! My program, Custom Home Mapper, uses passthrough features. I released an update when v31.2 was available and integrated AR features into a dozen minigames, which really opened up the usability and functionality of my app. I'm worried, because passthrough ONLY works with Unity Oculus Integration v31.2. I had to jump through a lot of hoops to get it working, but I did it (yay?). Only now, with the rollout of firmware v34, nobody can use the passthrough features I wrote against v31.2 (boo). I know the SDK for v34 isn't out yet (tomorrow, right?), but does anyone know what I need to change to make passthrough work on v32 or v33? We still require the OpenXR backend and IL2CPP, correct? I don't think there are many devs doing big passthrough projects like me, so I'm kind of stuck trying to find resources and info about how this SDK works. It makes me nervous that I can't run things on the newer Integration packages - no errors, just passthrough doesn't show. Curious to hear from devs running passthrough on Oculus Integration v32 or v33.
Getting raw depth data from Quest 2

Hi there! I'm working on an AR project and I'm trying to retrieve the depth data from the sensor (the dots updated on the surfaces of objects inside a Guardian). Where can I access that? I'm using Unreal Engine for development on Quest 2. Big thanks to anyone who can answer this question!