Quest 3: how to match virtual and physical room orientation properly (AR/XR)?

Shenksu
Explorer

Hello,

I'm currently using a Quest 3 with UE5 via SteamVR (new to Quest dev in UE). I use Steam Link to stream the picture to the Quest 3. I've never packaged an APK; I simply start the game from the UE editor in VR mode.

I've noticed that every time I start/stop the game in the editor, the orientation (yaw) of my virtual space in Unreal changes seemingly at random... it probably depends on the initial headset pose when I start the game.

I want to place a virtual object in my 3D scene and have it correspond to the same real-world location permanently, even after I shut down Unreal and the Quest headset and restart. Think of it as an AR way to pin a virtual object to a specific position in your room.

I already found AR Pins, but couldn't get them to run (at least not when starting the game from the UE editor in VR mode; they seem overpowered for my case anyway). Generally, I wonder why it is so hard to match virtual orientation to real-world orientation. The Guardian/chaperone boundary always matches the room perfectly, even after turning the headset off and on again. So the headset must be aware of the physical room's positions and orientation. Why is it such a hassle to match it in Unreal? I'd be glad if someone could shed some light 🙂

Thank you 🙂


2 REPLIES

Shenksu
Explorer
I finally got it to work. I'll sum it up quickly for others because it was quite a hassle for me, especially as someone new to this. The following describes how I got spatial anchors running over Air Link directly in the UE 5.3.2 editor. And since we are using the Meta XR plugin directly in UE, we can also read all points from the Guardian using the "Get Guardian Points" node. So spatial anchors seem to be the way to go, but in theory one could also use the Guardian to fix a virtual object in real space :). The Meta XR plugin seems to be the way to go for Quest dev <3, so much more convenient than OpenXR! Here is how to do it (a few code sketches follow after the steps):
 
 
1. I installed the Meta XR plugin for UE 5.3.2 from the Meta website (it can also be installed via the Epic Marketplace; search for "MetaXR"). You may want to follow the short install instructions on the Meta page: https://developer.oculus.com/downloads/package/unreal-engine-5-integration/60.0
 
2. Set up a developer account at Meta, then put the Quest 3 into developer mode (as people do when sideloading things over adb).
 
3. Log out of the Oculus software on the PC and log back in (I needed to do this so it would recognize that I'm a dev now).
 
4. Within the Oculus software on the PC: switch the "OpenXR Runtime" setting to use the Oculus runtime.
 
5. Also in the Oculus software: go to "Settings" -> "Beta" and switch on the runtime features for developers. I checked "Passthrough over Oculus Link" and "Point Cloud over Oculus Link" (I'm not sure of the exact English labels, since I use the app in German). These options only appear once the software recognizes that you are a dev, which is why I logged out and back in.
 
6. In Unreal, I disabled OpenXR, restarted, and enabled Meta XR.
 
7. In Unreal, under "Project Settings" -> "Plugins" -> "Meta XR", I enabled "Anchor Support" and "Anchor Sharing".
 
And it works like a charm - also when running the VR project in the UE editor with the Quest 3 connected via Air Link... so no need to build an APK, yay! 🙂
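If you prefer C++ over Blueprint, the "Get Guardian Points" node maps to a function-library call. The identifiers below follow the v60-era OculusXR (Meta XR) plugin and are an assumption on my part; check OculusXRFunctionLibrary.h in your plugin version:

```cpp
// Sketch: the "Get Guardian Points" Blueprint node from C++.
// Class/enum names are assumptions based on the v60-era OculusXR plugin.
#include "CoreMinimal.h"
#include "OculusXRFunctionLibrary.h"

void DumpGuardianOutline()
{
    // Returns the boundary polygon the user drew, in world space --
    // the same data the "Get Guardian Points" node exposes in Blueprint.
    const TArray<FVector> Points = UOculusXRFunctionLibrary::GetGuardianPoints(
        EOculusXRBoundaryType::Boundary_PlayArea, /*UsePawnSpace=*/false);

    for (const FVector& P : Points)
    {
        UE_LOG(LogTemp, Log, TEXT("Guardian point: %s"), *P.ToString());
    }
}
```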
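For the object-pinning goal from the original question, the logic that step 7 unlocks looks roughly like this. I'm hedging heavily here: `CreateSpatialAnchorAt` and `SaveAnchor` are hypothetical stand-ins for the plugin's async "Create Spatial Anchor" / "Save Anchor" calls (easiest to drive from Blueprint); the part actually being illustrated is persisting the anchor's UUID so you can find it again next session:

```cpp
// Sketch: pin an object once and remember which anchor it lives on.
#include "CoreMinimal.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

// Hypothetical stand-ins for the Meta XR anchor API (Blueprint nodes
// "Create Spatial Anchor" / "Save Anchor") -- replace with the real calls.
FString CreateSpatialAnchorAt(UWorld* World, const FTransform& Pose);
void SaveAnchor(const FString& AnchorUuid);

void PinObjectToRoom(UWorld* World, const FTransform& RealWorldPose)
{
    // 1. Create an anchor at the pose where the object should live.
    const FString Uuid = CreateSpatialAnchorAt(World, RealWorldPose);

    // 2. Persist the anchor on-device so the runtime keeps it across reboots.
    SaveAnchor(Uuid);

    // 3. Persist the UUID on our side so we know which anchor to query later.
    FFileHelper::SaveStringToFile(
        Uuid, *(FPaths::ProjectSavedDir() / TEXT("PinnedObject.uuid")));
}
```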
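And the matching restore path on the next launch. Again, `LoadAnchorByUuid` / `GetAnchorPose` are hypothetical placeholders for the plugin's anchor query calls; the point is that the anchor's pose comes back resolved against the headset's map of the room, so the actor snaps to the same physical spot regardless of the session's random starting yaw:

```cpp
// Sketch: on startup, re-localize the saved anchor and snap the actor to it.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

// Hypothetical placeholders for the plugin's anchor query API.
bool LoadAnchorByUuid(const FString& AnchorUuid);
bool GetAnchorPose(const FString& AnchorUuid, FTransform& OutPose);

void RestorePinnedObject(AActor* PinnedActor)
{
    FString Uuid;
    const FString Path = FPaths::ProjectSavedDir() / TEXT("PinnedObject.uuid");
    if (!FFileHelper::LoadFileToString(Uuid, *Path))
    {
        return; // nothing has been pinned yet
    }

    // Re-localize the anchor, then place the actor on its pose.
    FTransform Pose;
    if (LoadAnchorByUuid(Uuid) && GetAnchorPose(Uuid, Pose))
    {
        PinnedActor->SetActorTransform(Pose);
    }
}
```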
 

MetaDevXR
Protege

I am running into a similar issue in Unity. The Meta docs are nearly useless for anything past a surface-level understanding. I have a training-lesson app where we want the freedom to walk around at certain times, but also specific lesson "hot spots" where we teleport the learner at certain lesson intervals. I am using the Meta all-in-one SDK, with the OVRManager tracking origin type set to Floor Level.

I've noticed that if I stay on the start position, or re-center, and then teleport the player at runtime via code, all teleportation is accurate. But if I walk a few feet from the origin, then all teleport locations are offset by the distance I am away from the start pose. What's even weirder is that the player rig the camera is attached to teleports correctly, but the eye camera itself stays locked to its relative world position. This makes a certain kind of sense, since the device is physically at a world position, but I fail to understand why the relative deviation from the origin isn't automatically added to a teleportation.

I had to do multiple builds and tests to figure this out. Maybe I have a setting off/on somewhere - I don't know. But there's got to be a simple way around it: likely some code setting the camera offset or auto-recentering, but the correct way to do it with the SDK is not obvious. If there isn't an easy way, my fallback is to just move the world (which is very small) around the player.
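For what it's worth, the offset you're describing is just rig-vs-head arithmetic, and compensating for it is engine-agnostic. Here is a minimal sketch (plain C++ with a toy vector type rather than Unity types, since I can't speak for the exact SDK call): when teleporting, subtract the head's horizontal offset from the target before moving the rig, so the player's head, not the tracking origin, lands on the hot spot:

```cpp
// Engine-agnostic sketch of offset-compensated teleportation.
// The head moves freely inside tracking space relative to the rig origin,
// so moving the rig to the target puts the *origin* there, not the player.
// Compensate by subtracting the head's horizontal offset from the target.
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// rigPos:  world position of the tracking-space origin (the player rig)
// headPos: world position of the HMD/eye camera
// target:  where the *player* should end up
Vec3 TeleportRigTo(Vec3 rigPos, Vec3 headPos, Vec3 target)
{
    // Horizontal distance the user has walked away from the rig origin.
    Vec3 headOffset = Sub(headPos, rigPos);
    headOffset.y = 0.0f; // leave Floor Level height handling untouched

    // Move the rig so that the head, not the origin, lands on the target.
    return Sub(target, headOffset);
}

int main()
{
    Vec3 rig{0, 0, 0}, head{1.5f, 1.7f, 0.5f}, target{10, 0, 10};
    Vec3 newRig = TeleportRigTo(rig, head, target);
    std::printf("move rig to (%.1f, %.1f, %.1f)\n", newRig.x, newRig.y, newRig.z);
    return 0; // head now sits at (10, 1.7, 10): exactly on target
}
```

In Unity terms, the same idea is to offset the rig's target position by the head's current horizontal displacement from the rig before applying the teleport.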