HOW DO YOU SET UP MIXED REALITY WHERE PLAYERS ARE IN THE SAME PLACE?

dredgeho2
Protege

Quest 2/3, Unreal Engine 5.4, Blueprints, listen server

I have my game set up with a listen server, and all players can join the game. In mixed reality, each player's headset and controllers should sit on top of the actual person in real life, but they don't. In fact the headsets show up rotated backwards (I had to fix this myself, and it doesn't happen in VR). The players also don't see the same things in the game.

I've asked support and they just send a canned, copy/paste reply that makes no sense. It's like they don't understand what I'm asking for, or they're too lazy to give a real response, so they paste the template, close the ticket, and move on to whatever they do wherever they are.

MIXED REALITY... you are in the same room with other people, playing a game. Except you can't play the game, because everyone sees the room differently and the players are not in the right positions. And nobody replies with any answers, the support is not support, and the examples don't work and won't even open in Unreal Engine.

7 REPLIES

jtriveri
Adventurer

You need to use shared spatial anchors. Headsets will not magically know the positions of other headsets. You can use shared anchors to calibrate headsets in the same space. Read the docs. They are your best friend!

https://developers.meta.com/horizon/documentation/unreal/unreal-spatial-anchors-sharing
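Roughly, the flow is: the host creates a spatial anchor, shares it with the other players' Meta user IDs, and sends the anchor's UUID over the listen server you already have; each client then downloads and localizes that shared anchor and realigns its VR origin to it. Below is a minimal sketch of the networking half only. The class and function names (AColocationGameState, SetSharedAnchorUuid) are placeholders I made up, and the Meta XR calls that actually create, share, and query the anchor are left as comments because their exact signatures depend on the plugin version; the replication plumbing itself is stock Unreal C++.

// Sketch: push the shared anchor's UUID from the host to every client over
// the existing listen server, so everyone knows which anchor to localize.

#include "GameFramework/GameStateBase.h"
#include "Net/UnrealNetwork.h"
#include "ColocationGameState.generated.h"

UCLASS()
class AColocationGameState : public AGameStateBase
{
    GENERATED_BODY()

public:
    // Host only: after creating a spatial anchor and sharing it with the
    // other users via the Meta XR plugin, store its UUID here so it
    // replicates to everyone.
    UFUNCTION(BlueprintCallable)
    void SetSharedAnchorUuid(const FString& InUuid)
    {
        if (HasAuthority())
        {
            SharedAnchorUuid = InUuid;
            OnRep_SharedAnchorUuid(); // the listen server host is a player too
        }
    }

protected:
    // Replicated to every connected client.
    UPROPERTY(ReplicatedUsing = OnRep_SharedAnchorUuid, BlueprintReadOnly)
    FString SharedAnchorUuid;

    UFUNCTION()
    void OnRep_SharedAnchorUuid()
    {
        // Each client now queries/downloads the shared anchor with this UUID
        // through the Meta XR plugin, waits for it to localize, and then
        // realigns its VR origin to it (see the realignment sketch further
        // down the thread).
    }

    // Normally lives in the .cpp; inlined here to keep the sketch in one file.
    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AColocationGameState, SharedAnchorUuid);
    }
};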

I never said magically. Typical response on this forum: you ask for help and people say "it's not magic, dude, you just need to read the docs." THE DOCS ARE GARBAGE. If the docs were so good and "my friend," then I wouldn't even be here asking this question... AGAIN and again... and I still don't have an answer.

And I've read that webpage multiple times and it does not tell you how to set this up. It's basic, general info that doesn't help. And to top it off, the shared anchors example for Unreal Engine doesn't compile or even open in Unreal Engine.

Support here is a joke. 

"Read the docs. They are your best friend!"  Yeah that isn't true at all. The docs are awful and don't give examples that work. I had one document on here where the images (example code) were tiny jpgs that you couldn't even read.  

And another example... where they want you to sign up for Photon just to see the example code. I don't need to sign up for Photon for my project. It's all done as a listen server, and people can already connect to each other and see each other. The problem is they're in the wrong place in the room and objects are not synced. And that link you gave doesn't fix this or explain it at all.

 

"You can use shared anchors to calibrate headsets in the same space." 

OK so... how?? HOW! Again, no examples. When I help people I show screenshots of my WORKING code so others can see how it works. People here are so secretive... either they don't want to help, or they pretend to help and don't know what they're talking about.

innit
Expert Protege

Mixed reality stuff is very much in development. Meta is mostly Unity engineers, Epic has minimal resources for VR, and the Meta plugin is still on 5.4.4, so maybe we'll see updates soon.

The docs there have sample code for sharing and downloading anchors, and they do tell you how to set anchors up. Once a headset creates or downloads an anchor, you can reposition the XR rig so that the anchor's position and rotation represent the origin of the world (you can think of moving the XR rig as the inverse of moving the virtual world relative to the physical world). That way you can match every headset's world origin to the same physical position and orientation.
I have no screenshots of working code because I work on a different platform, so it wouldn't be helpful to you; I'm just trying to point you in the right direction. People here are not lazy, they just don't have a lot of time. I can answer questions, but I can't set this up for you.
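To make the "reposition the XR rig" step concrete, here is a rough sketch of the transform math in Unreal C++. It is not the Meta XR API: ComputeAlignedOrigin and AlignVROriginToSharedAnchor are placeholder names, and you still have to get the shared anchor's current world-space transform from whatever your plugin version reports.

// Sketch: realign this player's VR origin so the shared anchor becomes a
// common world origin (or any agreed-upon target transform) for everyone.
// AnchorWorldTransform = the shared anchor's pose in this client's current
// world space; VROriginActor = the actor your camera/motion controllers are
// attached to (the "XR rig").

#include "GameFramework/Actor.h"

// New world transform for the VR origin such that the anchor ends up at
// SharedTargetTransform (identity = the world origin).
static FTransform ComputeAlignedOrigin(const FTransform& AnchorWorldTransform,
                                       const FTransform& CurrentOriginTransform,
                                       const FTransform& SharedTargetTransform = FTransform::Identity)
{
    // Anchor pose relative to the current VR origin. This is pinned to the
    // physical room and does not change when the rig moves.
    const FTransform AnchorRelativeToOrigin =
        AnchorWorldTransform.GetRelativeTransform(CurrentOriginTransform);

    // Solve AnchorRelativeToOrigin * NewOrigin == SharedTargetTransform.
    return AnchorRelativeToOrigin.Inverse() * SharedTargetTransform;
}

void AlignVROriginToSharedAnchor(AActor* VROriginActor, const FTransform& AnchorWorldTransform)
{
    const FTransform NewOrigin =
        ComputeAlignedOrigin(AnchorWorldTransform, VROriginActor->GetActorTransform());

    // Moving the rig by the inverse of the anchor pose is the "move the
    // virtual world relative to the physical world" idea described above.
    VROriginActor->SetActorTransform(NewOrigin);
}

Every headset that runs this against the same shared anchor ends up with the same mapping between the physical room and the virtual world, which is what puts players' avatars on top of the actual people.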

puzzabug
Protege

Hey! Frustration understood. This is the most basic building block of the future of most games, and it's very obfuscated and complex to implement. I'm trying to set it up this week; if you'd like to work on it together, find me on Discord!

I'm assuming Unreal doesn't get the 'building block' prefabs that the Unity SDK comes with. I really hope you guys in Unreal land get better support going forward : (

puzzabug
Protege

Reading the Unity documentation helped a lot with setting things up in Unreal, even though I don't use Unity. At the last Oculus dev conference we were given hopes of better documentation, which has still been a bit sketchy. It's hard to keep track of things and very hard for anyone just getting started. I'll be plowing through the process this week looking for that no-click room-space sync magic, and if I'm lucky, I'll share my success.