
Calibration of multiple VR headsets

Ackermanngue
Explorer

Multi-User VR Application Using Meta Quest and FMETP - Anchor Synchronization Issue

Hello everyone,

I'm working on a multi-user educational VR application designed to offer an immersive experience where several players interact together in virtual reality to solve puzzles.

Unfortunately, due to confidentiality reasons, I can't share the specific content of the application. I picked up this project from a previous developer, so I'm still getting familiar with some parts of the code. While I have experience developing AR/VR applications, most of my past projects were offline single-player applications. This is my first networked VR application, and I'm encountering some issues related to spatial anchor synchronization.

Application Overview:

The application consists of two parts (both grouped within the same Unity project):

1. Server Application (Windows)

Used by the game master to manage:

  • Scenario selection and launch.
  • Data synchronization between VR headsets.
  • Tracking player progress during puzzles.

2. Client Application (Meta Quest)

Used by the participants, who:

  • Connect to the server via the local network.
  • Receive information about the scenario and puzzles.
  • Interact in a shared virtual environment.

Networking Technology:

The application uses FMETP (Fast Message Exchange Transmission Protocol) to manage network communication between the server and the headsets. FMETP enables bidirectional RPC (Remote Procedure Call) communications.

Both the server and the VR clients are connected to the same local network.
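
For later steps it matters that messages on this channel are ultimately byte payloads. Here's a minimal sketch of one way to pack and unpack a pose (position and rotation, which is what the anchor workflow below exchanges) for a byte-based channel like this. The codec and its naming are my own, not part of FMETP; the actual FMETP send/receive calls are wired up elsewhere in the project:

```csharp
using System;
using UnityEngine;

// Hypothetical helper: packs a pose into bytes for a byte-based network channel.
// This only shows the serialization format (7 floats = 28 bytes); the transport
// calls themselves belong to FMETP and are not reproduced here.
public static class AnchorPoseCodec
{
    public static byte[] Pack(Vector3 position, Quaternion rotation)
    {
        var bytes = new byte[28];
        Buffer.BlockCopy(new[] {
            position.x, position.y, position.z,
            rotation.x, rotation.y, rotation.z, rotation.w
        }, 0, bytes, 0, 28);
        return bytes;
    }

    public static (Vector3 position, Quaternion rotation) Unpack(byte[] bytes)
    {
        var f = new float[7];
        Buffer.BlockCopy(bytes, 0, f, 0, 28);
        return (new Vector3(f[0], f[1], f[2]), new Quaternion(f[3], f[4], f[5], f[6]));
    }
}
```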

Workflow:

Here’s how the application is expected to work:

  1. The game master launches the server application (on Windows).
  2. Participants launch the client application on their Meta Quest headsets.
  3. The clients connect to the server and enter a lobby.
  4. Once everyone is connected and ready, the game master starts the scenario.

Important Detail:

While in the lobby, participants see an arrow on the floor to help them orient themselves correctly. This ensures that the scenario world is aligned consistently across all headsets.

Problem: Spatial Anchor Synchronization

The project relies on spatial anchors provided by the Meta Quest headsets to align the coordinate systems between players. Specifically, we use the "FLOOR" anchor created by the OVRSceneManager to establish a common reference point in the room.

However, there's a significant physical offset between headsets.

Even after scanning the room accurately and in the same way on each headset, there's a physical offset of 20 to 50 cm between players in the virtual environment. This is clearly unacceptable for a shared VR experience.
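
For reference, this is roughly how the FLOOR anchor gets looked up on each headset. A sketch assuming the room has been loaded by OVRSceneManager and the anchors carry an OVRSemanticClassification; component and label names are per my understanding of the Meta XR SDK, so verify them against your version:

```csharp
using UnityEngine;

// Sketch: find the scene anchor classified as FLOOR after OVRSceneManager
// has finished loading the room. Exact component names may vary by SDK version.
public class FloorAnchorLocator : MonoBehaviour
{
    public Transform FindFloor()
    {
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            var classification = anchor.GetComponent<OVRSemanticClassification>();
            if (classification != null && classification.Contains(OVRSceneManager.Classification.Floor))
                return anchor.transform; // pose of the FLOOR anchor in world space
        }
        return null; // room not scanned / scene not loaded yet
    }
}
```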

What I’ve Tried:

To solve this issue, I've implemented the following workflow (step 3 is sketched in code after the list):

  1. When a client connects to the server, the server asks the client to send its anchor information (position and rotation of the "FLOOR" anchor).
  2. The server broadcasts this anchor information to all connected clients.
  3. Each client applies the received anchor information to adjust its position and rotation.
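
Step 3 is where the actual alignment happens. A minimal sketch of the idea, assuming a yaw-only correction and a single "worldRoot" parent for all shared scenario content; the class and field names are my own, not from the OVR Plugin or FMETP:

```csharp
using UnityEngine;

// Sketch: on each client, place the root of all shared scenario content so that
// the reference FLOOR pose (received from the server) lands on this headset's
// own FLOOR anchor. Yaw-only, since tracking is already gravity-aligned.
public class AnchorAligner : MonoBehaviour
{
    public Transform worldRoot; // empty parent of all networked content (my own naming)

    public void Align(Transform localFloor, Vector3 refFloorPos, Quaternion refFloorRot)
    {
        // Yaw that rotates the reference floor's forward onto our local floor's forward.
        float deltaYaw = localFloor.eulerAngles.y - refFloorRot.eulerAngles.y;
        Quaternion yaw = Quaternion.Euler(0f, deltaYaw, 0f);

        worldRoot.rotation = yaw;
        // Choose the position so that refFloorPos maps exactly onto the local anchor.
        worldRoot.position = localFloor.position - yaw * refFloorPos;
    }
}
```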

Despite this, the offset still persists. The players don’t share exactly the same coordinate space, and there’s always a noticeable gap between their physical and virtual positions.

My Hypothesis:

I’ve considered whether it’s possible to share the entire room scan between the headsets, but I haven’t found a way to do this using the OVR Plugin. It seems like the headsets create their own spatial map independently, making it difficult to ensure a consistent coordinate space across devices.

Recap of Technical Details:

  • VR technology: Meta Quest (Oculus) with OVR Plugin.
  • Network: FMETP (Fast Message Exchange Transmission Protocol).
  • Multi-user context: VR headsets in the same room, connected to the same local network. 
  • Main objective: Synchronize anchors so that all headsets share the same virtual coordinate space.

Questions:

  1. Is there a better way to synchronize spatial anchors across headsets using the OVR Plugin?
  2. Can spatial maps (room scans) be shared between Meta Quest devices to ensure a consistent coordinate system?
  3. Am I handling anchor synchronization correctly, or are there improvements I should make in my workflow?

Thanks to everyone who reads this and offers any kind of answer! If you need more details, and they aren't confidential to the project, feel free to ask; I'll answer as soon as possible.



1 ACCEPTED SOLUTION


Ackermanngue
Explorer

Hello,

I found a practical workaround that significantly reduced the physical offset between headsets. While it may not be a perfect solution, it worked well enough for my use case, bringing the offset down to around 4-5 cm, which is acceptable for my project.

Since Meta Quest headsets rely on independent spatial maps, each headset creates its own coordinate space after scanning the room. Here's how I handled this issue:

Guardian / Room scan Setup:

  • I set each headset to use the "Stage" tracking mode to ensure that all devices start with a consistent floor-level reference (this can also be set from code; see the sketch after this list).
  • I then set up a room-scale Guardian and performed a room scan on each headset, replicating the Guardian and the scan as accurately as possible on each device.
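
For reference, the Stage tracking origin from the first bullet can also be forced from code via OVRManager; as far as I know this is the relevant property, but verify it against your installed SDK version:

```csharp
using UnityEngine;

// Sketch: force the Stage tracking origin at startup so every headset reports
// poses relative to the same floor-level origin. Property name per OVRManager;
// double-check it against your installed Meta XR SDK version.
public class StageTrackingSetup : MonoBehaviour
{
    void Start()
    {
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.Stage;
    }
}
```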

Rotation Offset Calculation:

  • After scanning, I chose one headset as the reference device.
  • I calculated the rotation difference between the reference headset and each of the other headsets (they all need to face the same direction in the real world).
  • I applied this rotation offset to the world to align the coordinate systems between the devices, as sketched below.
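
A minimal sketch of that last step, assuming a "worldRoot" parent for all shared content and using the headset's own head pose to measure yaw; all naming here is mine, not SDK API:

```csharp
using UnityEngine;

// Sketch: with everyone physically facing the same direction in the room,
// correct each non-reference headset by the yaw difference between its own
// head pose and the reference headset's head pose. The reference yaw is
// assumed to arrive over the network from the server.
public class YawOffsetCorrector : MonoBehaviour
{
    public Transform worldRoot;       // parent of all shared scenario content
    public Transform centerEyeAnchor; // from the OVRCameraRig

    // Called when the server broadcasts the reference headset's yaw.
    public void ApplyReferenceYaw(float referenceYaw)
    {
        float localYaw = centerEyeAnchor.eulerAngles.y;
        // Rotate the world around the headset so both devices agree on "forward".
        worldRoot.RotateAround(centerEyeAnchor.position, Vector3.up, localYaw - referenceYaw);
    }
}
```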

Unfortunately, I couldn't find a way to share the spatial maps or room scans directly between headsets using the OVR Plugin. It seems that the current Meta Quest system doesn't support this kind of functionality natively. Therefore, the solution requires manual Guardian setup and calculation of a rotation offset.

Anyway, this workaround isn't a perfect fix, but it drastically improved the alignment between headsets in my multi-user VR experience. I hope it helps you, or at least gives you an idea of how to tackle this issue, even if it isn't a "clean" way to do it.

If anyone knows a better way to do this, I'll be glad to read it!


3 REPLIES


Funnily enough, I took a completely different approach and also arrived at a 4-5 cm offset.
What I do is have someone create spatial anchors manually on each headset, making sure all anchors are created in the same order and at the same physical positions. I then parent every networked object to whichever anchor is closest, so that objects appear in roughly the same location and rotation on all headsets. If an object is moved closer to a different anchor, I automatically reparent it, roughly as in the sketch below.
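
A minimal sketch of that nearest-anchor reparenting idea, with all naming my own; the anchors list would hold the manually created anchors, in the same order on every headset:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: keep a networked object parented to whichever manually created
// spatial anchor is physically closest, reparenting when that changes.
// Only the idea comes from the reply above; none of this is library API.
public class NearestAnchorParent : MonoBehaviour
{
    public List<Transform> anchors; // the manually placed anchors, same order on every headset

    void Update()
    {
        Transform nearest = null;
        float best = float.PositiveInfinity;
        foreach (var anchor in anchors)
        {
            float d = (anchor.position - transform.position).sqrMagnitude;
            if (d < best) { best = d; nearest = anchor; }
        }
        if (nearest != null && transform.parent != nearest)
            transform.SetParent(nearest, worldPositionStays: true);
    }
}
```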

I make it sound simple now but this took weeks to iron out all the kinks.

Hello,

I just read your answer and it's a great workaround in my opinion. I would never have thought of it. Thanks for letting me know; I'll try this with my other projects to see which approach works best.

I can believe you 100% on the "I make it sound simple" part, because I grinded on my solution eight hours a day, every day, for a week 😂