Forum Discussion
JeffNik
4 years ago · MVP
Synchronizing virtual/real play space for multiple Quests
Now that I have my head wrapped around how to use the new AR Passthrough API (using Unity), I want to try a "shared" AR experience for two players in the same physical room using stage tracking mode - maybe hitting a floating ball back and forth (to keep it simple), or maybe a ping-pong game: real room, virtual ball and table.
How would you go about making sure that virtual objects appear in the same location for BOTH players in real time? How would you synchronize the origin and initial orientation of the two Quests?
3 Replies
- Anonymous
Seems like one player needs to be the authority on where the origin of the shared world is. You could have player 1 put their controller down on a marker (in a specific orientation) to set the origin and then have player 2 put their controller down on the same marker/orientation to synchronize. Then object positions could be communicated back and forth in the local coordinate system of that synchronization transform.
If you had a 3D printer you could print a "cup holder" for touch controllers that ensured a very specific position and orientation to guarantee close synchronization.
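The "communicate positions in the anchor's local frame" part can be sketched out with plain quaternion math. This is a framework-agnostic illustration, not Unity code - in Unity you would just use the anchor Transform's InverseTransformPoint/TransformPoint - and all names here are made up for the example:

```python
import math

# Quaternions as (w, x, y, z) tuples.

def q_mul(a, b):
    """Hamilton product a*b."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    """Conjugate (equals the inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_rot(q, v):
    """Rotate vector v by unit quaternion q."""
    w, x, y, z = q_mul(q_mul(q, (0.0, *v)), q_conj(q))
    return (x, y, z)

def to_anchor_local(anchor_pos, anchor_rot, world_pt):
    """Express a world-space point in the shared anchor's local frame."""
    d = tuple(w - p for w, p in zip(world_pt, anchor_pos))
    return q_rot(q_conj(anchor_rot), d)

def to_world(anchor_pos, anchor_rot, local_pt):
    """Convert an anchor-local point back into this headset's world frame."""
    r = q_rot(anchor_rot, local_pt)
    return tuple(p + x for p, x in zip(anchor_pos, r))
```

Player 1 would send `to_anchor_local(anchor1_pos, anchor1_rot, ball_world)` over the network; player 2 renders the ball at `to_world(anchor2_pos, anchor2_rot, received_local)`. Because both anchor poses were captured against the same physical marker, the ball should land on the same physical spot in each headset.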
I *DO* have a 3D printer. Cool idea... once I have the quaternion containing the difference transform... I would have to apply the inverse of it to the "non-authoritative" OVRCameraRig parent object, right?
- Anonymous
Yeah, I think that would work. If you offset and rotate player 2's camera rig so that the synchronization transform's world position and rotation matched player 1's, I think you'd have aligned worlds.
(Of course, I'm just closing my eyes and imagining, but it seems like it would work.)
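For what it's worth, the "apply the inverse of the difference transform" step can be written out explicitly. A hedged sketch in plain quaternion math (quaternions as (w, x, y, z) tuples) - in Unity you would bake the resulting position/rotation into the non-authoritative OVRCameraRig's parent object, and every helper name here is illustrative:

```python
import math

def q_mul(a, b):
    """Hamilton product a*b."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    """Conjugate (equals the inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_rot(q, v):
    """Rotate vector v by unit quaternion q."""
    w, x, y, z = q_mul(q_mul(q, (0.0, *v)), q_conj(q))
    return (x, y, z)

def align_offset(p1, q1, p2, q2):
    """Rigid transform (dp, dq) that carries the anchor pose as measured
    by player 2 (p2, q2) onto player 1's measurement (p1, q1)."""
    dq = q_mul(q1, q_conj(q2))          # difference rotation
    dp = tuple(a - b for a, b in zip(p1, q_rot(dq, p2)))
    return dp, dq

def apply_offset(dp, dq, pos, rot):
    """Apply the corrective transform to a pose (e.g. the rig parent)."""
    new_pos = tuple(a + b for a, b in zip(dp, q_rot(dq, pos)))
    return new_pos, q_mul(dq, rot)
```

Applying `(dp, dq)` to player 2's rig parent makes the anchor pose that player 2 measures coincide with player 1's, which is exactly the "aligned worlds" condition described above.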