We're developing apps that work in large real spaces with the Guardian system disabled. As such, we need to align the user's virtual space perfectly with their physical space. To do that, we're using a controller as the calibration point. This requires the user to place the controller on a spot in the real world that is mapped to the virtual world. However, for perfect calibration we also need the correct rotation, so that the in-game pawn & world are rotated to match the real world exactly.
We're doing this with one point right now, but we may change this method if we can't solve the rotation of the controller.
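For what it's worth, the one-point math (setting aside the controller-tilt problem below) can be sketched like this in Python; `single_point_calibration` is a made-up name, and it assumes you can reduce the controller pose to a floor-plane position plus a yaw angle:

```python
import math

def single_point_calibration(ctrl_pos, ctrl_yaw, marker_pos, marker_yaw):
    """Given the tracked controller pose (floor-plane position, yaw in
    radians) and the known marker pose expressed in virtual-world
    coordinates, return the yaw and translation to apply so the two
    line up."""
    # Rotation that carries the tracked yaw onto the marker's yaw.
    yaw_offset = marker_yaw - ctrl_yaw
    # Rotate the tracked position by that offset (about the vertical axis)...
    c, s = math.cos(yaw_offset), math.sin(yaw_offset)
    rx = c * ctrl_pos[0] - s * ctrl_pos[1]
    ry = s * ctrl_pos[0] + c * ctrl_pos[1]
    # ...then translate so it lands exactly on the marker.
    offset = (marker_pos[0] - rx, marker_pos[1] - ry)
    return yaw_offset, offset
```

The whole method stands or falls on `ctrl_yaw` being trustworthy, which is exactly the problem described below.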
The Rift controllers, when laid flat, calibrate fairly well. An arrow attached to them at 90 degrees comes out of the Rift controller straight and, when the controller is laid on the floor, lines up fairly well with it. The Quest controllers, however, are very different. Due to the difference in shape, any arrow coming off the controller ends up pointing into the floor at about a 45-degree angle, and from all observations it looks like the Quest controller's orientation is slightly different from the Rift's. An arrow rotated 90 degrees does not come out of the Quest straight, but seems to be tilted by at least 8 degrees based on rough measurements.
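If the tilt turns out to be a fixed offset, one workaround is to bake a correction rotation into the calibration step. A rough Python sketch of the idea; the 45-degree figure is the measured guess from above, not anything documented by Oculus:

```python
import math

# Assumed, not official: the pitch offset (degrees) between the Quest
# controller's reported orientation and a flat-on-the-floor pose,
# taken from rough measurement.
QUEST_PITCH_OFFSET_DEG = 45.0

def apply_pitch(v, degrees):
    """Rotate an (x, y, z) vector about the x axis by `degrees`
    (y up, z forward)."""
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    x, y, z = v
    return (x, c * y - s * z, s * y + c * z)

# The controller's reported forward arrow is tilted relative to the floor;
# pitching it back by the measured offset yields a floor-parallel arrow.
tilted = apply_pitch((0.0, 0.0, 1.0), -QUEST_PITCH_OFFSET_DEG)
leveled = apply_pitch(tilted, QUEST_PITCH_OFFSET_DEG)
```

The catch, of course, is that this only helps if the offset really is constant across controllers and firmware versions, which is exactly what the question below is trying to pin down.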
Is anyone aware of any documentation from Oculus that describes the controller's digital transform relative to its real-world orientation?
As an alternative, we're looking at calibrating based on both the right and left controllers being placed on calibration spots, but that would be a back-up choice; we'd prefer to calibrate based on a single controller.
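One nice property of the two-controller fallback: with two points on the floor, the yaw comes straight from the angle between the tracked controller-to-controller vector and the known marker-to-marker vector, so the controller's own orientation never enters into it. A rough Python sketch (`two_point_yaw` is a made-up name; points are floor-plane (x, z) pairs):

```python
import math

def two_point_yaw(tracked_a, tracked_b, marker_a, marker_b):
    """Yaw (radians, about the vertical axis) that rotates the tracked
    controller-to-controller vector onto the known marker-to-marker
    vector. All points are floor-plane (x, z) coordinates."""
    tracked_angle = math.atan2(tracked_b[1] - tracked_a[1],
                               tracked_b[0] - tracked_a[0])
    marker_angle = math.atan2(marker_b[1] - marker_a[1],
                              marker_b[0] - marker_a[0])
    return marker_angle - tracked_angle
```

The accuracy then depends only on positional tracking of the two controllers and the distance between the spots (farther apart means less angular error), not on their orientation quirks.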
Hi, did you find a solution? I would like to go for a different approach, using hand tracking only. I was thinking about shooting a ray and getting a hit with the floor plane where my real-world markers are. By making three hits at different locations you could make a perfect alignment. There must be a way to get the floor plane, as it is similar to Guardian creation.
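A sketch of the math that approach would need, assuming you can get a hand-ray origin and direction and that the floor is the horizontal plane y = 0. The second function is a standard 2D least-squares (Kabsch-style) rigid fit, which is why three hits are enough; all names here are made up:

```python
import math

def ray_floor_hit(origin, direction, floor_y=0.0):
    """Intersect a hand ray (origin, direction as (x, y, z)) with the
    horizontal plane y = floor_y; returns the (x, z) hit or None."""
    if abs(direction[1]) < 1e-9:
        return None  # ray parallel to the floor
    t = (floor_y - origin[1]) / direction[1]
    if t < 0:
        return None  # floor is behind the ray
    return (origin[0] + t * direction[0], origin[2] + t * direction[2])

def fit_rigid_2d(hits, markers):
    """Least-squares rotation + translation mapping the floor-plane hit
    points onto the known marker positions (three or more pairs)."""
    n = len(hits)
    hcx = sum(p[0] for p in hits) / n
    hcy = sum(p[1] for p in hits) / n
    mcx = sum(p[0] for p in markers) / n
    mcy = sum(p[1] for p in markers) / n
    # 2D Kabsch: accumulate dot and cross terms of the centred pairs.
    dot = cross = 0.0
    for (hx, hy), (mx, my) in zip(hits, markers):
        ax, ay = hx - hcx, hy - hcy
        bx, by = mx - mcx, my - mcy
        dot += ax * bx + ay * by
        cross += ax * by - ay * bx
    theta = math.atan2(cross, dot)
    c, s = math.cos(theta), math.sin(theta)
    tx = mcx - (c * hcx - s * hcy)
    ty = mcy - (s * hcx + c * hcy)
    return theta, (tx, ty)
```

With more than three hits the same fit averages out pointing error, which helps with the accuracy issue discussed below.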
Yes. Unfortunately, the company I work for decided to apply for a patent on the method we're using, so I can't detail it.
However, yours is sufficiently different from ours, so I can comment on it.
The only problem with your method is making sure the user is pointing their hands directly at the real-world marker when they create each calibration point. One suggestion: the best way to get a perfect alignment is to ensure that you know exactly where the player is in the real world and how they're oriented.
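To put a number on that pointing problem: the lateral error of a ray hit grows with distance to the marker, so e.g. a 2-degree aim error at 3 m already puts the hit roughly 10 cm off. A quick sketch of that arithmetic:

```python
import math

def position_error(angular_error_deg, distance_m):
    """Lateral error (metres) of a ray hit when the pointing direction
    is off by `angular_error_deg` at `distance_m` from the marker."""
    return distance_m * math.tan(math.radians(angular_error_deg))
```

This is an argument for having the user point at markers from close up, or for taking many hits and averaging.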