02-27-2024 06:33 AM
Hello everyone,
I've been using TryGetBoundaryPoints while calibrating multiple headsets in the same physical space without any external trackers.
I'm trying to find documentation on the points the function returns: where they are calculated, by whom (Unity, OpenXR, Meta), and whether they are sorted in any particular order. Sadly, I can't find anything; the only thing I know is that OpenXR didn't support it two years ago, and it feels like that's still the case.
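For context, here's roughly how I'm pulling the points on each headset (a minimal sketch of my setup, nothing more):

```
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class BoundaryDump : MonoBehaviour
{
    void Start()
    {
        // Grab whatever XR input subsystem is running (Oculus or OpenXR loader).
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetInstances(subsystems);

        foreach (var subsystem in subsystems)
        {
            var points = new List<Vector3>();
            if (subsystem.TryGetBoundaryPoints(points))
            {
                // On Quest I consistently get 4 points back (the play area rectangle),
                // but I can't find documentation on who computes them or how they're ordered.
                for (int i = 0; i < points.Count; i++)
                    Debug.Log($"Boundary point {i}: {points[i]}");
            }
        }
    }
}
```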
Based on my tests, I would say those four points are computed by a "largest inner rectangle" algorithm whose final step leaves 4 points with an arbitrary orientation. Am I wrong to think the rotation is random? Why not use the OpenXR XrExtent2Df width and height properties to align the longest boundary wall with one of those two axes, so that two roughly similar boundaries would end up with either a close-to-0° or a close-to-180° difference (or is that handled by OpenXR)?
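To illustrate what I mean by comparing orientations, this is roughly the check I run (a sketch, and it assumes the 4 points describe the rectangle in order, which is exactly the part I can't confirm): I take the yaw of the longest edge on each headset and compare them. If the orientation were normalized against the XrExtent2Df axes, I'd expect the difference between two similar boundaries to be close to 0° or 180°.

```
using System.Collections.Generic;
using UnityEngine;

public static class BoundaryOrientation
{
    // Yaw (degrees around Y) of the longest edge of a 4-point play area rectangle.
    // Assumes consecutive points are adjacent corners (the ordering I can't confirm).
    public static float LongestEdgeYaw(IList<Vector3> corners)
    {
        float bestLength = 0f;
        Vector3 bestDir = Vector3.forward;

        for (int i = 0; i < corners.Count; i++)
        {
            Vector3 edge = corners[(i + 1) % corners.Count] - corners[i];
            edge.y = 0f; // boundary points sit on the floor plane, ignore height
            if (edge.magnitude > bestLength)
            {
                bestLength = edge.magnitude;
                bestDir = edge.normalized;
            }
        }

        return Mathf.Atan2(bestDir.x, bestDir.z) * Mathf.Rad2Deg;
    }

    // Signed difference between two headsets' rectangles. If the orientation were
    // normalized, I'd expect this to come out near 0° or 180° (the edge direction
    // itself is ambiguous), but in practice it looks random to me.
    public static float YawDifference(IList<Vector3> a, IList<Vector3> b)
    {
        return Mathf.DeltaAngle(LongestEdgeYaw(a), LongestEdgeYaw(b));
    }
}
```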
On top of that, I would like to know if there's any plan for a "GetGeometry"-like function to obtain the full 256 boundary points instead of just the play area (is that also managed by OpenXR)?
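For reference, the kind of call I have in mind is the one from the legacy Oculus Integration's OVRBoundary (a sketch, assuming that package is in the project; I know this path may not be the way forward):

```
using UnityEngine;

public class OuterBoundaryDump : MonoBehaviour
{
    void Start()
    {
        // Legacy Oculus Integration call: returns the full user-drawn boundary polygon
        // (up to 256 points on Quest), not just the 4-corner play area rectangle.
        if (OVRManager.boundary != null && OVRManager.boundary.GetConfigured())
        {
            Vector3[] outer = OVRManager.boundary.GetGeometry(OVRBoundary.BoundaryType.OuterBoundary);
            Debug.Log($"Outer boundary has {outer.Length} points");
        }
    }
}
```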
Thank you all for your time!
Dan