I'm on the hunt for code (C, C++, or anything that looks like C) that will let my app find the intersection of the Touch controller beams with any cylinder projection "screens" created in the app — pretty much what Oculus Home does to let the user click on buttons and other UI controls in its large main screen.
I want to make something clear up front: it is very tempting to say "hey, that's just a ray-cylinder intersection — google for that." That doesn't quite work here: the Oculus Mobile SDK does not expose general cylinder construction parameters that could be fed into a generic intersection algorithm (although there is a very intriguing matrix inside ovrLayerCylinder2 that I've always suspected might be the key to solving this).
This is why I'm currently looking for a solution to which I can pass the ovrTracking data of a controller, the ovrLayerCylinder2 data of the screen, plus whatever else is needed, and which returns the (x, y) coordinates within the screen that represent the intersection point.
Do you happen to have any code that does something like that?