The default poke interaction in the Oculus Integration SDK only supports the index finger. I want to enable poking with the middle finger as well, so that a poked object can respond differently to each finger. For example, when the index finger pokes a virtual screen (a thin cube), the program should tell a remote PC to perform a left mouse click, and when the middle finger pokes the screen, it should simulate a right mouse click. My goal is to build a remote desktop.
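To illustrate the remote-PC side of the idea, here is a minimal sketch, assuming a hypothetical protocol in which the VR app sends the UDP message `b"index"` or `b"middle"` whenever a poke is detected; the message names, port, and mapping are my own assumptions, not part of the SDK, and the actual click injection is stubbed out:

```python
import socket

# Hypothetical protocol: the VR app sends b"index" or b"middle" over UDP
# whenever it detects a poke on the virtual screen.
BUTTON_FOR_FINGER = {
    b"index": "left",    # index-finger poke -> left mouse button
    b"middle": "right",  # middle-finger poke -> right mouse button
}

def button_for(message: bytes):
    """Map a received poke message to the mouse button to simulate.

    Returns "left", "right", or None for unrecognized messages.
    """
    return BUTTON_FOR_FINGER.get(message.strip().lower())

def run_listener(host="0.0.0.0", port=9999):
    """Listen for poke messages and dispatch clicks.

    The actual input injection is OS-specific and stubbed here; on the
    real PC you would replace the print with, e.g., pynput's
    mouse.Controller().click(...) or the Win32 SendInput API.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind((host, port))
        while True:
            data, _addr = sock.recvfrom(64)
            button = button_for(data)
            if button is not None:
                print(f"simulate {button} click")  # stub for real injection
```

The point of splitting out `button_for` is that the finger-to-button mapping stays testable and easy to extend (e.g., a thumb poke for middle-click) independently of the networking and OS input code.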
This may not be possible unless you use grip and trigger instead, with trigger as left click and grip as right. The Oculus Quest 2 controller only has three trackable inputs: one for the thumb, one for the index finger, and one for the whole bottom of the hand, since the grip button is pressed by your middle finger.