I added a Rigidbody, Box Collider, Hand Grab Interactable, Grabbable, Interactable Group View, and Interactable Debug Visual to a cube. When my hand reaches the cube and grabs it, the color of the cube changes, meaning that the cube has detected the grabbing behav...
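For reference, here is a minimal sketch of reacting to the grab in your own script instead of relying on Interactable Debug Visual. It assumes the Oculus Interaction SDK's `Grabbable` component and its `WhenPointerEventRaised` event; verify these names against your SDK version.

```csharp
using UnityEngine;
using Oculus.Interaction;

// Attach next to the Grabbable on the cube. Changes the material color
// while the cube is selected (grabbed) and restores it on release.
public class GrabColorFeedback : MonoBehaviour
{
    [SerializeField] private Grabbable _grabbable;  // the cube's Grabbable
    [SerializeField] private Renderer _renderer;    // the cube's MeshRenderer
    [SerializeField] private Color _grabbedColor = Color.green;

    private Color _originalColor;

    private void Awake()
    {
        _originalColor = _renderer.material.color;
    }

    private void OnEnable()  => _grabbable.WhenPointerEventRaised += OnPointerEvent;
    private void OnDisable() => _grabbable.WhenPointerEventRaised -= OnPointerEvent;

    private void OnPointerEvent(PointerEvent evt)
    {
        // Select/Unselect correspond to grab begin/end in the SDK's pointer model.
        if (evt.Type == PointerEventType.Select)
            _renderer.material.color = _grabbedColor;
        else if (evt.Type == PointerEventType.Unselect)
            _renderer.material.color = _originalColor;
    }
}
```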
The default OVR setting in the Oculus Integration SDK only supports the index finger. I want to enable the poking behaviour of the middle finger as well, so that the poked object behaves differently depending on which of the two fingers poked it. For example, when poking the v...
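Whatever SDK hook ends up reporting which finger made contact, the per-finger dispatch itself is straightforward. A sketch with a hypothetical `OnPoked` entry point; the `Finger` enum here mirrors `OVRHand.HandFinger`, but verify the exact type name in your SDK version:

```csharp
using UnityEngine;

// Sketch: route poke reactions by finger. The caller is responsible for
// determining which finger touched the object (hypothetical hook).
public class FingerPokeRouter : MonoBehaviour
{
    // Mirrors OVRHand.HandFinger; verify against your SDK version.
    public enum Finger { Thumb, Index, Middle, Ring, Pinky }

    public void OnPoked(Finger finger)
    {
        switch (finger)
        {
            case Finger.Index:
                Debug.Log("Index poke: primary action");
                break;
            case Finger.Middle:
                Debug.Log("Middle poke: secondary action");
                break;
            default:
                Debug.Log($"Poke by {finger}: ignored");
                break;
        }
    }
}
```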
Oculus Quest 2, using Oculus Link to play in the editor. I changed the adb path in Oculus Developer Hub and everything **bleep**ed up. If I click the play button, the Oculus interface shows "Loading" and gets stuck in that loading state. Even if I cancelled the...
I want to add a screen (actually a cube) in the scene, and when my hand pokes the screen, it will store the poked coordinate, in the cube's local coordinate system, in a variable. How do I get the coordinate when it is poked? I can't find any API to do this. Background: Gonna to ...
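A sketch of the conversion step, assuming you can obtain the poke contact point in world space from your poke interactor (how to obtain that point depends on the SDK version; only `Transform.InverseTransformPoint` below is standard Unity API):

```csharp
using UnityEngine;

// Attach to the "screen" cube. Converts a world-space poke point into
// the cube's local coordinate system and stores it.
public class PokeCoordinateRecorder : MonoBehaviour
{
    public Vector3 LastLocalPoke { get; private set; }

    // Call this with the world-space contact point reported by your
    // poke interactor.
    public void RecordPoke(Vector3 worldPoint)
    {
        // InverseTransformPoint maps world space -> this transform's local space,
        // accounting for the cube's position, rotation, and scale.
        LastLocalPoke = transform.InverseTransformPoint(worldPoint);
        Debug.Log($"Poked at local coordinate {LastLocalPoke}");
    }
}
```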
I applied a non-kinematic Rigidbody to the hand visual and the objects, and I also modified the HandPhysicsCapsules script to set isKinematic to false on the Rigidbody components of the generated capsule colliders. However, when I ran the demo, the hands thems...
I have the same problem. But I have turned off Air Link and reinstalled the Windows 10 system. My laptop can detect the device and read its files, and Oculus Developer Hub works well. However, the Oculus app can't detect the device. When I plugged in t...
I can't fully understand this, since many demos support detecting the poking behaviour of all five fingers, although they can't behave differently depending on which finger did the poking.