Forum Discussion
SadraMoghadam
2 years ago · Honored Guest
Controller-Driven Hand (Capsense) v66 is broken.
I have been trying to implement Capsense or controller-driven hands for four days, but I have not been successful. Initially, I worked with version 65 of the Meta XR SDK, but nothing seemed to work d...
mcgeezax4
2 years ago · Protege
If you are talking about OVRControllerDrivenHands not working in Unity play mode, then yes, that has been an issue since at least v64. I'm not sure it has ever worked since they deprecated OVRControllerHands and replaced it with OVRControllerDrivenHands.
It actually does work if you make a full APK build and deploy it to the headset, but not being able to use it in play mode pretty much makes it unusable for a VR app developer. The fact that it still technically works in a build, and that the issue has persisted for so long, leads me to believe they don't intend to fix it.
You can drop in the deprecated OVRControllerHands prefab instead, and that should work; at least it did when I tried it back in v64. However, it uses some older APIs that won't work with some features.
- SadraMoghadam · 2 years ago · Honored Guest
Thank you for the solution. I have previously tried using OVRControllerHands on version 65, and now on version 66, but encountered a different problem. When I move the player, the hands remain stationary in one position. For example, if I move the player to position (-8, 0, 0), the hands will appear at position (8, 0, 0). Do you know the reason for this issue and how I can fix it? I have tried almost everything to resolve this problem but couldn't identify the source.
- ryan-at-melcher · 2 years ago · Protege
I fixed what I believe is the same issue you are seeing, SadraMoghadam: the synthetic hands in the OVRControllerHands prefab are positioned using a local-space pose that does not account for the camera rig moving or rotating. This is why the hands "remain stationary" rather than follow the player as expected.
My fix was to make a copy of the HandVisual script, change its UpdateSkeleton function to use a world-space pose, and use this script in place of the original component.
// WorldSpaceHandVisual.cs

// Add a serialized field to reference OVRInteraction's transformer.
[SerializeField]
private TrackingToWorldTransformerOVR _trackingToWorldTransformer = null;

public void UpdateSkeleton()
{
    // ...
    if (_updateRootPose)
    {
        if (_root != null && Hand.GetRootPose(out Pose handRootPose))
        {
            // Original OVR code. This incorrectly uses the local-space pose
            // to position the hands. The result is the synthetic hands do not
            // follow the player's actual controller position as the player
            // moves and rotates.
            //_root.position = handRootPose.position;
            //_root.rotation = handRootPose.rotation;

            // Fixed code. This transforms the local-space pose to world
            // space before assigning it to the hands.
            Pose worldPose = _trackingToWorldTransformer.ToWorldPose(handRootPose);
            _root.SetPositionAndRotation(
                worldPose.position,
                worldPose.rotation
            );
        }
    }
    // ...
}

This solution also required I duplicate the HandVisualEditor script and set it up for the new WorldSpaceHandVisual. (This fix would've been a lot simpler had Oculus made the access level of members in their scripts protected by default!)
Here's a gist of my fixed files: https://gist.github.com/ryan-at-melcher/773d3073329134c43a60110e101f0557
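For anyone wondering what a "local-space to world-space" pose conversion like ToWorldPose actually does: it composes the tracking-space pose with the rig's own world pose. The world rotation is the rig rotation times the local rotation, and the world position is the rig position plus the local position rotated by the rig rotation. A minimal sketch of that math in plain Python (hand-rolled quaternions for illustration only, not the actual Meta SDK implementation):

```python
# Sketch of local-to-world pose composition, as performed by a
# tracking-to-world transformer. Quaternions are (w, x, y, z) tuples.

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q: q * (0, v) * conj(q)."""
    qv = (0.0, v[0], v[1], v[2])
    q_conj = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, qv), q_conj)
    return (x, y, z)

def to_world_pose(rig_pos, rig_rot, local_pos, local_rot):
    """Compose a tracking-space (local) pose with the rig's world pose."""
    # World rotation: rig rotation composed with the local rotation.
    world_rot = quat_mul(rig_rot, local_rot)
    # World position: rotate the local offset into the rig's frame,
    # then translate by the rig's world position.
    rotated = quat_rotate(rig_rot, local_pos)
    world_pos = tuple(rp + lp for rp, lp in zip(rig_pos, rotated))
    return world_pos, world_rot

# Example: the rig has moved to (-8, 0, 0) with no rotation, and the
# controller hand pose is 0.3 m in front of the tracking origin at
# head height. Without the fix, the hand stays near the origin; with
# the composition above, it correctly lands at (-8, 1.2, 0.3).
identity = (1.0, 0.0, 0.0, 0.0)
pos, rot = to_world_pose((-8.0, 0.0, 0.0), identity,
                         (0.0, 1.2, 0.3), identity)
print(pos)  # (-8.0, 1.2, 0.3)
```

This is exactly why the original code's direct assignment of handRootPose.position looks correct while standing at the spawn point (rig pose is identity, so local equals world) but drifts apart as soon as the player moves or turns.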
- CafeSingularity · 2 years ago · Protege
In addition to this, modifying "FromOVRControllerHandDataSource" to reference a world-space pose instead of a local-space pose seems to ensure that all interactions with OVRControllerHands work correctly.
- giorgos.ganias · 2 years ago · Explorer
What changes did you make in the FromOVRControllerHandDataSource script to actually make it work?