[Hand Tracking] New Oculus Start Button

DetectiveBunss
Honored Guest
Hello,

With the latest update (v17), I've noticed that there's now a Start button shown on the left hand, just like the Oculus button on the right hand.

I've also noticed that Waltz of the Wizards uses this button to open a settings menu.

My question is: which script manages this new button? I searched OVRHand.cs but didn't find anything.
I also tried OVRInput the way I would to read the Start button on a Touch controller, but it didn't seem to work.

I want to use this feature to create a menu for my game, but I haven't found any documentation about it so far.

Has anyone found where it is generated in Unity, and how we can customize it?

Have a good day everyone
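For what it's worth, one place to look besides OVRInput is the OVRHand component on the hand prefab: recent versions of the Integration expose an IsSystemGestureInProgress property that reports whether the system pinch-and-hold gesture is currently being performed. A minimal sketch, assuming a left-hand OVRHand reference is wired up in the Inspector (the class and field names are just for illustration):

using UnityEngine;

public class SystemGestureWatcher : MonoBehaviour
{
    // Assumed: the OVRHand component on the left hand prefab, assigned in the Inspector.
    [SerializeField] private OVRHand leftHand;

    void Update()
    {
        if (leftHand == null || !leftHand.IsTracked)
            return;

        // True while the user is performing the system (pinch-and-hold) gesture.
        if (leftHand.IsSystemGestureInProgress)
        {
            Debug.Log("Left-hand system gesture in progress");
        }
    }
}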
24 REPLIES

Anonymous
Not applicable

 

using UnityEngine;

public class HandStartButton : MonoBehaviour
{
    // Poll the hand-tracking Start button (left-hand system gesture) every frame.
    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.Start, OVRInput.Controller.Hands)) {
            Debug.LogWarning("Hand-Tracking Start Button DOWN");
        }

        if (OVRInput.GetUp(OVRInput.Button.Start, OVRInput.Controller.Hands)) {
            Debug.LogWarning("Hand-Tracking Start Button UP");
        }
    }
}
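As a side note, some Integration versions also define per-hand values (OVRInput.Controller.LHand and OVRInput.Controller.RHand), which may be worth trying if the combined Hands mask never fires.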

 


Reading the state of the hand tracking "start button" gesture does appear to be broken in Oculus Integration v35, using the OpenXR runtime.

Hi there, do you know if this is still broken? I'm having the same issue: the left-hand Start button pinch-and-hold gesture isn't working with hand tracking. I'm not getting any errors in logcat and have tried a couple of different things. Thanks!
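One way to narrow a problem like this down is to log both the OVRInput view and the OVRHand view of the gesture every frame and watch the output on the device with adb logcat (Unity's Debug.Log output shows up under the "Unity" tag). This is a diagnostic sketch, not a fix; the leftHand reference is an assumption you'd assign yourself:

using UnityEngine;

public class StartButtonDiagnostics : MonoBehaviour
{
    // Assumed: the left hand's OVRHand, assigned in the Inspector.
    [SerializeField] private OVRHand leftHand;

    void Update()
    {
        bool startHeld = OVRInput.Get(OVRInput.Button.Start, OVRInput.Controller.Hands);
        bool gestureInProgress = leftHand != null && leftHand.IsSystemGestureInProgress;
        bool handTracked = leftHand != null && leftHand.IsTracked;

        // Only log when something interesting happens, to keep logcat readable.
        if (startHeld || gestureInProgress)
        {
            Debug.Log("Start held: " + startHeld +
                      ", system gesture: " + gestureInProgress +
                      ", hand tracked: " + handTracked);
        }
    }
}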

LachoT
Explorer

Looks like this isn't working even in the sample scenes (version 38), which makes me think some project setting might be causing it to break. So far I haven't been able to find it, though.

Mayumichi
Explorer

Seems to be broken in v39 as well. Unity 2020.3.31f1 if that matters.

muskson
Honored Guest

Same here - 'Start Button' doesn't work with hand tracking, even though the same code works with regular controllers.

 

Even checking for 'Any Button' on 'Any Controller/Hand' doesn't return true when I complete the pinch gesture with the left hand and the loading-circle animation finishes.

 

Using Unity 2021.3.2f1 and Oculus Integration v39.
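For reference, the "any button on any controller" check described above would look something like this sketch (the enum values are standard OVRInput ones; the class name is just for illustration):

using UnityEngine;

public class AnyButtonProbe : MonoBehaviour
{
    void Update()
    {
        // Checks every button on every active controller or hand in one call.
        if (OVRInput.GetDown(OVRInput.Button.Any, OVRInput.Controller.All))
        {
            Debug.Log("Some button went down this frame");
        }
    }
}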

adam_breachvr
Honored Guest

Same here. Are there any chances of this ever getting fixed? 🤔

Honestly, if it is still an issue, you might as well just program your own gesture to do it. Here is a good tutorial on how to do that. Hope this helps.

https://www.youtube.com/watch?v=lBzwUKQ3tbw

If you know of a way to make a custom gesture trigger exactly when the system gesture triggers, I'm all ears. We have no idea what time and distance thresholds, smoothing, etc. they're using, so it's very hard to match it exactly. It's an imperfect workaround for sure, and why is this still not fixed? I just updated to Integration v0.43 and the issue is still there. Or is there a different, working way of reading the state aside from what's in this thread?

muskson
Honored Guest

I've sent a bug report to Oculus, and after 2 months of "thank you for your patience, we're still working on it!" they said they won't fix it. No explanation why, or if they're even going to work on it in the future, or anything. They just closed the ticket and that's that.

 

So I guess they just don't care? The whole situation is baffling to me: it seems like a very simple issue to fix, or at least to offer a good workaround for, but instead they've been ignoring it for almost two and a half years.

What are you doing that you need the system gesture trigger specifically? With your own gesture you can set the threshold and the hold time yourself if you'd like. I'm curious.
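To make the do-it-yourself route concrete, here is a rough sketch of a pinch-and-hold "menu" gesture in the spirit of the tutorial linked above. It relies on OVRHand.GetFingerPinchStrength; the threshold, hold time, field names, and event are all illustrative choices, not anything defined by the Integration, and they won't match the system gesture's internal tuning exactly:

using UnityEngine;
using UnityEngine.Events;

public class PinchHoldMenuGesture : MonoBehaviour
{
    // All of these are illustrative values and names, not part of the Oculus Integration.
    [SerializeField] private OVRHand hand;                 // e.g. the left OVRHand
    [SerializeField] private float pinchThreshold = 0.9f;  // pinch strength, 0..1
    [SerializeField] private float holdSeconds = 0.8f;     // how long the pinch must be held
    [SerializeField] private UnityEvent onMenuGesture;     // hook up your menu here

    private float pinchTimer;
    private bool fired;

    void Update()
    {
        if (hand == null || !hand.IsTracked)
        {
            pinchTimer = 0f;
            fired = false;
            return;
        }

        float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);

        if (strength >= pinchThreshold)
        {
            // Accumulate how long the pinch has been held above the threshold.
            pinchTimer += Time.deltaTime;
            if (!fired && pinchTimer >= holdSeconds)
            {
                fired = true;
                onMenuGesture?.Invoke();
            }
        }
        else
        {
            // Pinch released: reset so the gesture can fire again next time.
            pinchTimer = 0f;
            fired = false;
        }
    }
}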