Forum Discussion

Lahesol
Honored Guest
27 days ago

Finding a mentor to advise me on my work

Dear fellow developers.

Please accept my apologies in advance for not posting a specific question.

I am a PhD student in South Korea and am very interested in research on hand gesture classification using the Meta Armband.

I have completed my master's thesis defence on this topic, which covered the development of the hardware (an armband), the firmware, and the on-board AI, and I am now preparing to submit my paper.

I came across this forum because I was curious to know how these processes work at Meta, whether there are any APIs or SDKs in use, and how human-based refinement is carried out to improve the accuracy of action classification.

I am writing this post because, as part of the coursework, we are required to find at least one mentor, receive mentoring from them, and present on the experience.

Whilst exchanging private messages may not be in keeping with the spirit of this forum, I am looking for a mentor who could offer some brief advice on research life or technical development.

I would be very grateful if you could contact me via DM on my Instagram (@lahesol) or by email at ywyoo@hanyang.ac.kr.

Best regards,

1 Reply

  • Slayemin
    Start Partner

    You'll want to Google "Nimble Hand Tracking" to get started. There's a public GitHub repo available here:

    https://github.com/facebookresearch/hand_tracking_toolkit

    High-level overview: we use the headset's inside-out cameras to capture hand gestures and poses. The cameras operate in stereo, which gives hand depth. Finger and hand occlusion is handled by an ML model trained on a huge number of poses, which yields pretty accurate pose estimation even when the cameras have incomplete information.
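    To make the stereo-depth step concrete, here is a minimal sketch of how depth falls out of a rectified stereo pair via triangulation. The function name, the example pixel coordinates, and the camera parameters (500 px focal length, 6 cm baseline) are all illustrative assumptions, not values from any Meta device:

```python
# Illustrative sketch, NOT Meta's actual pipeline: depth of one keypoint
# from a calibrated, rectified stereo camera pair.
def stereo_depth(x_left, x_right, focal_length_px, baseline_m):
    """Depth (meters) of a point seen at pixel column x_left in the
    left image and x_right in the right image of a rectified rig."""
    disparity = x_left - x_right  # pixels; larger disparity = closer point
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    # Standard pinhole triangulation: Z = f * B / d
    return focal_length_px * baseline_m / disparity

# A fingertip detected at column 420 (left) and 400 (right):
depth = stereo_depth(420, 400, focal_length_px=500, baseline_m=0.06)
print(round(depth, 3))  # 500 * 0.06 / 20 = 1.5 m
```

    Real pipelines do this per joint for many keypoints at once, then let the ML model reconcile noisy or occluded measurements into a full hand pose.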

    You can compare the Meta Quest 3 hand tracking against the hand tracking offered by Ultraleap (formerly Leap Motion) - they use a mounted pair of IR cameras and an IR emitter to capture hand pose. I played around with their tech back in 2014-2015, and at the time it had some pretty serious drawbacks (e.g., it couldn't operate on sunny days or with other environmental IR interference, required extra cables, handled occlusion poorly, and had a limited FOV).

    You'll probably also want to play around with Unreal Engine 5 and explore hand tracking via the various plugins. I think there's a Meta SDK plugin offering hand-tracking support, and there's also been a plugin supporting Ultraleap (by Getnamo, if he's still around). You can dig into the plugin source code to see how basic gestures were implemented, and treat it as a starting point for adding your own custom hand gesture support.
