Hands only not working on Unity
On Unity 6 with v85 of the Meta SDK, I selected the "Hands Only" mode under "Hand Tracking Support", so controllers aren't supposed to be available in the build. However, controllers are still usable in my app. To compare, I tried the "Controllers Only" mode: that one correctly disables hand tracking and shows a prompt asking me to use controllers when I launch the app. Why doesn't the equivalent happen for hands, with hand tracking enforced and controllers disabled?
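For what it's worth, as I understand it the "Hand Tracking Support" setting is primarily a manifest declaration rather than a runtime switch, which might explain why controllers still work. You can check what your build actually declares by inspecting the merged AndroidManifest.xml; with "Hands Only" selected I'd expect entries along these lines (names recalled from Meta's hand tracking docs, so treat this as a sketch to verify against your own build output):

```xml
<!-- Expected entries in the merged AndroidManifest.xml when
     "Hand Tracking Support" is set to "Hands Only" -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<uses-feature
    android:name="oculus.software.handtracking"
    android:required="true" />
```

If `required` is `"true"` but controllers still respond, that would suggest the flag gates store capability requirements rather than disabling controller input at runtime.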
🙌 Workshop | Practical Hand Tracking in Unity

Are you ready to level up your app's immersion? Join Start Mentor Valem for a hands-on workshop building on last month's advanced hand tracking session, where you will work directly in a Unity project, implementing finger pinch, microgestures, and hand pose recognition. By the end of this session, you'll know how to read these inputs, integrate them step by step, and compare them to choose the right interaction method based on best practices and design goals. Join on Zoom.

💻 What’s New: Meta Horizon OS v85
V85 of the Meta XR SDK brings powerful new tools and improvements for your XR projects. Join Start Mentor Valem for a rundown of the top features and changes you can implement right away. The Spatial Test Framework provides automated testing capabilities for validating your experiences across multiple room configurations. Multiplayer Building Blocks now support Photon Fusion 2.1, enabling more robust and scalable multiplayer experiences. Discover advanced AI Building Blocks with image segmentation and bounding box support. Explore enhancements to hand and controller interactions, including unified ray visual indicators and improved locomotion controls. Get guidance on important deprecations and how to future-proof your projects. Hear directly from mentors, ask questions, and see how you can leverage the latest in Meta Horizon OS v85 to elevate your development. Join on Zoom.

Recentering gesture for impaired users
I am building a hand-control-based VR app for users with impaired mobility, and I have two challenges related to the pinch-and-hold gesture for recentering. For some users the gesture is exceedingly hard or impossible to perform (false negatives); for others, it is sometimes triggered accidentally (false positives). I understand Meta's desire to keep this gesture universal across all third-party apps. Unfortunately, it is not viable for all users, and I need a solution to this problem or my app will never ship. I am prepared to roll my own recentering system that manipulates the in-game view in response to a hardware "easy button" press. However, to implement this I still need to know when an actual pinch-and-hold gesture has been performed, so that I can properly recalibrate my own system. Unfortunately, I have not found any functioning API or telemetry that might hint that this has happened: I have tried several OpenXR and Meta Core APIs, but they all seem to be no-ops on the Quest 3. Can anyone recommend a solution? I'm using Unity 6.3, OpenXR, and the Meta Core SDK. I do not depend on any other Meta SDKs, but I am willing to add them if they solve this problem.
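Not an answer for hooking the system gesture itself, but as a starting point for the roll-your-own route: a minimal sketch of pinch-and-hold detection using the Meta Core SDK's OVRHand component, assuming OVRHand reports pinch state on your setup. The hold duration and the event wiring are placeholders; the `onPinchHold` event could just as well be driven by the hardware "easy button" instead.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Approximates a pinch-and-hold trigger with OVRHand from the Meta Core SDK.
// This does NOT observe the OS recenter gesture; it independently watches the
// index-finger pinch so a custom recenter routine can be driven from it.
public class PinchHoldDetector : MonoBehaviour
{
    [SerializeField] private OVRHand hand;            // left or right OVRHand
    [SerializeField] private float holdSeconds = 1.0f; // placeholder threshold
    public UnityEvent onPinchHold;                     // e.g. your recenter code

    private float heldFor;
    private bool fired;

    private void Update()
    {
        bool pinching = hand != null
            && hand.IsTracked
            && hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching)
        {
            heldFor += Time.deltaTime;
            if (!fired && heldFor >= holdSeconds)
            {
                fired = true;          // fire once per continuous hold
                onPinchHold.Invoke();
            }
        }
        else
        {
            heldFor = 0f;              // reset when the pinch releases
            fired = false;
        }
    }
}
```

For the false-positive/false-negative problem, `OVRHand.GetFingerPinchStrength` returns a 0..1 value, so the threshold and hold time could be made per-user settings rather than fixed constants.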
Stringscape: Turning Hand Distance into Pitch

I’m currently building a Quest experience called Stringscape, and I wanted to share the core idea and get feedback from other developers here. The concept is simple: you stretch a glowing “string” between your two hands, and the world-space distance between them controls the pitch.

Closer hands → higher pitch
Farther apart → lower pitch

The experience is designed to be more of a creative playground than a structured music tool. I’d love to hear your thoughts. It’s currently in Early Access on Quest as well, if anyone is curious to try it. Thanks!
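In case it helps other devs picture the mapping, the distance-to-pitch idea could be wired up roughly like this in Unity. This is not Stringscape's actual code; the transform references, distance range, and pitch range are all placeholder assumptions.

```csharp
using UnityEngine;

// Maps the world-space distance between two hand anchors to audio pitch:
// closer hands -> higher pitch, farther apart -> lower pitch.
public class StringPitch : MonoBehaviour
{
    [SerializeField] private Transform leftHand;
    [SerializeField] private Transform rightHand;
    [SerializeField] private AudioSource voice;          // a looping tone
    [SerializeField] private float minDistance = 0.05f;  // metres (placeholder)
    [SerializeField] private float maxDistance = 1.2f;
    [SerializeField] private float minPitch = 0.5f;
    [SerializeField] private float maxPitch = 2.0f;

    private void Update()
    {
        float d = Vector3.Distance(leftHand.position, rightHand.position);
        // InverseLerp gives 0 at minDistance and 1 at maxDistance;
        // invert it so a shorter "string" plays a higher note.
        float t = 1f - Mathf.InverseLerp(minDistance, maxDistance, d);
        voice.pitch = Mathf.Lerp(minPitch, maxPitch, t);
    }
}
```

One design question worth playing with: a linear distance-to-pitch map sounds different from a musical one, since perceived pitch is logarithmic; quantizing `t` to semitone steps might feel more instrument-like.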
➕ What’s New: v83 Meta SDK Update
V83 of the Meta XR SDK introduces major improvements to core XR functionalities. Join Start Mentor Valem to hear a rundown of the top features and changes you can implement in your project right away. See the new AI Building Blocks, enhancements to hand tracking, new locomotion options, physics-based hand interactions, and more. Join on Zoom.

Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses
Hi everyone! I’m a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I’d love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love the ability to have it added to my glasses, and I would be happy to provide feedback or participate in any beta or user-testing opportunities. I’ve also submitted this through support channels, but wanted to share here in case the team is gathering feedback.

3D Raycast from hands... where is it coming from?
I have a Unity Mixed Reality project and added the virtual hands tracking my real hands, and that's cool. I tossed in a zombie so I can try to get him to follow me, or shoot him (finger-gun special). Now I want to fix up the raycast from the hands/controllers so I can interact with game objects from afar... but I'm not even sure where that ray is coming from, so I can't find the script. My change would be to have the ray extend 50m and return a bunch of hits, each classified by target:

1. GameObject: a yellow disc appears [spawn an explosion effect on that zombie, if the ray is hitting its mesh]
2. Interactable: a blue disc appears [press trigger to activate the object]
3. Something from the 3D depth [depth raycast]: an orange disc appears [put a bullet hole on it]
4. A scene object [floor/wall]: a green [grounded] disc appears. Note that this may not be the final terminus -- if there are more models outside the window or wall, or you're picking something on the other side of a table, the code has to decide whether you can shoot through it.

All discs should sit flat against the object and be usable in code (you might be able to fire a laser through the window, but it won't go through a wall; code will decide whether that works). But I don't know where to look. The ray from the hands already does #3, but I don't know where in the asset tree it's coming from -- finding it would probably also tell me how to make those discs (is it a Gizmo, or a GameObject?). I figure I can add #1 and #2 from the cubes (though I haven't quite figured those out yet either) and #4 from the EnvironmentalRayCast [ERC], though I might have to iterate on that one because I don't see a "give me all the hits" option in the ERC.

Questions:
a) Where is this 3D ray coming from in the asset tree, so I can learn from it?
b) Is there a good way to scale the discs so they're always ~20px in diameter no matter how far away they are?
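While you track down the built-in ray, a rough sketch of the custom 50m multi-hit cast, plus a constant-apparent-size reticle for question (b), using plain Unity physics. Component names, the ray origin, and the size constant are placeholders, and the classification step is left as comments since it depends on your own components:

```csharp
using UnityEngine;

// Casts a long ray from a hand/controller pose and collects EVERY hit,
// so code downstream can decide what the ray is allowed to pass through.
public class HandRayAllHits : MonoBehaviour
{
    [SerializeField] private Transform rayOrigin;  // hand/controller pose
    [SerializeField] private float maxDistance = 50f;

    private void Update()
    {
        // RaycastAll returns all colliders hit, in no particular order,
        // so sort by distance before walking the list.
        RaycastHit[] hits = Physics.RaycastAll(
            rayOrigin.position, rayOrigin.forward, maxDistance);
        System.Array.Sort(hits, (a, b) => a.distance.CompareTo(b.distance));

        foreach (RaycastHit hit in hits)
        {
            // Classify here, e.g.:
            // if (hit.collider.TryGetComponent(out Zombie z)) { ... yellow }
            // A disc placed at hit.point can be laid flat against the surface
            // with Quaternion.LookRotation(hit.normal).
        }
    }

    // (b) Keep a reticle disc at roughly constant apparent size by scaling
    // it linearly with its distance from the camera: a world-space object
    // twice as far away must be twice as large to cover the same pixels.
    public static void FitReticle(Transform disc, Camera cam,
                                  float sizeAtOneMetre = 0.02f) // placeholder
    {
        float d = Vector3.Distance(cam.transform.position, disc.position);
        disc.localScale = Vector3.one * sizeAtOneMetre * d;
    }
}
```

Note this won't cover #3 (depth raycast against the real world), which needs the ERC rather than `Physics`; it only handles hits against Unity colliders.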
c) It looks like I need to change the shader on my zombie, but I'm not getting the terminology. He occludes fine (eventually I want the IRL table to occlude him), but I need to say "oh, the user picked the bellybutton -- spawn an explosion effect in his gut..." And how do you change shaders anyway? I can change materials from the editor, but how at runtime?
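On (c), a hedged sketch of the runtime side: `RaycastHit.point` already gives the exact world position that was picked (e.g. the bellybutton), so you don't need a shader to know where the hit landed; and a renderer's material or shader can be swapped from code. The component and asset names below are placeholders.

```csharp
using UnityEngine;

// Reacting to a ray hit at runtime: spawn an effect at the struck point and
// swap what the renderer draws. Reading renderer.material creates a per-object
// instance copy, which is what you want for a one-off change on one zombie
// (use sharedMaterial to change every object using that material).
public class HitReaction : MonoBehaviour
{
    [SerializeField] private Renderer body;
    [SerializeField] private Material hitMaterial;     // placeholder asset
    [SerializeField] private GameObject explosionPrefab;

    public void OnRayHit(RaycastHit hit)
    {
        // hit.point is the exact spot struck; orient the effect along the
        // surface normal so it sits flat against the body.
        Instantiate(explosionPrefab, hit.point,
                    Quaternion.LookRotation(hit.normal));

        // Whole-material swap:
        body.material = hitMaterial;

        // ...or change only the shader on the existing instance material,
        // e.g. (shader name is just an example):
        // body.material.shader = Shader.Find("Universal Render Pipeline/Lit");
    }
}
```

Terminology-wise: a Material pairs a shader with its settings (textures, colors); the shader is the GPU program itself. Swapping materials at runtime, as above, is usually simpler than swapping shaders.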