Voice input API
I'm using the Meta Horizon Worlds desktop editor for my VR game world. My main goal: a platform that moves up and down based on the player's voice loudness. Louder voice = higher platform; silence = platform stays low. My other goal: when the user sings or talks into the microphone, the game detects it and triggers something, e.g. the platform stops moving. How would I do this with the Meta Horizon Worlds desktop editor right now? Does a voice input API exist, or could I use the Custom Input APIs? Could I use an NPC as a middleman to detect the user talking and use that to trigger something? Thank you!
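For reference, the control logic the post describes can be sketched as a few pure functions. This is a sketch only: Horizon Worlds does not currently document a voice-loudness API, so the per-frame `loudness` value below is a hypothetical input (whatever a future API or an NPC-based workaround might supply), and all function names are illustrative, not SDK calls.

```typescript
// Exponential smoothing keeps the platform from jittering on every syllable.
// `alpha` closer to 1 reacts faster; closer to 0 moves more slowly.
function smoothLoudness(prev: number, raw: number, alpha = 0.2): number {
  return prev + alpha * (raw - prev);
}

// Map a smoothed loudness in [0, 1] to a platform height between minY and maxY.
function platformHeight(loudness: number, minY = 0, maxY = 5): number {
  const clamped = Math.min(1, Math.max(0, loudness));
  return minY + clamped * (maxY - minY);
}

// Hysteresis gate for "the player is talking/singing": switches on above 0.3,
// off below 0.1, so brief dips between words don't flicker the trigger.
function isSpeaking(loudness: number, wasSpeaking: boolean): boolean {
  return wasSpeaking ? loudness > 0.1 : loudness > 0.3;
}
```

In a world script you would run these once per frame and move the platform entity toward `platformHeight(...)`, using `isSpeaking(...)` for the stop/start trigger; the hysteresis thresholds are tuning values, not anything prescribed by the platform.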
(Unreal Engine) Pinch doesn't work properly in multiplayer.

I'm on Unreal Engine 5.5.4, developing against SDK v78.0. It's a three-player multiplayer case where a different Pawn class can be assigned to each player: for example, one player gets DefaultPawn, another IsdkSamplePawn, another IsdkSamplePawn2, and so on. Two of the Pawns use hand tracking. The code I referenced for spawning different Pawns per player is https://unreal.gg-labs.com/wiki-archives/networking/spawn-different-pawns-for-players-in-multiplayer — it targets Unreal 4, so I modified it for version 5. However, if the Player Controller class calls the GameMode's RestartPlayer from within DeterminePawnClass, the pinch grab action stops working in hand tracking. I suspect a bug in the Interaction SDK's RigComponent, but I wonder if anyone has solved this problem.
Does Meta Quest Pro Eye Tracking Support Eye-Crossing (Convergence) Detection?

I'm using the Movement SDK with the Meta Quest Pro to log gaze position and rotation. Standard eye movements (left-right, up-down) are captured accurately, but eye-crossing / convergence behavior never appears in the data, even when the user intentionally performs strong inward convergence. Before assuming it's a hardware or SDK limitation, I'm trying to understand how the Quest Pro's eye-tracking estimation model handles this: is the Quest Pro capable of detecting and outputting eye-crossing behavior, or is convergence intentionally smoothed or normalized to maintain naturalistic eye-movement patterns for social VR? If anyone has technical insight, documentation references, or experience with similar tests, I'd appreciate the clarification. Thanks!
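One way to sanity-check the logged data for this question is to compute the vergence angle between the two per-eye gaze rays. The sketch below is generic geometry, not a Movement SDK API; it assumes you have already converted the per-eye gaze rotations in your logs into forward direction vectors.

```typescript
// Estimate the convergence (vergence) angle from logged per-eye gaze
// directions, to check whether the SDK output ever reflects eye-crossing.
type Vec3 = { x: number; y: number; z: number };

function dot(a: Vec3, b: Vec3): number {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

function norm(a: Vec3): number {
  return Math.sqrt(dot(a, a));
}

// Angle between the two gaze rays, in degrees. Parallel gaze (focus at
// infinity) gives ~0 degrees; strong intentional crossing should give a
// clearly larger angle. If this stays near 0 in your logs even while the
// user crosses their eyes, the data is likely normalized upstream.
function vergenceAngleDeg(left: Vec3, right: Vec3): number {
  const cos = dot(left, right) / (norm(left) * norm(right));
  return (Math.acos(Math.min(1, Math.max(-1, cos))) * 180) / Math.PI;
}
```

Plotting this angle over a recording where the user deliberately converges would separate "the estimator never reports it" from "my logging discards it".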
Disabled or Suspended Facebook Account

Hi, I hope you are doing well and continuing your vital work safeguarding fairness and accountability in the digital age. My name is Mohammed Faiyazul Islam Hossain, and I am respectfully reaching out regarding the suspension of my Facebook account, registered under the username @LivestocksSaviour. This account is the official communication hub for Animal Rescuers Chittagong (ARC), a volunteer-based organization I founded to rescue injured animals, organize emergency responses, and promote compassion throughout Bangladesh. Every day, it connects rescuers, veterinarians, and citizens who believe in kindness as a shared responsibility. On July 16, 2025, my account was disabled without any notice or proof of a policy violation. Since then, I have followed every available appeal procedure through Meta's systems, yet all responses have been automated, circular, and inconclusive. The appeal links provided are not functional in my region, leaving me locked in a digital loop with no access to a fair, human-led review. This suspension has halted rescue operations, disrupted communication between volunteers, and prevented emergency coordination. Beyond the organizational impact, it has been personally devastating to see years of compassionate work vanish into an algorithmic void, without context, conversation, or humanity. Under Articles 16, 17, 18, and 22 of the GDPR, individuals have the right to accuracy, correction, restriction, and protection against decisions made solely by automated means. The current handling of my account appears to have violated these principles by enforcing a decision without transparent reasoning or manual assessment. I am therefore making a formal request for human intervention and oversight: a fair review by a qualified compliance officer within Meta, under your regulatory guidance.
This is not a plea for special treatment; it is a request for the application of fairness, accuracy, and dignity that every citizen deserves under European data protection law. I am confident that a brief human review will clarify that our activities are entirely humanitarian and educational, not commercial or harmful. Restoring access would immediately allow us to continue life-saving rescue work and restore public confidence in responsible data governance. Thank you deeply for your time, integrity, and compassion in considering this request. I trust that your intervention will help bring closure to an issue that technology alone cannot resolve. I remain available for any documentation or verification you may need. With sincere respect and gratitude.
@meta v81 broke mixed reality

v81 introduced a forced setting, "Double Tap for Passthrough", which in previous patches was optional and toggleable by the user. It is now forced ON with no way to disable it in the settings. This has broken SEVERAL of the mixed reality games that I play. Some games literally require that I touch my head, bring my hands to my head, or shake my head, which now forcefully PAUSES the game and renders the entire experience unplayable. This is not a subjective opinion: bring back the option to enable/disable "Double Tap for Passthrough". Forcing it on with no way to disable it ruins mixed reality games and limits the creativity of developers working in the MR space. I understand it may be useful for some applications; for others it is the opposite. Just bring back the toggle in the settings as it was in previous patches; there was no reason to remove it and certainly no reason to FORCE it on. Having it as a user-facing toggle was the perfect setup, and this patch is a regression. I thought this would be resolved with the new hot patch for v81 that I waited so eagerly for, but it wasn't; it must have slipped under the radar. I will be reposting this in every forum so that it may hopefully gather the attention of the appropriate developers at Meta. Thank you for otherwise creating an amazing product, but until this is patched, I and many others will still be unable to use the headset for various applications that were previously working perfectly.
Text-to-speech feature for users with congenital mutism

Hello, I would like to suggest adding a text-to-speech feature to WhatsApp specifically designed for deaf and mute users. This feature would allow users to type messages and have them spoken aloud through a customizable voice, with options to select characteristics like gender and emotional tone. Such personalization is crucial, as intonation can dramatically alter the meaning and intent of a message, something that text and emojis alone often fail to fully capture and that non-mute users overcome by using the voice-recording option. Currently, Meta AI on WhatsApp offers only text-based responses, which limits accessibility for users who rely on spoken communication. According to the World Health Organization, approximately 70 million people worldwide are deaf and mute, many of whom use smartphones and WhatsApp regularly. By integrating this feature, WhatsApp has the opportunity to significantly improve accessibility and inclusivity for a large and underserved population. Meta already has the required tools in its own ecosystem with Audiobox. This enhancement would transform communication for millions, allowing them to express themselves more fully and engage more naturally with others.
Is it possible to use hand tracking with just one hand?

Hi everyone! I've been experimenting with ways to give players a more lifelike, immersive experience. After attending a gesture recognition session yesterday, I started thinking about removing the hand controllers entirely, but there's a problem: without them, I can't let players move through hand tracking alone. My game includes boss fights that require precise movement to dodge attacks. So I'm wondering: is it currently possible to have one hand use gesture tracking while the other hand uses a controller joystick for movement?
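Whatever the runtime answer turns out to be, the locomotion half of this hybrid setup (one hand free for gestures, the other controller used only for movement) is straightforward to isolate in a pure function. The sketch below is an illustration under assumed conventions (x = strafe, y = forward on the stick, yaw in radians), not any Meta SDK API.

```typescript
type Vec2 = { x: number; y: number };
type Vec3 = { x: number; y: number; z: number };

// Convert joystick input into a world-space movement vector relative to the
// player's yaw, with a dead zone so a drifting stick doesn't cause constant
// movement while the other hand is busy with gesture tracking.
function joystickToMove(stick: Vec2, yaw: number, deadZone = 0.15): Vec3 {
  const mag = Math.hypot(stick.x, stick.y);
  if (mag < deadZone) return { x: 0, y: 0, z: 0 };
  const sin = Math.sin(yaw), cos = Math.cos(yaw);
  return {
    x: stick.x * cos + stick.y * sin, // strafe rotated into world space
    y: 0,                             // locomotion stays on the ground plane
    z: -stick.x * sin + stick.y * cos, // forward rotated into world space
  };
}
```

Keeping this logic independent of the input source means the gesture hand and the controller hand can feed the same movement code if the runtime allows mixed input.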
PC Link disconnects when stopping Play mode in Unity

Hi, I'm developing a Meta Quest app and recently ran into an issue with PC Link disconnecting. Until about a month ago, stopping Play mode in Unity did not cause PC Link to disconnect. But now, every time I stop Play mode, the connection drops.

Here is the exact behavior:
1. Connect the Meta Quest to my PC using PC Link
2. Press Play in Unity
3. The game streams correctly to the headset
4. Stop Play mode in Unity
5. PC Link immediately disconnects

Environment:
- Windows 10 (ESU applied)
- Meta Quest and PC Link both on v81
- USB-A to USB-C cable (USB 3.2, data-capable); also tested with another 2 m cable
- The issue started more than a month ago

Unity / XR setup: using both Unity OpenXR and Meta OpenXR.

I've already checked everything in the initial diagnostics checklist:
- PC Link version matches the headset's HorizonOS version
- PC meets minimum requirements (recommended specs or higher)
- Using the correct USB-C/USB-A port
- Cable supports USB 3.2 and data transfer
- Latest GPU drivers installed
- GeForce Experience / AMD Radeon Adrenalin uninstalled
- Not using onboard graphics

Despite all these steps, PC Link continues to disconnect the moment Unity stops Play mode. If anyone has seen this behavior or knows what might have changed recently, any advice would be greatly appreciated. Thanks!