Voice and Haptics SDKs not 16 KB aligned
I develop AR applications in Unity designed to run on both the Quest 3 and Android/iOS devices. It appears that a couple of the .aar/.so files in the Meta Voice SDK and Haptics SDK are not 16 KB aligned, which causes issues when deploying to the Google Play Store (using the all-in-one package updated to the latest at the time of this post, v81). Since both targets (Quest 3 and Android) share build settings, this complicates deploying an APK that contains these Meta libraries. The affected files are:

com.meta.xr.sdk.voice@1613052ece81\Lib\Telemetry\Plugins\SDKTelemetry.aar
com.meta.xr.sdk.voice@1613052ece81\Lib\Wit.ai\Lib\third-party\UnityOpus\Plugins\UnityOpus\Plugins\Androidunityopus.aar
com.meta.xr.sdk.haptics@cd1f215a823c\Plugins\libs\Android\ARM64\libhaptics_sdk.so

I did notice in the changelog for v81 that several libraries were updated for 16 KB alignment; it seems a few may have been missed. One important thing to note: the two .aar files DO NOT trigger the 16 KB warning inside the Unity editor at build time, so the issue will only be caught once you upload the build to the Google Play Store for deployment.
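One way to verify this outside the editor is to inspect the ELF program headers of each native library yourself (an .aar is a zip archive, so the .so files inside can be extracted first). Google Play's 16 KB requirement comes down to every PT_LOAD segment having a p_align of at least 16384. A minimal sketch, assuming 64-bit little-endian (arm64) libraries:

```python
import struct

def max_load_alignment(elf_bytes: bytes) -> int:
    """Largest p_align among PT_LOAD segments of a 64-bit
    little-endian ELF (0 if there are no PT_LOAD segments)."""
    if elf_bytes[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # ELF64 header layout: e_phoff at 0x20, e_phentsize/e_phnum at 0x36/0x38.
    e_phoff, = struct.unpack_from("<Q", elf_bytes, 0x20)
    e_phentsize, e_phnum = struct.unpack_from("<HH", elf_bytes, 0x36)
    align = 0
    for i in range(e_phnum):
        off = e_phoff + i * e_phentsize
        p_type, = struct.unpack_from("<I", elf_bytes, off)
        if p_type == 1:  # PT_LOAD; p_align sits at offset 0x30 in ELF64 phdrs
            p_align, = struct.unpack_from("<Q", elf_bytes, off + 0x30)
            align = max(align, p_align)
    return align

def is_16k_aligned(path: str) -> bool:
    """True when every loadable segment is aligned to >= 16384 bytes."""
    with open(path, "rb") as f:
        return max_load_alignment(f.read()) >= 16384
```

A non-compliant library will typically report a maximum p_align of 4096 (0x1000). Recent Android build-tools also offer `zipalign -c -P 16` to check 16 KB alignment of uncompressed .so entries inside a finished APK.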
Meta Voice SDK does not display intents in Understanding Viewer

Hey, I am working on a test project in Unity 6.1 with the Meta Voice SDK and Wit.ai integration. Sound detection, Wit configuration, and Wit training are all set up and work fine, but the Understanding Viewer does not list the (well-)trained intents (it shows 0 with confidence = 0), although it lists the trained entities correctly with confidence = 1. Without a proven link to intents, the response matcher does not work reliably: when adding it from the Viewer's listing by clicking on the intents, the response matcher shows an empty field for the intent. I do not use dictation, Conduit is checked, and the general procedure for voice control works (best with response events in App Voice Experience). I have also retrained the intents several times with at least 5 variations for each utterance, but the intents never show up in the Understanding Viewer. Any ideas?

MetaVoice SDK
I’m working on a VR application that uses the XR Interaction Toolkit for interactions. I have a scene where, on a button click, text is revealed and AI-generated voice is played using Meta Audio SDK TTS (Wit.ai). Everything works perfectly in Play mode: I can hear the text-to-speech audio without any issues. However, after I build the application and run it on the headset, the audio does not play at all. I need help understanding why this is happening and how I can fix it.

Voice SDK failed to compile with UE5.6 Oculus fork.
Hi there! I'm trying to integrate the Voice SDK into my project, but after installing it (with the batch file referenced in the documentation) I'm unable to compile the project or even start the editor. Is anyone using the Voice SDK with 5.6 successfully? Here are the errors I'm getting:

1>WitHttpEngineIncludes.cpp.obj : error LNK2019: unresolved external symbol "public: __cdecl FHttpResponseCommon::FHttpResponseCommon(class FHttpRequestCommon const &)" (??0FHttpResponseCommon@@QEAA@AEBVFHttpRequestCommon@@@Z) referenced in function "class SharedPointerInternals::TIntrusiveReferenceController<class FCurlHttpResponse,1> * __cdecl SharedPointerInternals::NewIntrusiveReferenceController<1,class FCurlHttpResponse,class FCurlHttpRequest &>(class FCurlHttpRequest &)" (??$NewIntrusiveReferenceController@$00VFCurlHttpResponse@@AEAVFCurlHttpRequest@@@SharedPointerInternals@@YAPEAV?$TIntrusiveReferenceController@VFCurlHttpResponse@@$00@0@AEAVFCurlHttpRequest@@@Z)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual class FString __cdecl FHttpResponseCommon::GetURLParameter(class FString const &)const " (?GetURLParameter@FHttpResponseCommon@@UEBA?AVFString@@AEBV2@@Z)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual class FString const & __cdecl FHttpResponseCommon::GetURL(void)const " (?GetURL@FHttpResponseCommon@@UEBAAEBVFString@@XZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual class FString const & __cdecl FHttpResponseCommon::GetEffectiveURL(void)const " (?GetEffectiveURL@FHttpResponseCommon@@UEBAAEBVFString@@XZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual enum EHttpRequestStatus::Type __cdecl FHttpResponseCommon::GetStatus(void)const " (?GetStatus@FHttpResponseCommon@@UEBA?AW4Type@EHttpRequestStatus@@XZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual enum EHttpFailureReason __cdecl FHttpResponseCommon::GetFailureReason(void)const " (?GetFailureReason@FHttpResponseCommon@@UEBA?AW4EHttpFailureReason@@XZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual int __cdecl FHttpResponseCommon::GetResponseCode(void)const " (?GetResponseCode@FHttpResponseCommon@@UEBAHXZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: virtual class TStringView<enum FGenericPlatformTypes::UTF8CHAR> __cdecl FHttpResponseCommon::GetContentAsUtf8StringView(void)const " (?GetContentAsUtf8StringView@FHttpResponseCommon@@UEBA?AV?$TStringView@W4UTF8CHAR@FGenericPlatformTypes@@@@XZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2019: unresolved external symbol "protected: void __cdecl FHttpResponseCommon::SetRequestStatus(enum EHttpRequestStatus::Type)" (?SetRequestStatus@FHttpResponseCommon@@IEAAXW4Type@EHttpRequestStatus@@@Z) referenced in function "protected: void __cdecl FHttpRequestCommon::HandleRequestFailed(void)" (?HandleRequestFailed@FHttpRequestCommon@@IEAAXXZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2019: unresolved external symbol "protected: void __cdecl FHttpResponseCommon::SetRequestFailureReason(enum EHttpFailureReason)" (?SetRequestFailureReason@FHttpResponseCommon@@IEAAXW4EHttpFailureReason@@@Z) referenced in function "public: virtual void __cdecl FCurlHttpRequest::FinishRequest(void)" (?FinishRequest@FCurlHttpRequest@@UEAAXXZ)
1>WitHttpEngineIncludes.cpp.obj : error LNK2019: unresolved external symbol "protected: void __cdecl FHttpResponseCommon::SetEffectiveURL(class FString const &)" (?SetEffectiveURL@FHttpResponseCommon@@IEAAXAEBVFString@@@Z) referenced in function "protected: void __cdecl FHttpRequestCommon::SetEffectiveURL(class FString const &)" (?SetEffectiveURL@FHttpRequestCommon@@IEAAXAEBVFString@@@Z)
1>WitHttpEngineIncludes.cpp.obj : error LNK2019: unresolved external symbol "protected: void __cdecl FHttpResponseCommon::SetResponseCode(int)" (?SetResponseCode@FHttpResponseCommon@@IEAAXH@Z) referenced in function "protected: void __cdecl FHttpRequestCommon::HandleStatusCodeReceived(int)" (?HandleStatusCodeReceived@FHttpRequestCommon@@IEAAXH@Z)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: static wchar_t const * const FHttpConstants::VERSION_2TLS" (?VERSION_2TLS@FHttpConstants@@2QEB_WEB)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "public: static wchar_t const * const FHttpConstants::VERSION_1_1" (?VERSION_1_1@FHttpConstants@@2QEB_WEB)
1>WitHttpEngineIncludes.cpp.obj : error LNK2001: unresolved external symbol "class TAutoConsoleVariable<bool> CVarHttpRemoveRequestUsingHttpThreadPolicyOnHttpThread" (?CVarHttpRemoveRequestUsingHttpThreadPolicyOnHttpThread@@3V?$TAutoConsoleVariable@_N@@A)
1>..\Plugins\Runtime\voicesdk-unreal\Binaries\Win64\UnrealEditor-Wit.dll : fatal error LNK1120: 15 unresolved externals

Thank you!

Recording sound from headset
Hi, I'm developing an app in Unity targeting Meta Quest headsets. I need to record the sound nearby, but I can't figure out how. I read the docs for the Voice SDK, but it seems it always sends the recordings to Wit.ai, which is not what I want. How would I go about simply reading sound data from the device, with no transcription or anything?
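For raw capture without Wit.ai, Unity's built-in UnityEngine.Microphone API is usually enough: Microphone.Start returns a looping AudioClip, you poll Microphone.GetPosition each frame, and copy the new samples out with AudioClip.GetData. The one subtle part is that the capture buffer is circular, so reads must handle the write head wrapping past the end. A sketch of that read logic, with the Unity calls reduced to a plain list and two indices:

```python
def read_new_samples(capture_buffer, last_pos, mic_pos):
    """Return the samples written since last_pos in a circular capture
    buffer. mic_pos is the current write head (what Unity's
    Microphone.GetPosition would return)."""
    if mic_pos >= last_pos:
        # No wrap: the new data is a single contiguous slice.
        return capture_buffer[last_pos:mic_pos]
    # The writer wrapped around: take the tail, then the head.
    return capture_buffer[last_pos:] + capture_buffer[:mic_pos]
```

In C# the same logic drives a per-frame loop that copies each chunk out of the looping AudioClip and appends it to your own growing buffer, which you can then save or process locally with no network involved.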
Using Phonemes in TTS with Meta Voice SDK: Wit.ai, Custom Models, or ONNX in Unity?

Hi all, I'm working on a Unity project where speech technology is central, and I'm facing a hurdle with Meta's Voice SDK. My primary need is to use phonemes directly for text-to-speech (TTS), but I've found that Wit.ai does not support direct IPA (International Phonetic Alphabet) input or return phoneme-level control for TTS.

Questions and discussion points:
- Is there any way to use Wit.ai for phoneme- or IPA-based TTS, or is this currently unsupported?
- Are there recommended approaches to integrating speech models based on self-supervised learning (like wav2vec 2.0, HuBERT, or WavLM) with Unity, either alongside or instead of Wit.ai?
- For complete control over TTS, especially phoneme-level synthesis, would it make sense to bypass Wit.ai entirely and run a model (converted to ONNX) for inference directly inside Unity?
- Have others run into similar limitations, and if so, what workflows or toolchains have worked best for you?

I'd appreciate any advice or examples for integrating more advanced or flexible TTS pipelines into Unity, especially ones compatible with IPA/phoneme input or utilizing state-of-the-art self-supervised models. Thanks!
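On the ONNX route: phoneme-driven TTS models (VITS-style architectures, for example) typically take a sequence of integer phoneme IDs as input, often with a blank/pad token interspersed between symbols. The vocabulary below is hypothetical, purely for illustration; a real model ships its own symbol table that must be used instead. The encoding step ahead of inference looks roughly like this:

```python
# Hypothetical IPA symbol table for illustration only; substitute the
# vocabulary exported with the actual model.
PHONEME_TO_ID = {"_": 0, "h": 1, "ə": 2, "l": 3, "oʊ": 4}
BLANK = PHONEME_TO_ID["_"]

def encode_phonemes(phonemes):
    """Map an IPA phoneme sequence to model input IDs, interspersing
    the blank token as VITS-style models commonly expect."""
    ids = [BLANK]
    for p in phonemes:
        if p not in PHONEME_TO_ID:
            raise KeyError(f"phoneme {p!r} not in the model's vocabulary")
        ids.extend([PHONEME_TO_ID[p], BLANK])
    return ids
```

The resulting ID array is what you would feed as the input tensor to an onnxruntime session (or to Unity's own inference engine, Sentis, inside the player); the model returns raw audio samples you can play back through an AudioSource, giving full phoneme-level control without Wit.ai in the loop.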