Voice SDK Dictation Not Working in Unity 2023.2+
I upgraded my project from Unity 2022 LTS to Unity 6. Here's my current setup:

- Unity 6000.0.27f1
- Meta XR Core SDK 69.0.1
- Meta XR Interaction SDK Essentials 69.0.1
- Meta XR Voice SDK 69.0.0

With this setup, the Core SDK requires the use of GameActivity, but the Voice SDK's dictation functionality requires Activity. As a result, dictation cannot run on Unity 2023.2+. This error occurs on Quest 3. Is this a known issue, or is there a possible workaround?

Here is some related information for reference:
https://communityforums.atmeta.com/t5/Unity-Development/Important-Unity-6-information/td-p/1252153
https://discussions.unity.com/t/voice-sdk-crash-on-activate-meta-quest-3/935246
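Not an official fix, but one workaround to try: starting with Unity 2023.1, the Android entry point is configurable under Player Settings > Android > Application Entry Point, and the same setting can be flipped from an editor script. The sketch below (the class name and menu path are my own invention) forces the classic Activity entry point that dictation appears to need; note that, per the thread above, the Core SDK expects GameActivity, so this may simply trade one failure for another.

```csharp
#if UNITY_EDITOR && UNITY_2023_1_OR_NEWER
using UnityEditor;

// Hypothetical helper: switches the Android build back to the classic
// Activity entry point, which the Voice SDK's dictation path expects.
public static class ForceActivityEntryPoint
{
    [MenuItem("Tools/Use Activity Entry Point")]
    public static void Apply()
    {
        // AndroidApplicationEntry is a flags enum; selecting only Activity
        // disables GameActivity for subsequent builds.
        PlayerSettings.Android.applicationEntry = AndroidApplicationEntry.Activity;
    }
}
#endif
```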
Error on Build with Meta Voice SDK (HideFlags.DontSave)

I am using the Meta Voice SDK and am having trouble building my project. I get the following errors (the first is the primary). Have others encountered this?

```
An asset is marked with HideFlags.DontSave but is included in the build:
Asset: 'Packages/com.meta.xr.sdk.voice/Lib/Wit.ai/Resources/witai.png'
Asset name: witai
(You are probably referencing internal Unity data in your build.)
UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)
```

Here's the second error:

```
Assertion failed on expression: 'm_LockCount == 0'
UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)
```

And the third error:

```
Building - Failed to write file: resources.assets
UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)
```

Does anyone know how to edit the HideFlags.DontSave flag (assuming that's the problem)? Or is there another way to address this? Thank you!
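A sketch of one thing to try, not a confirmed fix: if the DontSave flag on witai.png really is the cause, an editor script can clear it. This assumes the Voice SDK package has been embedded in the project (packages installed from a registry are read-only); the asset path is taken verbatim from the error above.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Sketch: clear the DontSave flag on the witai icon so the build pipeline
// stops rejecting it. Assumes the Voice SDK package is embedded (editable).
public static class ClearWitaiHideFlags
{
    const string AssetPath =
        "Packages/com.meta.xr.sdk.voice/Lib/Wit.ai/Resources/witai.png";

    [MenuItem("Tools/Clear witai HideFlags")]
    public static void Clear()
    {
        var asset = AssetDatabase.LoadMainAssetAtPath(AssetPath);
        if (asset == null)
        {
            Debug.LogWarning("witai.png not found at " + AssetPath);
            return;
        }
        asset.hideFlags = HideFlags.None;
        EditorUtility.SetDirty(asset);
        AssetDatabase.SaveAssets();
    }
}
#endif
```

It is also worth checking whether anything in the project references that Resources folder directly, since the build error hints at "referencing internal Unity data".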
Voice SDK does not work on the Quest but does in the Unity editor

Hi, I am making a game for the Quest with Unity, and for a trial I have a microphone with a button that you press and then say a specific phrase. For that I am using the Voice SDK, but I have a problem: it works in the editor but not once I compile and install the APK. I have the microphone permission, so that is not the issue. My GameObject that acts as a button is this, and the Microfono script is like this:

```csharp
private void OnTriggerEnter(Collider other)
{
    if (other.tag == "dedo" && !triggerActivated)
    {
        triggerActivated = true;
        appVoice.Activate(GetRequestEvents());
    }
}

// Set the events for the Voice
private VoiceServiceRequestEvents GetRequestEvents()
{
    VoiceServiceRequestEvents events = new VoiceServiceRequestEvents();
    events.OnInit.AddListener(OnInit);
    events.OnComplete.AddListener(OnComplete);
    return events;
}

// What happens when the button is pressed
public void OnInit(VoiceServiceRequest request)
{
    micro.GetComponent<Outline>().enabled = false;
    verde.SetActive(true);
    rojo.SetActive(false);
}

// What happens when the transcription is complete
public void OnComplete(VoiceServiceRequest request)
{
    triggerActivated = false;
    verde.SetActive(false);
    rojo.SetActive(true);
}
```

The OnTriggerEnter method does work: I tried placing verde.SetActive(true); before the appVoice activation, and it did activate a green light when touching the microphone. The GameObject also has a child GameObject with the response script (maybe it could go on the parent).

Any ideas why this might work in the editor and not on the Quest 3? It does not even fire the OnInit event, because this line:

```csharp
micro.GetComponent<Outline>().enabled = false;
```

should remove the outline around the object, and it does not.

Some info that I forgot and may be useful: I use Meta XR All-in-One SDK version 62 and Unity 2022.3.
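Worth ruling out, even though the post says the microphone permission is set: on Android 6+, a manifest entry alone is not enough; the RECORD_AUDIO permission must also be granted at runtime. A minimal check using Unity's standard Android permission API (the component name is my own):

```csharp
using UnityEngine;
#if UNITY_ANDROID && !UNITY_EDITOR
using UnityEngine.Android;
#endif

// Sketch: request the microphone runtime permission on device before the
// voice service is ever activated. In the editor this compiles to a no-op.
public class MicPermissionCheck : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
#endif
    }
}
```

Also watch adb logcat while triggering the voice activation on device; editor/device differences like this usually leave an exception or a permission denial in the log.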
Voice SDK V56 not working on Quest (Android)

Hello dear friends, I recently downloaded the Phanto project to test out the new room-scanning capabilities. I added Voice SDK v56 to this project, and it works great in the editor. However, when I make an Android build for the Quest 3, it very much seems that none of the Android-specific code works properly: it doesn't initialize the voiceServiceImpl or the logger. Since it works on PC without any issues, I'm sure it's a small fix, but for the life of me I can't figure it out.

Code in InitSDK:

```csharp
#if UNITY_ANDROID && !UNITY_EDITOR
VLog.I("Android specific code");
if (UsePlatformIntegrations)
{
    VLog.I("We are using platform integrations");
    Debug.Log("Checking platform capabilities...");
    var platformImpl = new VoiceSDKImpl(this);
    platformImpl.OnServiceNotAvailableEvent += () => RevertToWitUnity();
    platformImpl.Connect(PACKAGE_VERSION);
    platformImpl.SetRuntimeConfiguration(RuntimeConfiguration);
    if (platformImpl.PlatformSupportsWit)
    {
        voiceServiceImpl = platformImpl;
        VLog.I("Supports WIT");
        if (voiceServiceImpl is Wit wit)
        {
            wit.RuntimeConfiguration = witRuntimeConfiguration;
            VLog.I("Setting Runtime Config");
        }
        voiceServiceImpl.VoiceEvents = VoiceEvents;
        voiceServiceImpl.TelemetryEvents = TelemetryEvents;
    }
    else
    {
        VLog.I("Platform registration indicated platform support is not currently available.");
        Debug.Log("Platform registration indicated platform support is not currently available.");
        RevertToWitUnity();
    }
    if (voiceServiceImpl == null)
    {
        VLog.I("Voice Service Impl Failed");
    }
}
else
{
    VLog.I("No platform **bleep**");
    RevertToWitUnity();
}
#else
```

Logcat: As you can see, it doesn't log anything after 'We are using platform integrations', so something goes wrong there. Any help would be appreciated.
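Since logging stops right after "We are using platform integrations", the next call, either new VoiceSDKImpl(this) or platformImpl.Connect, is the likely failure point. A debugging sketch (hypothetical, reusing only the identifiers from the snippet above) that wraps each step so any exception reaches logcat instead of silently killing the init path:

```csharp
// Debugging sketch: wrap each platform-integration step so a thrown
// exception is surfaced in logcat rather than aborting InitSDK silently.
try
{
    var platformImpl = new VoiceSDKImpl(this);
    Debug.Log("VoiceSDKImpl constructed");
    platformImpl.OnServiceNotAvailableEvent += () => RevertToWitUnity();
    platformImpl.Connect(PACKAGE_VERSION);
    Debug.Log("Connect returned");
    platformImpl.SetRuntimeConfiguration(RuntimeConfiguration);
    Debug.Log("Runtime configuration set");
}
catch (System.Exception e)
{
    // Unity routes Debug.LogException to logcat on Android builds.
    Debug.LogException(e);
    RevertToWitUnity();
}
```

If the exception points at the platform service binding, falling back to RevertToWitUnity() unconditionally on device would at least confirm whether the Wit-only path works outside the editor.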