Unity with Azure Speech SDK



When I use the Azure Speech SDK in Unity and test it on my computer, it works fine: I can speak, and it recognizes and responds to speech as expected.

When I build for Android or iOS, it does not work. On both platforms it reaches the point where recognition should start but never actually recognizes anything, and if I just trigger a simple speech output from the SDK, nothing comes out either.

How can I fix this?

Here is the code, which works in the Unity editor and in a Windows build:


---------------------------------------------
void Start()
{
    anim = gameObject.GetComponent<Animator>();
    var config = SpeechConfig.FromSubscription("xxxxxxxxxxxx", "northeurope");

    cred(config);
}

async Task cred(SpeechConfig config)
{
    texttest.GetComponent<Text>().text = config.ToString();
    var audioConfig = AudioConfig.FromDefaultMicrophoneInput();
    var synthesizer2 = new SpeechRecognizer(config, audioConfig);
    var result = await synthesizer2.RecognizeOnceAsync();
    var synthesizer = new SpeechSynthesizer(config);
    SynthesizeAudioAsync(config, synthesizer2, result);
}

async Task SynthesizeAudioAsync(SpeechConfig config, SpeechRecognizer synthesizer2, SpeechRecognitionResult result)
{
    texttest.GetComponent<Text>().text = "syn1 " + result.Text;
    OutputSpeechRecognitionResult(result);
    if (result.Reason == ResultReason.RecognizedSpeech)
    {
        if (result.Text == "xx" || result.Text == "xx" || result.Text == "xx." || result.Text == "xx")
        {
            var synthesizer = new SpeechSynthesizer(config);

            anim.Play("helloAll", 0, 0);
            await synthesizer.SpeakTextAsync("Helloxx");
            chooseTopic(config, synthesizer, result.Text);
        }
    }
}

On iOS, this is what it prints in the console:


--------------------------------------------------
CANCELED: Did you set the speech resource key and region values?speakTest:OutputSpeechRecognitionResult(SpeechRecognitionResult)<SynthesizeAudioAsync>d__10:MoveNext()

CANCELED: ErrorDetails=0x15 (SPXERR_MIC_ERROR)
[CALL STACK BEGIN]

3   UnityFramework                      0x0000000109336810 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxMicrophonePumpBase9StartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 756
4   UnityFramework                      0x000000010931c010 _ZN9Microsoft17CognitiveServices6Speech4Impl25ISpxDelegateAudioPumpImpl9StartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 84
5   UnityFramework                      0x000000010932cc0c _ZN9Microsoft17CognitiveServices6Speech4Impl27CSpxAudioPumpDelegateHelperINS2_29CSpxDelegateToSharedPtrHelperINS2_13ISpxAudioPumpELb0EEEE17DelegateStartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 220
6   UnityFramework                      0x0000000109325e1c _ZN9Microsoft17CognitiveServices6Speech4Impl41ISpxAudioSourceControlAdaptsAudioPumpImplINS2_32CSpxMicrophoneAudioSourceAdapterEE9StartPumpEv + 304
7   UnityFramework                      0x0000000109325664 _ZN9Microsoft17CognitiveServices6Speech4Impl41ISpxAudioSourceControlAdaptsAudioPumpImplINS2_32CSpxMicrophoneAudioSourceAdapterEE10StartAudioENSt3__110shared_ptrINS2_12ISpxNotifyMeIJRKNS7_INS2_15ISpxAudioSourceEEERKNS7_INS2_14ISpxBufferDataEEEEEEEE + 184
8   UnityFramework                      0x00000001093221d4 _ZN9Microsoft17CognitiveServices6Speech4Impl34ISpxAudioSourceControlDelegateImplINS2_29CSpxDelegateToSharedPtrHelperINS2_22ISpxAudioSourceControlELb0EEEE10StartAudioENSt3__110shared_ptrINS2_12ISpxNotifyMeIJRKNS9_INS2_15ISpxAudioSourceEEERKNS9_INS2_14ISpxBufferDataEEEEEEEE + 220
9   UnityFramework                      0x00000001094a41f4 _ZN9Microsoft17CognitiveServices6Speech4Impl28CSpxSessionAudioSourceHelperINS2_20CSpxAudioSessionShimEE16StartAudioSourceERKNSt3__110shared_ptrINS2_15ISpxAudioSourceEEE + 504
10  UnityFramework                      0x00000001094a0dbc _ZN9Microsoft17CognitiveServices6Speech4Impl28CSpxSessionAudioSourceHelperINS2_20CSpxAudioSessionShimEE22EnsureStartAudioSourceEv + 124
11  UnityFramework                      0x0000000109408dcc _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession14StartAudioPumpENS3_15RecognitionKindENSt3__110shared_ptrINS2_12ISpxKwsModelEEE + 2300
12  UnityFramework                      0x0000000109406760 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession16StartRecognizingENS3_15RecognitionKindENSt3__110shared_ptrINS2_12ISpxKwsModelEEE + 616
13  UnityFramework                      0x0000000109406098 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession18RecognizeOnceAsyncERKNSt3__110shared_ptrINS3_9OperationEEENS5_INS2_12ISpxKwsModelEEE + 464
14  UnityFramework                      0x0000000109424d4c _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession9OperationC2ENS3_15RecognitionKindE + 1040
15  UnityFramework                      0x0000000109420af4 _ZN9Microsoft17CognitiveServices6Speech4Impl7SpxTermINS2_21ISpxAudioStreamReaderEEEvRKNSt3__110shared_ptrIT_EE + 2004
16  UnityFramework                      0x0000000109354c28 _ZNSt3__113packaged_taskIFvvEEclEv + 96
17  UnityFramework                      0x0000000109354bb4 _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService4Task3RunEv + 32
18  UnityFramework                      0x00000001093566fc _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService6Thread7RunTaskINSt3__14pairINS6_10shared_ptrINS3_4TaskEEENS6_7promiseIbEEEEEEvRNS6_11unique_lockINS6_5mutexEEERNS6_5dequeIT_NS6_9allocatorISJ_EEEE + 332
19  UnityFramework                      0x0000000109354d8c _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService6Thread10WorkerLoopENSt3__110shared_ptrIS4_EE + 216
[CALL STACK END]

This problem is caused by configuration issues on iOS and Android. Check the configuration documentation for Android and iOS.

There is a GitHub repo that addresses a similar problem. Take a look:

https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/csharp/unity/from-microphone
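
Independent of which sample you end up following, it also helps to surface the cancellation details on the device itself, because the managed code only sees a Canceled result while the underlying error (such as the SPXERR_MIC_ERROR above) otherwise shows up only in the native console. The question already calls an OutputSpeechRecognitionResult helper; a version of it that also prints cancellation details could look roughly like this (a sketch, not the original implementation; it reuses the texttest Text object from the question):

--------------------------------------------------
// Sketch only: show why RecognizeOnceAsync() came back canceled.
// Assumes `texttest` is the same Text GameObject used in the question.
void OutputSpeechRecognitionResult(SpeechRecognitionResult result)
{
    if (result.Reason == ResultReason.Canceled)
    {
        var cancellation = CancellationDetails.FromResult(result);
        // ErrorCode/ErrorDetails carry the SDK error, e.g. the microphone failure.
        texttest.GetComponent<Text>().text =
            "CANCELED: Reason=" + cancellation.Reason +
            " ErrorCode=" + cancellation.ErrorCode +
            " Details=" + cancellation.ErrorDetails;
    }
    else if (result.Reason == ResultReason.NoMatch)
    {
        texttest.GetComponent<Text>().text = "NOMATCH: speech could not be recognized.";
    }
}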

The iOS library is developed in Objective-C and does not support any JavaScript libraries. Likewise, on Android, Cordova is used to handle JavaScript libraries. Neither of these is supported on the other platform. So check the configuration of your development platform and the library languages it supports.

Cordova Platforms : android 7.1.4 ios 4.5.5
Ionic Framework : ionic-angular 3.9.2
iOS: Objective-C

The issue you are reporting (SPXERR_MIC_ERROR) indicates that the microphone is not configured correctly. Try the Unity quickstart first, https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/csharp/unity/from-microphone, verify that it works on your Android and iOS devices, and then apply its configuration to your app.
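
In practice, the most common cause of SPXERR_MIC_ERROR on a device is that the app never obtained microphone access: on Android the RECORD_AUDIO permission has to be requested at runtime, and on iOS a Microphone Usage Description has to be set in Unity's Player Settings so that it is written into Info.plist. A minimal sketch of that runtime check, assuming a standalone MonoBehaviour (the class name is illustrative, not from the sample):

--------------------------------------------------
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;   // runtime permission API (Android 6.0+)
#endif

public class MicPermissionCheck : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Without this runtime request the Speech SDK cannot open the
        // microphone and reports SPXERR_MIC_ERROR.
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
#elif UNITY_IOS
        // The iOS prompt only appears if a Microphone Usage Description
        // is set in Player Settings (it ends up in Info.plist).
        if (!Application.HasUserAuthorization(UserAuthorization.Microphone))
        {
            Application.RequestUserAuthorization(UserAuthorization.Microphone);
        }
#endif
    }
}

Both requests are asynchronous, so only start recognition after the user has actually granted access; the quickstart linked above shows this permission handling in the context of a full recognizer setup.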
