First, the iFLYTEK Open Platform:
http://www.xfyun.cn/
Register, log in and create a new app.
Because this project only implements voice dictation, check only the standalone Voice Dictation SDK in the SDK Download Center.
For the development platform select iOS, for the application select the app that will use voice dictation, then click the "Download SDK" button.
The program will use the AppID, and the SDK you import must be the one associated with your application; the downloaded SDK archive is named with the AppID as a suffix.
Second, project configuration
Official document: http://www.xfyun.cn/doccenter/iOS
1. Add the static libraries
Add the iOS libraries required by the SDK to your project, and make sure libz.dylib and CoreTelephony.framework are not missing.
Note: If you use offline recognition, you also need to add libc++.dylib.
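The exact library list changes between SDK versions, so follow the list in the official document for the package you downloaded. As a rough guide (an assumption, not the authoritative list), a dictation-only integration typically links iflyMSC.framework itself plus Foundation.framework, UIKit.framework, libz.dylib, CoreTelephony.framework, SystemConfiguration.framework, CFNetwork.framework, AVFoundation.framework, AudioToolbox.framework, and libc++.dylib only if offline recognition is used.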
2. Initialization
The speech service must be initialized before it can be used. Initialization is an asynchronous process, and it is recommended to do it at the program entry point.
The AppID uniquely identifies your app and must be passed in during initialization. You can find it in the demo's Definition.h as APPID_VALUE. Demo and SDK download address: http://xfyun.cn
Initialize in AppDelegate.m's application:didFinishLaunchingWithOptions: method.
First import the header file:
#import "Iflymsc/iflyspeechutility.h"
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Override point for customization after application launch.
    // Sign in to the iFLYTEK voice platform
    NSString *initString = [[NSString alloc] initWithFormat:@"appid=%@", @"5750da0e"];
    [IFlySpeechUtility createUtility:initString];
    return YES;
}
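As mentioned above, the demo keeps the AppID in Definition.h as APPID_VALUE. Doing the same in your own project keeps the ID in one place; a minimal sketch along those lines (the macro name mirrors the demo's convention, and the AppID value is the sample one from this post, which you should replace with your own):

// Definition.h (following the demo's APPID_VALUE convention; use your own AppID)
#define APPID_VALUE @"5750da0e"

// AppDelegate.m
NSString *initString = [[NSString alloc] initWithFormat:@"appid=%@", APPID_VALUE];
[IFlySpeechUtility createUtility:initString];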
3. Implementing voice dictation (converting speech to text):
// Step 1: import the library files
// The interface file for the callback methods of the iFLYTEK speech recognition feature
#import <iflyMSC/IFlyRecognizerViewDelegate.h>
// The voice recognition view of the iFLYTEK speech recognition feature
#import <iflyMSC/IFlyRecognizerView.h>
// The constants defined for the iFLYTEK speech recognition feature
#import <iflyMSC/IFlySpeechConstant.h>
// The error object delivered in the onError: callback
#import <iflyMSC/IFlySpeechError.h>

// Adopt the delegate protocol
@interface FirstViewController () <IFlyRecognizerViewDelegate>

@property (weak, nonatomic) IBOutlet UITextView *wordTextView;

/// Speech recognition object
@property (nonatomic, strong) IFlyRecognizerView *iflyRecognizerView;

/// Mutable string that receives the recognition results
@property (nonatomic, copy) NSMutableString *resultStr;

@end

@implementation FirstViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    /**
     Speech recognition to text
     */
    // Initialize the speech recognition control
    self.iflyRecognizerView = [[IFlyRecognizerView alloc] initWithCenter:self.view.center];
    // Set the delegate
    self.iflyRecognizerView.delegate = self;

    // Set the recognition domain to speech dictation ("iat")
    [self.iflyRecognizerView setParameter:@"iat" forKey:[IFlySpeechConstant IFLY_DOMAIN]];
    // Set the begin-of-speech detection timeout to 6000 ms
    [self.iflyRecognizerView setParameter:@"6000" forKey:[IFlySpeechConstant VAD_BOS]];
    // Set the end-of-speech detection timeout to 700 ms
    [self.iflyRecognizerView setParameter:@"700" forKey:[IFlySpeechConstant VAD_EOS]];
    // Set the sample rate to 8000
    [self.iflyRecognizerView setParameter:@"8000" forKey:[IFlySpeechConstant SAMPLE_RATE]];
    // Include punctuation in the results
    [self.iflyRecognizerView setParameter:@"1" forKey:[IFlySpeechConstant ASR_PTT]];
    // Return plain-text results when recognition completes
    [self.iflyRecognizerView setParameter:@"plain" forKey:[IFlySpeechConstant RESULT_TYPE]];
    // Cache the recorded audio in the Documents folder as temp.asr
    [self.iflyRecognizerView setParameter:@"temp.asr" forKey:[IFlySpeechConstant ASR_AUDIO_PATH]];
    // Set custom parameters
    [self.iflyRecognizerView setParameter:@"custom" forKey:[IFlySpeechConstant PARAMS]];
}

#pragma mark - Speech recognition to text
- (IBAction)recognizeAction:(UIButton *)sender {
    // Start recognizing speech
    [self.iflyRecognizerView start];
}

#pragma mark - Delegate methods
/*!
 * Callback that returns the recognition result
 *
 * @param resultArray recognition results; the first element is an NSDictionary whose keys are the recognized text and whose "sc" values are the confidence of the result
 * @param isLast      whether this is the last result
 */
// Success
- (void)onResult:(NSArray *)resultArray isLast:(BOOL)isLast {
    self.resultStr = [[NSMutableString alloc] init];
    NSDictionary *dic = [resultArray objectAtIndex:0];

    for (NSString *key in dic) {
        [self.resultStr appendFormat:@"%@", key];
    }
    NSLog(@"%@---------", _resultStr);

    self.wordTextView.text = [NSString stringWithFormat:@"%@%@", self.wordTextView.text, self.resultStr];
}

/*!
 * Callback when recognition ends
 *
 * @param error the error returned when recognition finishes
 */
// Failure
- (void)onError:(IFlySpeechError *)error {
    NSLog(@"%@", error);
}

@end
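One point the post does not cover: dictation records from the microphone, so on iOS 10 and later the app must also declare NSMicrophoneUsageDescription in Info.plist, and it is good practice to request record permission before starting recognition. A minimal sketch of such a check (the helper method name is made up here; it simply wraps the start call from recognizeAction: above):

#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: ask for microphone permission, then start dictation.
- (void)startDictationWithPermissionCheck {
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (granted) {
                // Same call as in recognizeAction:
                [self.iflyRecognizerView start];
            } else {
                NSLog(@"Microphone permission denied");
            }
        });
    }];
}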