Go to: http://www.sufeinet.com/thread-2488-1-1.html
How to add the iFLYTEK voice SDK to your iPhone app
1. First, go to the Developer Zone (http://open.voicecloud.cn/developer.php) on the official iFLYTEK website. If you have not registered yet, you need a developer account first: click "Free Registration" at the top of the site and register an account with your own mobile phone number.
2. You can only use the SDK once you have a developer account, because you also need to apply for an AppID. After logging in, go to your personal center, create an application, fill in the relevant information, and then wait for approval; only after the application is approved can you download the voice SDK.
3. Once your application has been approved, click the SDK download link and select the SDK for your platform; here, select the iPhone platform SDK.
4. Once downloaded, you will see a compressed package; extract it and you will find four files.
The sample folder contains the demo project provided by iFLYTEK; open it to see how the SDK is used.
Run it directly: it builds with no errors, only a few warnings.
5. How do you use it in your own project? It is actually very simple, and iFLYTEK also provides more detailed development documentation, which you can download from the developer zone and the download area.
6. Create an iPhone project. Once it is created, you need to import the required library.
Note that iflyMSC.framework is located in the lib folder of the extracted SDK download. In the add-library dialog choose "Add Other...", select the framework you downloaded, and click Open to add it. A minimal header sketch for the view controller follows.
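For reference, a view controller header after the framework has been added might look like the sketch below. The delegate protocol names (IFlyRecognizeControlDelegate, IFlySynthesizerControlDelegate) are assumptions inferred from the setDelegate: calls used later; check the headers inside the SDK you downloaded for the exact names.

    // ViewController.h -- minimal sketch; protocol names are assumptions, verify against the SDK headers
    #import <UIKit/UIKit.h>
    #import "iflyMSC/IFlyRecognizeControl.h"
    #import "iflyMSC/IFlySynthesizerControl.h"

    @interface ViewController : UIViewController <IFlyRecognizeControlDelegate, IFlySynthesizerControlDelegate>
    {
        IFlyRecognizeControl *_iflyRecognizeController;      // recognition control
        IFlySynthesizerControl *_iflySynthesizerControl;     // synthesis control
        UIButton *_recognizeButton;                          // button that starts recognition
        UIButton *_synthesizerButton;                        // button that starts synthesis
    }
    @end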
7. Description of some functions and configurations:
In the header file where you will use the SDK, import the following headers:
- #import "Iflymsc/iflyrecognizecontrol.h"
- #import "Iflymsc/iflysynthesizercontrol.h"
Declare the recognition control and the synthesis control:
    IFlyRecognizeControl *_iflyRecognizeController;      // recognition control
    IFlySynthesizerControl *_iflySynthesizerControl;     // synthesis control
In the implementation file, initialize the controls (a sketch of how the initPara string might be built follows this block):
    // Initialize the speech recognition control
    _iflyRecognizeController = [[IFlyRecognizeControl alloc] initWithOrigin:CGPointMake(x, y) initParam:initPara];   // x, y: choose the control's origin
    [self.view addSubview:_iflyRecognizeController];
    // Configure
    [_iflyRecognizeController setEngine:@"sms" engineParam:nil grammarID:nil];
    [_iflyRecognizeController setSampleRate:16000];
    [_iflyRecognizeController setDelegate:self];
    [_iflyRecognizeController setShowLog:NO];
    // Register for the resign-active event
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(resignActive) name:UIApplicationWillResignActiveNotification object:nil];

    // Initialize the speech synthesis control
    _iflySynthesizerControl = [[IFlySynthesizerControl alloc] initWithOrigin:CGPointMake(x, y) initParam:initPara];   // x, y: choose the control's origin
    // Configure
    [_iflySynthesizerControl setDelegate:self];
    [_iflySynthesizerControl setVoiceName:@"vixm"];   // voice (supports Chinese, English and Cantonese)
    [self.view addSubview:_iflySynthesizerControl];
    // Show the UI
    [_iflySynthesizerControl setShowUI:YES];
    // Show the log
    [_iflySynthesizerControl setShowLog:NO];
    // Register for the resign-active event
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(resignActiveOfSynthesizer) name:UIApplicationWillResignActiveNotification object:nil];
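The initPara string passed to initWithOrigin:initParam: above is not shown in the original post. As a rough sketch, it is a comma-separated parameter string that carries the AppID you were assigned in step 2; the key names used below (appid, timeout) are assumptions, so confirm them against the SDK documentation.

    // Sketch only: build the initialization parameter string with your AppID from step 2.
    // The key names "appid" and "timeout" are assumptions; check the SDK documentation.
    NSString *initPara = [NSString stringWithFormat:@"appid=%@,timeout=%@", @"YOUR_APPID", @"20000"];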
Some callback functions and methods:
    // Recognition end callback; the SDK calls this back on a separate thread
    - (void)onRecognizeEnd:(IFlyRecognizeControl *)iflyRecognizeControl theError:(int)error
    {
        [_recognizeButton setEnabled:YES];
        [_synthesizerButton setEnabled:YES];
        NSLog(@"recognition ended");
        NSLog(@"upload traffic: %d, download traffic: %d", [iflyRecognizeControl getUpflow:NO], [iflyRecognizeControl getDownflow:NO]);
    }

    // Recognition result callback
    - (void)onResult:(IFlyRecognizeControl *)iflyRecognizeControl theResult:(NSArray *)resultArray
    {
        NSString *strResult = [[resultArray objectAtIndex:0] objectForKey:@"name"];
        NSLog(@"recognition result: %@", strResult);
    }

    // Start speech recognition
    - (void)onButtonRecognize
    {
        if ([_iflyRecognizeController start]) {
            [_recognizeButton setEnabled:NO];
            [_synthesizerButton setEnabled:NO];
        }
    }

    // Background mode is not supported, so cancel recognition on the resign-active event
    - (void)resignActive
    {
        [_iflyRecognizeController cancel];
    }

    // Synthesis end callback; after cancel is executed, the whole session ends with this callback
    - (void)onSynthesizerEnd:(IFlySynthesizerControl *)iflySynthesizerControl theError:(int)error
    {
        [_synthesizerButton setEnabled:YES];
        [_recognizeButton setEnabled:YES];
        NSLog(@"synthesis ended");
        NSLog(@"upload traffic: %d, download traffic: %d", [iflySynthesizerControl getUpflow:NO], [iflySynthesizerControl getDownflow:NO]);
    }

    // Player buffering progress callback
    - (void)onSynthesizerBufferProgress:(float)bufferProgress
    {
        NSLog(@"buffer progress: %f", bufferProgress);
    }

    // Player playback progress callback
    - (void)onSynthesizerPlayProgress:(float)playProgress
    {
        NSLog(@"play progress: %f", playProgress);
    }

    // Start speech synthesis
    - (void)onButtonSynthesizer
    {
        [_iflySynthesizerControl setText:@"Haha, this is just an example to test the synthesis function, there is no need to be nervous about it. Thank you." params:nil];
        if ([_iflySynthesizerControl start]) {
            [_recognizeButton setEnabled:NO];
            [_synthesizerButton setEnabled:NO];
        } else {
            NSLog(@"I'm sorry, start error.");
        }
    }

    // Background mode is not supported, so cancel synthesis on the resign-active event
    - (void)resignActiveOfSynthesizer
    {
        NSLog(@"resignActive");
        [_iflySynthesizerControl cancel];
    }
You can trigger these methods wherever you need to start the speech functions. For example, create two buttons and wire them to the methods above:
    _recognizeButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    _synthesizerButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [_recognizeButton setFrame:CGRectMake(70, 100, 180, 60)];
    [_synthesizerButton setFrame:CGRectMake(70, 200, 180, 60)];
    [_recognizeButton setTitle:@"Start speech recognition" forState:UIControlStateNormal];
    [_synthesizerButton setTitle:@"Start speech synthesis" forState:UIControlStateNormal];
    [_recognizeButton addTarget:self action:@selector(onButtonRecognize) forControlEvents:UIControlEventTouchDown];
    [_synthesizerButton addTarget:self action:@selector(onButtonSynthesizer) forControlEvents:UIControlEventTouchDown];
    [self.view addSubview:_recognizeButton];
    [self.view addSubview:_synthesizerButton];

    // Start speech recognition
    - (void)onButtonRecognize
    {
        if ([_iflyRecognizeController start]) {
            [_recognizeButton setEnabled:NO];
            [_synthesizerButton setEnabled:NO];
        }
    }

    // Start speech synthesis
    - (void)onButtonSynthesizer
    {
        [_iflySynthesizerControl setText:@"This is just an example to test the synthesis function, there is no need to be nervous about it" params:nil];
        if ([_iflySynthesizerControl start]) {
            [_recognizeButton setEnabled:NO];
            [_synthesizerButton setEnabled:NO];
        } else {
            NSLog(@"I'm sorry, start error.");
        }
    }
8. You can now use the speech recognition and speech synthesis functions.
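To show how the pieces above fit together, here is a rough sketch of a viewDidLoad that runs the initialization from step 7 and then creates the two buttons. The origin values and the initPara keys are placeholders introduced for this sketch, not values from the original post.

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        // Build the init parameter string with your AppID (key names are assumptions, see step 7)
        NSString *initPara = [NSString stringWithFormat:@"appid=%@,timeout=%@", @"YOUR_APPID", @"20000"];

        // Initialize and configure the recognition control
        _iflyRecognizeController = [[IFlyRecognizeControl alloc] initWithOrigin:CGPointMake(20, 20) initParam:initPara];
        [_iflyRecognizeController setEngine:@"sms" engineParam:nil grammarID:nil];
        [_iflyRecognizeController setSampleRate:16000];
        [_iflyRecognizeController setDelegate:self];
        [self.view addSubview:_iflyRecognizeController];

        // Initialize and configure the synthesis control
        _iflySynthesizerControl = [[IFlySynthesizerControl alloc] initWithOrigin:CGPointMake(20, 20) initParam:initPara];
        [_iflySynthesizerControl setDelegate:self];
        [_iflySynthesizerControl setVoiceName:@"vixm"];
        [self.view addSubview:_iflySynthesizerControl];

        // Create the two buttons that trigger onButtonRecognize / onButtonSynthesizer from step 7
        _recognizeButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
        [_recognizeButton setFrame:CGRectMake(70, 100, 180, 60)];
        [_recognizeButton setTitle:@"Start speech recognition" forState:UIControlStateNormal];
        [_recognizeButton addTarget:self action:@selector(onButtonRecognize) forControlEvents:UIControlEventTouchDown];
        [self.view addSubview:_recognizeButton];

        _synthesizerButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
        [_synthesizerButton setFrame:CGRectMake(70, 200, 180, 60)];
        [_synthesizerButton setTitle:@"Start speech synthesis" forState:UIControlStateNormal];
        [_synthesizerButton addTarget:self action:@selector(onButtonSynthesizer) forControlEvents:UIControlEventTouchDown];
        [self.view addSubview:_synthesizerButton];
    }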