iOS: Voice broadcast based on push messages

Source: Internet
Author: User
Tags: background

Many of the aggregator payment apps on the market (e.g., Shouqianba, Lehui) play a voice prompt after a successful payment. Our company's app needs the same feature. It breaks down into two parts: one is push, the other is voice broadcast. A brief introduction to each follows.

Part 1: Push

The most commonly integrated push service at the moment is JPush (Aurora Push). Integrating it is fairly simple; the main steps are:

1. Register an account on the JPush official website: https://www.jiguang.cn/accounts/register/form

2. Log in, click Create App in the top right corner, fill in the app name, upload the app icon, and click Create.

3. Upload the push certificate and complete APNs verification.

4. Import the JPush SDK: add `pod 'JPush'` to your Podfile and run `pod install`.

5. Import the headers in AppDelegate.m:

```objectivec
// Header file required by the JPush service
#import "JPUSHService.h"
// Header file required to register for APNs on iOS 10+
#import <UserNotifications/UserNotifications.h>
```

With the steps above done, you can integrate push in code.

Code:

```objectivec
// Integrate push: register in didFinishLaunchingWithOptions
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Configure the notification types (in older SDK versions these are UNAuthorizationOption values)
    JPUSHRegisterEntity *entity = [[JPUSHRegisterEntity alloc] init];
    entity.types = JPAuthorizationOptionAlert | JPAuthorizationOptionBadge | JPAuthorizationOptionSound;
    [JPUSHService registerForRemoteNotificationConfig:entity delegate:self];
    [JPUSHService setupWithOption:launchOptions
                           appKey:@"your AppKey"
                          channel:nil
                 apsForProduction:YES];
    // Get the registrationID
    [JPUSHService registrationIDCompletionHandler:^(int resCode, NSString *registrationID) {
    }];
    return YES;
}

#pragma mark - Push delegate methods

// Called before iOS 10 -- receives messages in both foreground and background
- (void)application:(UIApplication *)application didReceiveRemoteNotification:(NSDictionary *)userInfo fetchCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {
    // Handle the message
    completionHandler(UIBackgroundFetchResultNewData);
}

// iOS 10+ -- notification arriving while the app is in the foreground
- (void)jpushNotificationCenter:(UNUserNotificationCenter *)center willPresentNotification:(UNNotification *)notification withCompletionHandler:(void (^)(NSInteger))completionHandler {
    // Handle the message
    completionHandler(UNNotificationPresentationOptionAlert);
}

// iOS 10+ -- user taps a notification delivered while the app was in the background
- (void)jpushNotificationCenter:(UNUserNotificationCenter *)center didReceiveNotificationResponse:(UNNotificationResponse *)response withCompletionHandler:(void (^)(void))completionHandler {
    // Handle the message
    completionHandler();
}

// Get the device token and hand it to JPush
- (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken {
    [JPUSHService registerDeviceToken:deviceToken];
}
```

Note: once push works, a voice broadcast must be triggered when the push message is received. For that, the backend (server) needs to add a field to the push payload:

Field: `"content-available": 1`

Only with this field set will the app be woken to receive the push content while it is running in the background; this is the APNs marker for background delivery.
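As a sketch of the server side, assuming the backend pushes through JPush's Push API v3, the request body might look like the following. The alert text, extras, and amount are illustrative assumptions; note that the v3 API expresses the `content-available` field as a boolean.

```json
{
  "platform": "ios",
  "audience": "all",
  "notification": {
    "ios": {
      "alert": "Payment received: 25.00 CNY",
      "sound": "default",
      "content-available": true,
      "extras": { "amount": "25.00" }
    }
  }
}
```

The `extras` dictionary arrives in `userInfo` on the client, which is where the app would read the amount to speak.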

Part 2: Voice broadcast

With push in place, the next part is the voice broadcast. There is a pitfall here: when the app is running in the background or the screen is locked, the broadcast method is still invoked, but no audio plays. To fix this, background audio must be enabled. There are two ways: add the audio background mode to Info.plist directly, or (more convenient, and recommended) tick Audio, AirPlay, and Picture in Picture under Background Modes in the target's Capabilities tab in Xcode.
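If you edit Info.plist as source instead of using the Capabilities checkbox, the required entry is the `UIBackgroundModes` key with the `audio` value:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```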

Once that is set up, you can write the code. There are two approaches: 1. integrate a third-party SDK (e.g., iFlytek speech synthesis); 2. use the iOS system API.

For the system approach, first link the AVFoundation framework to the target, then import the speech-synthesis header in the class that does the speaking:

```objectivec
#import <AVFoundation/AVSpeechSynthesis.h>
```

1. iFlytek:

For configuration, follow the integration guide on the iFlytek website:

http://doc.xfyun.cn/msc_ios/302721

Main code:

```objectivec
// Initialize the iFlytek SDK with your appid
NSString *initString = [[NSString alloc] initWithFormat:@"appid=%@", @"59671829"];
[IFlySpeechUtility createUtility:initString];

// Get the speech-synthesizer singleton
_iFlySpeechSynthesizer = [IFlySpeechSynthesizer sharedInstance];
// Set the delegate
_iFlySpeechSynthesizer.delegate = self;

// Set synthesis parameters
// Use the online (cloud) engine
[_iFlySpeechSynthesizer setParameter:[IFlySpeechConstant TYPE_CLOUD]
                              forKey:[IFlySpeechConstant ENGINE_TYPE]];
// Volume, range 0~100
[_iFlySpeechSynthesizer setParameter:@"50" forKey:[IFlySpeechConstant VOLUME]];
// Voice; the default is "xiaoyan", see the synthesis-voice list for other values
[_iFlySpeechSynthesizer setParameter:@"xiaoyan" forKey:[IFlySpeechConstant VOICE_NAME]];
// File name for the synthesized audio; set to nil to cancel saving. Default directory is Library/cache
[_iFlySpeechSynthesizer setParameter:@"tts.pcm" forKey:[IFlySpeechConstant TTS_AUDIO_PATH]];
// Turn off logging
[IFlySetting showLogcat:NO];

// Start speaking
[_iFlySpeechSynthesizer startSpeaking:@"Payment received"];
```

2. System API:

In this case the synthesizer object is best declared as an instance variable, which makes it easy to control; of course, if you do not need pause, resume, and similar operations, a local variable works too.

```objectivec
{
    AVSpeechSynthesizer *av;
}
```

Initialization:

```objectivec
// Initialize the synthesizer
av = [[AVSpeechSynthesizer alloc] init];
av.delegate = self; // set the delegate

// Voice: Mandarin Chinese
AVSpeechSynthesisVoice *voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"zh-CN"];

// The text that needs to be spoken
AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:@"words that need to be broadcast"];
utterance.rate = 0.6; // speaking rate, range 0-1; 0 is slowest, 1 is fastest
utterance.voice = voice;

[av speakUtterance:utterance]; // start speaking
```

Finally, to make the broadcast play while the app is in the background or the screen is locked, implement the following method in the AppDelegate.m file:

```objectivec
// Implement backgroundPlayerID:
+ (UIBackgroundTaskIdentifier)backgroundPlayerID:(UIBackgroundTaskIdentifier)backTaskId {
    // Set and activate the audio session category
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:nil];
    [session setActive:YES error:nil];
    // Allow the app to receive remote-control events
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    // Begin a new background task
    UIBackgroundTaskIdentifier newTaskId = UIBackgroundTaskInvalid;
    newTaskId = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
    // End the previous background task, if any
    if (newTaskId != UIBackgroundTaskInvalid && backTaskId != UIBackgroundTaskInvalid) {
        [[UIApplication sharedApplication] endBackgroundTask:backTaskId];
    }
    return newTaskId;
}

// Call it when entering the background, storing the returned identifier, e.g.:
// - (void)applicationDidEnterBackground:(UIApplication *)application {
//     self.bgTaskId = [AppDelegate backgroundPlayerID:self.bgTaskId];
// }
```

At this point, voice broadcast driven by push messages works. But Apple review is another pitfall (who made Apple the boss!), so we also declared the corresponding permission entries in Info.plist.

To pass Apple review, you need to record a demo video for the reviewer. The recording is best uploaded to YouTube (which is what I used; accessing it from mainland China requires getting over the firewall). It can also be uploaded to Youku, though my Youku submission did not pass. With the video in place, review can succeed.

If you have a better approach, please share it; if you have questions, please contact me. QQ: 1475074574

