Recording:
// Audio session
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *sessionError;
/*
 AVAudioSessionCategoryPlayAndRecord: recording and playback.
 AVAudioSessionCategoryAmbient: for non-voice-focused apps; muted by the mute switch and when the screen locks.
 AVAudioSessionCategorySoloAmbient: like Ambient, except it silences audio from other apps.
 AVAudioSessionCategoryPlayback: for voice-centric apps; not muted by the mute switch or screen lock, and can keep playing in the background.
 AVAudioSessionCategoryRecord: for apps that only need recording; other system sounds (ringtones, alarms, calendar alerts) are not played.
 */
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
[session setActive:YES error:nil];

// Recording parameters
NSDictionary *setting = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,        // encoding format
    [NSNumber numberWithFloat:8000], AVSampleRateKey,                     // sample rate
    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,                    // channel count
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,                  // bit depth (PCM only)
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,            // non-interleaved samples (PCM only)
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,                  // floating-point samples (PCM only)
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,              // big-endian storage (PCM only)
    [NSNumber numberWithInt:AVAudioQualityMax], AVEncoderAudioQualityKey, // quality
    nil];

// Save path
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:filePath]
                                                 settings:setting
                                                    error:nil];
self.audioRecorder.delegate = self;
// Turn on audio level metering
self.audioRecorder.meteringEnabled = YES;

// Prepare / start recording
[self.audioRecorder prepareToRecord];
[self.audioRecorder record];
// Pause recording
[self.audioRecorder pause];
// Stop recording
[self.audioRecorder stop];

// AVAudioRecorderDelegate
/* Called when a recording has finished or been stopped. Not called if the
   recorder stops because of an interruption. */
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag;
/* Called if an error occurs while encoding. */
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder error:(NSError * __nullable)error;
/* Called when the audio session is interrupted while recording; the recorded file is closed. */
- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)recorder NS_DEPRECATED_IOS(2_2, 8_0);
/* Called when the interruption ends, if the recorder was interrupted while recording. */
- (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder withOptions:(NSUInteger)flags NS_DEPRECATED_IOS(6_0, 8_0);
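Before activating a record-capable session it is worth requesting microphone access explicitly; a minimal sketch using AVAudioSession's requestRecordPermission: (iOS 7+) — the handler body and log strings are illustrative, not part of the original example:

```objc
#import <AVFoundation/AVFoundation.h>

// Ask for microphone access before starting the recorder.
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    if (granted) {
        // Safe to call [self.audioRecorder record] from here.
        NSLog(@"microphone permission granted");
    } else {
        // Prompt the user to enable microphone access in Settings.
        NSLog(@"microphone permission denied");
    }
}];
```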
Playback:
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *sessionError;
[session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[session setActive:YES error:nil];

// Enable proximity monitoring (play through the receiver when held to the ear,
// through the speaker when moved away)
[[UIDevice currentDevice] setProximityMonitoringEnabled:YES];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(sensorStateChange:)
                                             name:UIDeviceProximityStateDidChangeNotification
                                           object:nil];

self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:filePath] error:nil];
self.audioPlayer.delegate = self;

// Prepare / play
[self.audioPlayer prepareToPlay];
[self.audioPlayer play];
// Stop playback
[self.audioPlayer stop];
// Pause playback
[self.audioPlayer pause];

// In the sensorStateChange:(NSNotification *)notification method
if ([[UIDevice currentDevice] proximityState] == YES) {
    // Close to the ear
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
} else {
    // Away from the ear
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
}

// AVAudioPlayerDelegate
/* Called when a sound has finished playing. Not called if the player is stopped
   because of an interruption. */
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag;
/* Called if an error occurs while decoding. */
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error;
/* Called when the audio session is interrupted while the player is playing; the player is paused. */
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 8_0);
/* Called when the interruption ends, if the player was interrupted while playing. */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags NS_DEPRECATED_IOS(6_0, 8_0);
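Since the begin/end-interruption delegate callbacks are marked deprecated as of iOS 8, a hedged sketch of the replacement — observing AVAudioSessionInterruptionNotification (the handleInterruption: selector name is an assumption, not from the original):

```objc
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleInterruption:) // hypothetical selector
                                             name:AVAudioSessionInterruptionNotification
                                           object:nil];

- (void)handleInterruption:(NSNotification *)notification {
    NSUInteger type = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // The system has paused playback/recording.
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        // Optionally resume, e.g. [self.audioPlayer play];
    }
}
```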
Clip: truncate the audio file at path filePath from time1 to time2 and write the result to resultPath.
AVURLAsset is a subclass of AVAsset. AVAsset models a piece of multimedia and gives access to its content (images, sound, and so on); AVURLAsset initializes an AVAsset from an NSURL.

AVURLAsset *videoAsset = [AVURLAsset assetWithURL:[NSURL fileURLWithPath:filePath]];

// Audio export session
// AVAssetExportPresetAppleM4A: "This export option will produce an audio-only
// .m4a file with appropriate iTunes gapless playback data" (audio output in .m4a format)
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:videoAsset
                                                                        presetName:AVAssetExportPresetAppleM4A];
// Output path / file type / time range to keep
exportSession.outputURL = [NSURL fileURLWithPath:resultPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.timeRange = CMTimeRangeFromTimeToTime(CMTimeMake(time1, 1), CMTimeMake(time2, 1));
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // inspect exportSession.status here
}];
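The completion handler only hints at exportSession.status; one plausible way to inspect it (the log strings are illustrative, not from the original):

```objc
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"export finished: %@", resultPath);
            break;
        case AVAssetExportSessionStatusFailed:
            // exportSession.error describes what went wrong
            NSLog(@"export failed: %@", exportSession.error);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"export cancelled");
            break;
        default:
            break;
    }
}];
```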
Compositing: combine the audio at path filePath1 with the audio at path filePath2.
// The AVURLAsset subclass initializes an AVAsset from an NSURL.
AVURLAsset *videoAsset1 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:filePath1] options:nil];
AVURLAsset *videoAsset2 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:filePath2] options:nil];

// Audio tracks (a video generally has at least two tracks, one for sound and
// one for the picture; an audio file has one)
AVAssetTrack *assetTrack1 = [[videoAsset1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVAssetTrack *assetTrack2 = [[videoAsset2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

// AVMutableComposition is used to composite video or audio
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

// Append the second recording after the first
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset1.duration)
                          ofTrack:assetTrack1
                           atTime:kCMTimeZero
                            error:nil];
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset2.duration)
                          ofTrack:assetTrack2
                           atTime:videoAsset1.duration
                            error:nil];

// Output
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition
                                                                        presetName:AVAssetExportPresetAppleM4A];
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.outputURL = [NSURL fileURLWithPath:resultPath];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // inspect exportSession.status here
}];
Compression and transcoding: download LAME (LAME Ain't an MP3 Encoder).
Double-click to unzip the download and place it in a folder; the folder must be named lame, or the .h and .a files cannot be generated. In Terminal, enter the folder, then compile and build the static library with the following commands:
$ cd /Users/mac/Desktop/lame
# create build_lame.sh
$ touch build_lame.sh
# open build_lame.sh and paste in the build script
$ open build_lame.sh
# compile and run the script to generate the static library (your password is required)
$ sudo sh build_lame.sh
Copy the include and lib folders from the fat-lame folder into the project, then write an Objective-C class that calls lame.h:
@try {
    int read, write;

    FILE *pcm = fopen([filePath cStringUsingEncoding:1], "rb");   // source audio file
    fseek(pcm, 4 * 1024, SEEK_CUR);                               // skip the file header
    FILE *mp3 = fopen([resultPath cStringUsingEncoding:1], "wb"); // destination MP3 file

    const int PCM_SIZE = 8192;
    const int MP3_SIZE = 8192;
    short int pcm_buffer[PCM_SIZE * 2];
    unsigned char mp3_buffer[MP3_SIZE];

    // Initialize the LAME encoder
    lame_t lame = lame_init();
    // Set the MP3 sample rate / channel count / bitrate
    lame_set_in_samplerate(lame, 8000);
    lame_set_num_channels(lame, 2);
    lame_set_out_samplerate(lame, 8000);
    lame_set_brate(lame, 8);
    // MP3 quality, 0~9: 0 is best (very slow), 9 is worst
    lame_set_quality(lame, 7);
    // Set the MP3 encoding mode
    lame_set_VBR(lame, vbr_default);
    lame_init_params(lame);

    do {
        size_t size = (size_t)(2 * sizeof(short int));
        read = fread(pcm_buffer, size, PCM_SIZE, pcm);
        if (read == 0) {
            write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
        } else {
            write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);
        }
        fwrite(mp3_buffer, write, 1, mp3);
    } while (read != 0);

    lame_close(lame);
    fclose(mp3);
    fclose(pcm);
} @catch (NSException *exception) {
    NSLog(@"%@", [exception description]);
} @finally {
    // transcoding finished
    return resultPath;
}
The savings are substantial: interleaved 16-bit stereo PCM at 8 kHz is 32 KB/s, while the 8 kbps MP3 stream is about 1 KB/s, so an audio file of roughly 100 KB typically compresses to under 10 KB.
iOS Audio Development (recording + playback + clipping + compositing + compression/transcoding)