I recently finished a project, and I'd like to share its core functionality with you; mostly this is me organizing it for myself.
For transcoding and storing video I've sorted out two methods, and both operate on videos taken from the photo album.
1. This method does not compress the video. It simply copies the video intact from the album into a sandbox path, so that you can get the video's NSData for uploading.
Here I pass in a URL. This URL is a bit special: it is the asset URL from the photo album, which is why I said this only handles album videos.
Convert the URL of the original video to NSData and write it to the sandbox:

+ (void)videoWithURL:(NSString *)url withFileName:(NSString *)fileName {
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        if (url) {
            [assetLibrary assetForURL:[NSURL URLWithString:url] resultBlock:^(ALAsset *asset) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                NSString *pathDocuments = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
                NSString *imagePath = [NSString stringWithFormat:@"%@/image", pathDocuments];
                NSString *dbFilePath = [imagePath stringByAppendingPathComponent:fileName];
                const char *cVideoPath = [dbFilePath UTF8String];
                FILE *file = fopen(cVideoPath, "a+");
                if (file) {
                    const int bufferSize = 1024 * 1024; // a 1MB buffer
                    Byte *buffer = (Byte *)malloc(bufferSize);
                    NSUInteger read = 0, offset = 0, written = 0;
                    NSError *err = nil;
                    if (rep.size != 0) {
                        do {
                            read = [rep getBytes:buffer fromOffset:offset length:bufferSize error:&err];
                            written = fwrite(buffer, sizeof(char), read, file);
                            offset += read;
                        } while (read != 0 && !err); // not at the end and no error, keep reading
                    }
                    // free the buffer and close the file
                    free(buffer);
                    buffer = NULL;
                    fclose(file);
                    file = NULL;
                }
            } failureBlock:nil];
        }
    });
}
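For reference, one place this album URL can come from (my assumption about how the video gets picked, not part of the method above) is the UIImagePickerController delegate. The class name RZVideoTool below is just a hypothetical home for the method:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // assets-library:// URL of the chosen album video
    NSURL *referenceURL = info[UIImagePickerControllerReferenceURL];
    if (referenceURL) {
        // RZVideoTool is a hypothetical class name standing in for wherever the method above lives
        [RZVideoTool videoWithURL:referenceURL.absoluteString withFileName:@"myVideo.mp4"];
    }
    [picker dismissViewControllerAnimated:YES completion:nil];
}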
2. I recommend this method. It compresses the video, and the degree of compression can be adjusted.
Here I pass in a model object that carries my URL; when compression finishes, the model brings the NSData back out, and you can use that data however your project needs.
+ (void)convertVideoWithModel:(RZProjectFileModel *)model {
    model.fileName = [NSString stringWithFormat:@"%ld.mp4", RandomNum]; // RandomNum: a random-number macro defined elsewhere in the project
    // Save to a sandbox path
    NSString *pathDocuments = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *videoPath = [NSString stringWithFormat:@"%@/image", pathDocuments];
    model.sandBoxFilePath = [videoPath stringByAppendingPathComponent:model.fileName];
    // Transcoding configuration
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:model.assetFilePath options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputURL = [NSURL fileURLWithPath:model.sandBoxFilePath];
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        int exportStatus = exportSession.status;
        RZLog(@"%d", exportStatus);
        switch (exportStatus) {
            case AVAssetExportSessionStatusFailed: {
                // log the error
                NSError *exportError = exportSession.error;
                NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                RZLog(@"video transcoding succeeded");
                NSData *data = [NSData dataWithContentsOfFile:model.sandBoxFilePath];
                model.fileData = data;
                break;
            }
            default:
                break;
        }
    }];
}
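A minimal call-site sketch, assuming RZProjectFileModel exposes the properties used above, that videoFileURL is a file NSURL to the picked video (for example UIImagePickerControllerMediaURL), and that RZVideoTool is again a hypothetical class name:

RZProjectFileModel *model = [[RZProjectFileModel alloc] init];
model.assetFilePath = videoFileURL;          // AVURLAsset wants a file URL here
[RZVideoTool convertVideoWithModel:model];   // model.fileData is filled in asynchronously,
                                             // inside the export completion handler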
You can change the compression level here; Apple already provides the official presets, so adjust to your needs.
AVAssetExportSession *exportSession= [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
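If medium quality is not what you want, other presets exist. A quick sketch; the constants below are Apple's AVFoundation export presets, and asset is assumed to be the AVURLAsset from the code above:

// A few of the built-in presets you can swap in (not an exhaustive list):
//   AVAssetExportPresetLowQuality      - smallest file, lowest quality
//   AVAssetExportPresetMediumQuality   - what the code above uses
//   AVAssetExportPresetHighestQuality  - largest file, best quality
//   AVAssetExportPreset640x480 / AVAssetExportPreset1280x720 - fixed output sizes
// You can also ask which presets actually apply to a given asset:
NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
NSLog(@"compatible presets: %@", presets);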
You can change the output file type here; MP4 is a fine choice under normal circumstances.
exportSession.outputFileType = AVFileTypeMPEG4;
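If you ever need a QuickTime .mov container instead (just an example of another constant; MP4 stays the default here), the same property accepts other AVFileType values, and the session can tell you what it supports:

exportSession.outputFileType = AVFileTypeQuickTimeMovie;   // .mov instead of .mp4
NSLog(@"%@", exportSession.supportedFileTypes);            // what this session can actually write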
Note: images are compressed with this; image is the UIImage, and 0.4 is the JPEG compression quality, which is adjustable.
model.fileData = UIImageJPEGRepresentation(image, 0.4);
So now you happily have the NSData after transcoding; let's try playing it:
MPMoviePlayerViewController *playerView = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:sandBoxFilePath]];
[superVC presentViewController:playerView animated:YES completion:nil];
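Note that MPMoviePlayerViewController was deprecated in iOS 9; if you are on a newer SDK, a roughly equivalent sketch with AVKit (my substitution, not part of the original demo) would be:

#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>

AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
playerVC.player = [AVPlayer playerWithURL:[NSURL fileURLWithPath:sandBoxFilePath]];
[superVC presentViewController:playerVC animated:YES completion:^{
    [playerVC.player play]; // start playback once presented
}];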
Remark
You can see that I use sandbox storage here; in the next article I'll organize application sandbox management with code.
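Both methods above write into an "image" folder under Documents; as far as I can tell that folder has to exist before fopen or AVAssetExportSession can write into it, so here is a small sketch of creating it with NSFileManager (my addition, not in the original code):

NSString *pathDocuments = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *imagePath = [NSString stringWithFormat:@"%@/image", pathDocuments];
if (![[NSFileManager defaultManager] fileExistsAtPath:imagePath]) {
    NSError *dirError = nil;
    [[NSFileManager defaultManager] createDirectoryAtPath:imagePath
                              withIntermediateDirectories:YES
                                               attributes:nil
                                                    error:&dirError];
    if (dirError) {
        NSLog(@"failed to create directory: %@", dirError);
    }
}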
Update
Recently a lot of people have contacted me asking for a demo. I've put one together and it is now on GitHub; corrections from the experts are welcome. https://github.com/Snoopy008/SelectVideoAndConvert