A recent project required adding a WeChat-style short-video feature to the chat module. This post summarizes the problems I ran into and their solutions; I hope it helps anyone with similar needs.
Effect preview:
Here is a list of the major problems you will encounter:
1. Video cropping: a WeChat-style video uses only part of the camera frame
2. Preview stuttering: playing videos with AVPlayer while scrolling causes severe lag
Let's tackle them one by one.
Part 1: Implementing video recording
1. The recording class WKMovieRecorder
Create a recording class, WKMovieRecorder, that is responsible for video recording.
@interface WKMovieRecorder : NSObject

+ (WKMovieRecorder *)sharedRecorder;
- (instancetype)initWithMaxDuration:(NSTimeInterval)duration;

@end
Define the callback blocks:
/**
 *  Recording finished
 *
 *  @param info         callback info
 *  @param finishReason how recording ended (cancelled or finished normally)
 */
typedef void(^FinishRecordingBlock)(NSDictionary *info, WKRecorderFinishedReason finishReason);

/**
 *  Focus area changed
 */
typedef void(^FocusAreaDidChanged)();

/**
 *  Authorization check
 *
 *  @param success whether access was granted
 */
typedef void(^AuthorizationResult)(BOOL success);
@interface WKMovieRecorder : NSObject

// Callbacks
@property (nonatomic, copy) FinishRecordingBlock finishBlock;             // recording finished
@property (nonatomic, copy) FocusAreaDidChanged focusAreaDidChangedBlock;
@property (nonatomic, copy) AuthorizationResult authorizationResultBlock;

@end
Define a cropSize property for video cropping:
@property (nonatomic, assign) CGSize cropSize;
Next comes the capture implementation. The code here is fairly long; if you would rather skip it, jump ahead to the video-cropping section.
Recording configuration:
@interface WKMovieRecorder () <AVCaptureVideoDataOutputSampleBufferDelegate,
                               AVCaptureAudioDataOutputSampleBufferDelegate,
                               WKMovieWriterDelegate>
{
    AVCaptureSession           *_session;
    AVCaptureVideoPreviewLayer *_preview;
    WKMovieWriter              *_writer;
    // Recording state
    BOOL                        _isCapturing;
    BOOL                        _isPaused;
    BOOL                        _discont;
    int                         _currentFile;
    CMTime                      _timeOffset;
    CMTime                      _lastVideo;
    CMTime                      _lastAudio;
    NSTimeInterval              _maxDuration;
}

// Session management
@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) dispatch_queue_t videoDataOutputQueue;
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *captureDevice;
@property (nonatomic, strong) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, strong) AVCaptureConnection *videoConnection;
@property (nonatomic, strong) AVCaptureConnection *audioConnection;
@property (nonatomic, strong) NSDictionary *videoCompressionSettings;
@property (nonatomic, strong) NSDictionary *audioCompressionSettings;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *adaptor;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoDataOutput;

// Utilities
@property (nonatomic, strong) NSMutableArray *frames; // recorded frames
@property (nonatomic, assign) CaptureAVSetupResult result;
@property (atomic, readwrite) BOOL isCapturing;
@property (atomic, readwrite) BOOL isPaused;
@property (nonatomic, strong) NSTimer *durationTimer;
@property (nonatomic, assign) WKRecorderFinishedReason finishReason;

@end
Instantiation method:
+ (WKMovieRecorder *)sharedRecorder
{
    static WKMovieRecorder *recorder;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        recorder = [[WKMovieRecorder alloc] initWithMaxDuration:CGFLOAT_MAX];
    });
    return recorder;
}

- (instancetype)initWithMaxDuration:(NSTimeInterval)duration
{
    if (self = [self init]) {
        _maxDuration = duration;
        _duration = 0.f;
    }
    return self;
}

- (instancetype)init
{
    self = [super init];
    if (self) {
        _maxDuration = CGFLOAT_MAX;
        _duration = 0.f;
        _sessionQueue = dispatch_queue_create("Wukong.movieRecorder.queue", DISPATCH_QUEUE_SERIAL);
        _videoDataOutputQueue = dispatch_queue_create("Wukong.movieRecorder.video", DISPATCH_QUEUE_SERIAL);
        dispatch_set_target_queue(_videoDataOutputQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
    }
    return self;
}
2. Initialization
Initialization consists of session creation, permission checking, and session configuration.
1). Session creation
self.session = [[AVCaptureSession alloc] init];
self.result = CaptureAVSetupResultSuccess;
2). Permission Check
// Permission check
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusNotDetermined: {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                self.result = CaptureAVSetupResultSuccess;
            }
        }];
        break;
    }
    case AVAuthorizationStatusAuthorized: {
        break;
    }
    default: {
        self.result = CaptureAVSetupResultCameraNotAuthorized;
        break;
    }
}

if (self.result != CaptureAVSetupResultSuccess) {
    if (self.authorizationResultBlock) {
        self.authorizationResultBlock(NO);
    }
    return;
}
3). Session Configuration
Note that the AVCaptureSession must not be configured on the main thread; create your own serial queue and do the configuration there.
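As a sketch of what that looks like (reusing the `sessionQueue` created in `-init` above), the configuration work is dispatched onto the serial queue rather than run on the main thread:

```objc
// Sketch only: run all session configuration on the serial sessionQueue.
// Assumes the _sessionQueue created in -init and the result flag set by
// the permission check above.
dispatch_async(self.sessionQueue, ^{
    if (self.result != CaptureAVSetupResultSuccess) {
        return; // camera access was denied; nothing to configure
    }
    [self.session beginConfiguration];
    // ... add inputs/outputs, set the preset, configure connections ...
    [self.session commitConfiguration];
});
```

Because every mutation of the session goes through the same serial queue, later start/stop calls dispatched to it stay correctly ordered.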
3.1.1 Get input device and input stream
AVCaptureDevice *captureDevice = [[self class] deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
_captureDevice = captureDevice;

NSError *error = nil;
_videoDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
if (!_videoDeviceInput) {
    NSLog(@"Device not found");
}
3.1.2 Frame rate settings
The frame rate setting mainly exists to accommodate the iPhone 4, a device that really should be retired by now.
int frameRate;
if ([NSProcessInfo processInfo].processorCount == 1) {
    if ([self.session canSetSessionPreset:AVCaptureSessionPresetLow]) {
        [self.session setSessionPreset:AVCaptureSessionPresetLow];
    }
    frameRate = 10;
} else {
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        [self.session setSessionPreset:AVCaptureSessionPreset640x480];
    }
    frameRate = 30;
}

CMTime frameDuration = CMTimeMake(1, frameRate);

if ([_captureDevice lockForConfiguration:&error]) {
    _captureDevice.activeVideoMaxFrameDuration = frameDuration;
    _captureDevice.activeVideoMinFrameDuration = frameDuration;
    [_captureDevice unlockForConfiguration];
} else {
    NSLog(@"videoDevice lockForConfiguration returned error %@", error);
}
3.1.3 Video output settings
The key point here is to set the orientation on the videoConnection, so the picture stays correct when the device rotates.
// Video
if ([self.session canAddInput:_videoDeviceInput]) {
    [self.session addInput:_videoDeviceInput];
    self.videoDeviceInput = _videoDeviceInput;
    [self.session removeOutput:_videoDataOutput];

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoDataOutput = videoOutput;
    videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [videoOutput setSampleBufferDelegate:self queue:_videoDataOutputQueue];
    videoOutput.alwaysDiscardsLateVideoFrames = NO;

    if ([_session canAddOutput:videoOutput]) {
        [_session addOutput:videoOutput];

        [_captureDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:FocusAreaChangedContext];

        _videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
        if (_videoConnection.isVideoStabilizationSupported) {
            _videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }

        UIInterfaceOrientation statusBarOrientation = [UIApplication sharedApplication].statusBarOrientation;
        AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
        if (statusBarOrientation != UIInterfaceOrientationUnknown) {
            initialVideoOrientation = (AVCaptureVideoOrientation)statusBarOrientation;
        }
        _videoConnection.videoOrientation = initialVideoOrientation;
    }
} else {
    NSLog(@"Cannot add video input to session");
}
3.1.4 Audio settings
To avoid dropping frames, put the audio output's callback on its own serial queue.
// Audio
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (!audioDeviceInput) {
    NSLog(@"Could not create audio device input: %@", error);
}

if ([self.session canAddInput:audioDeviceInput]) {
    [self.session addInput:audioDeviceInput];
} else {
    NSLog(@"Could not add audio device input to the session");
}

AVCaptureAudioDataOutput *audioOut = [[AVCaptureAudioDataOutput alloc] init];
// Put audio on its own queue to ensure that video processing doesn't cause us to drop audio
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Wukong.movieRecorder.audio", DISPATCH_QUEUE_SERIAL);
[audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];

if ([self.session canAddOutput:audioOut]) {
    [self.session addOutput:audioOut];
}
_audioConnection = [audioOut connectionWithMediaType:AVMediaTypeAudio];
One more thing to note: the session configuration code should be wrapped like this:
[self.session beginConfiguration];
// ... configuration code ...
[self.session commitConfiguration];
For reasons of space, I will focus on the recording code below.
3.2 Video storage
Next, in the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate callbacks, we write the audio and video to the sandbox. One thing to be aware of: the first frame received after the session starts is black and needs to be discarded.
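A minimal sketch of how that black first frame might be skipped in the video delegate callback (the `_firstFrameDiscarded` flag is my own hypothetical addition, not from the original demo):

```objc
// Sketch only: drop the first (black) frame after the session starts.
// _firstFrameDiscarded is a hypothetical BOOL ivar, reset to NO whenever
// capture starts.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (connection == _videoConnection && !_firstFrameDiscarded) {
        _firstFrameDiscarded = YES;
        return; // discard the black frame, write nothing
    }
    // ... hand the buffer to the writer as usual ...
}
```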
3.2.1 Create a WKMovieWriter class to encapsulate video storage
WKMovieWriter's main job is to take each CMSampleBufferRef via AVAssetWriter, crop it, and write it to the sandbox.
Here is the crop configuration code. AVAssetWriter crops the video according to cropSize. One pitfall to note: the cropSize width must be an integer multiple of 320; otherwise the cropped video shows a green line along its right edge.
NSDictionary *videoSettings;
if (_cropSize.height == 0 || _cropSize.width == 0) {
    _cropSize = [UIScreen mainScreen].bounds.size;
}

videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                 AVVideoCodecH264, AVVideoCodecKey,
                 [NSNumber numberWithInt:_cropSize.width], AVVideoWidthKey,
                 [NSNumber numberWithInt:_cropSize.height], AVVideoHeightKey,
                 AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                 nil];
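If the width constraint described above holds, the crop width can be snapped to a multiple of 320 before building the settings. This rounding helper is my own addition, not part of the original demo:

```objc
#import <math.h>

// Hypothetical helper: round a requested crop width down to a multiple of
// 320 to avoid the green-line artifact described above. Not part of the
// original demo code.
static CGFloat WKSnapCropWidth(CGFloat width) {
    CGFloat snapped = floor(width / 320.0) * 320.0;
    return snapped > 0 ? snapped : 320.0; // never return a zero width
}

// Possible usage before building videoSettings:
// _cropSize.width = WKSnapCropWidth(_cropSize.width);
```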
At this point, the video recording is complete.
Next, we need to solve the preview problem.
Part 2: Solving the preview stutter
1.1 GIF image generation
While researching, I found a blog post claiming that the WeChat team solved the preview stutter by playing an animated GIF instead of the video. That post's sample code had a problem, though: playing the frames through Core Animation made memory soar and crash the app. Still, it gave me an idea. A previous project's launch screen had played a GIF, so I figured I could convert the video into images and then into a GIF for playback, which would sidestep the stutter. So I started Googling for ways to build a GIF from an array of images.
GIF conversion code:
static void makeAnimatedGif(NSArray *images, NSURL *gifURL, NSTimeInterval duration) {
    NSTimeInterval perSecond = duration / images.count;

    NSDictionary *fileProperties = @{
        (__bridge id)kCGImagePropertyGIFDictionary : @{
            (__bridge id)kCGImagePropertyGIFLoopCount : @0, // 0 means loop forever
        }
    };
    NSDictionary *frameProperties = @{
        (__bridge id)kCGImagePropertyGIFDictionary : @{
            // a float (not double!) in seconds, rounded to centiseconds in the GIF data
            (__bridge id)kCGImagePropertyGIFDelayTime : @(perSecond),
        }
    };

    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)gifURL, kUTTypeGIF, images.count, NULL);
    CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)fileProperties);

    for (UIImage *image in images) {
        @autoreleasepool {
            CGImageDestinationAddImage(destination, image.CGImage, (__bridge CFDictionaryRef)frameProperties);
        }
    }

    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"Failed to finalize image destination");
    }
    CFRelease(destination);
}
The conversion worked, but a new problem appeared: generating a GIF with ImageIO makes memory spike, instantly climbing to around 100 MB, and generating several GIFs at the same time crashes the app. To solve this, GIF generation needs to go through a serial queue.
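A minimal sketch of serializing the GIF work, assuming the makeAnimatedGif function above (the queue name here is my own invention, following the naming used elsewhere in the recorder):

```objc
// Sketch only: funnel all GIF generation through one serial queue so that
// only one ImageIO encode is in flight at a time, capping peak memory.
static dispatch_queue_t WKGifQueue(void) {
    static dispatch_queue_t queue;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        queue = dispatch_queue_create("Wukong.movieRecorder.gif", DISPATCH_QUEUE_SERIAL);
    });
    return queue;
}

// Usage:
dispatch_async(WKGifQueue(), ^{
    @autoreleasepool {
        makeAnimatedGif(images, gifURL, duration);
    }
});
```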
1.2 Converting the video into UIImages
The conversion is done with AVAssetReader, AVAssetTrack, and AVAssetReaderTrackOutput.
// Convert a video into UIImages
- (void)convertVideoUIImagesWithURL:(NSURL *)url finishBlock:(void (^)(id images, NSTimeInterval duration))finishBlock
{
    AVAsset *asset = [AVAsset assetWithURL:url];
    NSError *error = nil;
    self.reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    NSTimeInterval duration = CMTimeGetSeconds(asset.duration);

    __weak typeof(self) weakSelf = self;
    dispatch_queue_t backgroundQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_async(backgroundQueue, ^{
        __strong typeof(weakSelf) strongSelf = weakSelf;

        if (error) {
            NSLog(@"%@", [error localizedDescription]);
        }

        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *videoTrack = [videoTracks firstObject];
        if (!videoTrack) {
            return;
        }

        int m_pixelFormatType;
        // For playback
        m_pixelFormatType = kCVPixelFormatType_32BGRA;
        // For other uses, such as video compression:
        // m_pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;

        NSMutableDictionary *options = [NSMutableDictionary dictionary];
        [options setObject:@(m_pixelFormatType) forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:options];
        if ([strongSelf.reader canAddOutput:videoReaderOutput]) {
            [strongSelf.reader addOutput:videoReaderOutput];
        }
        [strongSelf.reader startReading];

        NSMutableArray *images = [NSMutableArray array];
        // Guard on nominalFrameRate > 0; videos from Android have shown up with a frame rate of 0
        while ([strongSelf.reader status] == AVAssetReaderStatusReading && videoTrack.nominalFrameRate > 0) {
            @autoreleasepool {
                // Read a video sample
                CMSampleBufferRef videoBuffer = [videoReaderOutput copyNextSampleBuffer];
                if (!videoBuffer) {
                    break;
                }
                [images addObject:[WKVideoConverter convertSampleBufferRefToUIImage:videoBuffer]];
                CFRelease(videoBuffer);
            }
        }
        if (finishBlock) {
            dispatch_async(dispatch_get_main_queue(), ^{
                finishBlock(images, duration);
            });
        }
    });
}
One problem worth noting: because each frame conversion is so quick, the videoBuffer objects are not released in time, and converting several videos in a row causes memory problems. Wrap each iteration in an @autoreleasepool so they are released promptly:
@autoreleasepool {
    // Read a video sample
    CMSampleBufferRef videoBuffer = [videoReaderOutput copyNextSampleBuffer];
    if (!videoBuffer) {
        break;
    }
    [images addObject:[WKVideoConverter convertSampleBufferRefToUIImage:videoBuffer]];
    CFRelease(videoBuffer);
}
That covers the parts of WeChat-style short video that I found difficult. For the rest of the implementation, please see the demo, which can be downloaded here.
Pausing video recording: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html
Fixing the green edge when cropping video: http://stackoverflow.com/questions/22883525/avassetexportsession-giving-me-a-green-border-on-right-and-bottom-of-output-vide
Video cropping: http://stackoverflow.com/questions/15737781/video-capture-with-11-aspect-ratio-in-ios/16910263#16910263
Converting a CMSampleBufferRef to an image: https://developer.apple.com/library/ios/qa/qa1702/_index.html
WeChat short-video analysis: http://www.jianshu.com/p/3d5ccbde0de1
Thanks to the authors of the articles above.
That is the entire content of this article. I hope it helps with your own projects.