Using AVCaptureSession to build a custom camera on iOS

To take photos on iOS we usually present a UIImagePickerController, which wraps the system camera and is very convenient. Sometimes, however, UIImagePickerController does not meet our needs, for example when we want a more complex overlay view, and we have to build the camera control ourselves.

This requires the classes in AVFoundation.framework, so first import the <AVFoundation/AVFoundation.h> header. The official documentation lists the required components as follows (a condensed sketch of how they fit together appears right after the list):

An instance of AVCaptureDevice to represent the input device, such as a camera or microphone
An instance of a concrete subclass of AVCaptureInput to configure the ports from the input device
An instance of a concrete subclass of AVCaptureOutput to manage the output to a movie file or still image
An instance of AVCaptureSession to coordinate the data flow from the input to the output
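
In short, a device feeds an input, the input and an output are attached to a session, and the session moves data between them. The following is only a minimal sketch of that wiring, assuming the default video device and a still-image output; the full initialisation used in this article appears further down.

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// The input device (here: the default camera) and the input that wraps its ports.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];

// The output that produces still images.
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];

if (input && [session canAddInput:input]) {
    [session addInput:input];
}
if ([session canAddOutput:output]) {
    [session addOutput:output];
}
[session startRunning];   // the session now moves data from the input to the output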


Here I only build a camera that takes still photos; video and audio recording are not covered.

To summarize, we need the following objects:

@property (nonatomic, strong) AVCaptureSession            *session;           // AVCaptureSession object that moves data between the input device and the output
@property (nonatomic, strong) AVCaptureDeviceInput        *videoInput;        // the input stream
@property (nonatomic, strong) AVCaptureStillImageOutput   *stillImageOutput;  // the photo output stream; this camera only takes photos, so this output is enough
@property (nonatomic, strong) AVCaptureVideoPreviewLayer  *previewLayer;      // preview layer that shows what the camera sees
@property (nonatomic, strong) UIBarButtonItem             *toggleButton;      // button that switches between the front and back cameras
@property (nonatomic, strong) UIButton                    *shutterButton;     // photo (shutter) button
@property (nonatomic, strong) UIView                      *cameraShowView;    // view that hosts the preview layer (used in setupCameraLayer below)

My habit is to create these objects in the init method and then load the preview layer in viewWillAppear. Let's look at the code to make this clear.

- (void)initialSession {
    // I call this method from init.
    self.session = [[AVCaptureSession alloc] init];
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:nil];
    // [self frontCamera] returns an AVCaptureDevice; I start with the front camera here.
    // Its implementation is shown below.
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    // Output settings for the still-image output: AVVideoCodecJPEG means the photo is delivered as JPEG data.
    [self.stillImageOutput setOutputSettings:outputSettings];

    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    if ([self.session canAddOutput:self.stillImageOutput]) {
        [self.session addOutput:self.stillImageOutput];
    }
}
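
For completeness, here is one way the init method might look. The _cameraAvailable flag checked later in setupCameraLayer is never declared or set in the original code, so treat the line that sets it as an assumption; asking UIImagePickerController whether a camera source is available is just one simple way to fill it in.

- (instancetype)init {
    self = [super init];
    if (self) {
        // Assumed: setupCameraLayer checks _cameraAvailable before building the preview layer,
        // so set it here in some way, for example via UIImagePickerController.
        _cameraAvailable = [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera];
        [self initialSession];
    }
    return self;
}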
These are the methods that return the front and back camera devices:

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}

- (AVCaptureDevice *)frontCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}

- (AVCaptureDevice *)backCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

Next is the method that loads the preview layer; I call it from viewWillAppear:

- (void)setupCameraLayer {
    if (_cameraAvailable == NO) return;

    if (self.previewLayer == nil) {
        self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
        UIView *view = self.cameraShowView;
        CALayer *viewLayer = [view layer];
        [viewLayer setMasksToBounds:YES];

        CGRect bounds = [view bounds];
        [self.previewLayer setFrame:bounds];
        [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];

        [viewLayer insertSublayer:self.previewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
    }
}
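
As a small usage sketch (the lifecycle hookup is implied but not shown in the original), viewWillAppear can simply call setupCameraLayer:

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Build and attach the preview layer once the view exists.
    [self setupCameraLayer];
}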

Note that the session is started and stopped in viewDidAppear and viewDidDisappear:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (self.session) {
        [self.session startRunning];
    }
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    if (self.session) {
        [self.session stopRunning];
    }
}
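
One caveat worth adding: -startRunning is a blocking call, and Apple's sample code generally performs it off the main thread, usually on a dedicated serial queue. If you notice the UI stalling when the controller appears, a sketch like the following (using a global queue here only for brevity) keeps the main thread free:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (self.session) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self.session startRunning];   // blocking call, so keep it off the main thread
        });
    }
}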

Next we implement the action that switches between the front and back cameras; I won't say much about creating the button itself.

- (void)toggleCamera {
    NSUInteger cameraCount = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
    if (cameraCount > 1) {
        NSError *error;
        AVCaptureDeviceInput *newVideoInput;

        AVCaptureDevicePosition position = [[_videoInput device] position];
        if (position == AVCaptureDevicePositionBack) {
            newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
        } else if (position == AVCaptureDevicePositionFront) {
            newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
        } else {
            return;
        }

        if (newVideoInput != nil) {
            [self.session beginConfiguration];
            [self.session removeInput:self.videoInput];
            if ([self.session canAddInput:newVideoInput]) {
                [self.session addInput:newVideoInput];
                [self setVideoInput:newVideoInput];
            } else {
                [self.session addInput:self.videoInput];
            }
            [self.session commitConfiguration];
        } else if (error) {
            NSLog(@"Toggle camera failed, error = %@", error);
        }
    }
}

This is the action method for the camera-switch button.

- (void)shutterCamera {
    AVCaptureConnection *videoConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (!videoConnection) {
        NSLog(@"Take photo failed!");
        return;
    }

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer == NULL) {
            return;
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:imageData];
        NSLog(@"image size = %@", NSStringFromCGSize(image.size));
    }];
}
This is the action method for the shutter button.
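
The completion handler above only logs the image size. If you also want to put the photo into the user's photo library, one simple option (not part of the original article) is UIKit's UIImageWriteToSavedPhotosAlbum, called from inside that handler once the UIImage has been created:

// Inside the completion handler, after `image` has been created:
UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);   // writes the photo to the Camera Roll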

That completes this simple custom camera. If you want to add more complex features, you can refer to the following article. I hope this helps.

Http://course.gdou.com/blog/Blog.pzs/archive/2011/12/14/10882.html
