iOS Study Notes 27 - Camera


I. Cameras

In iOS, the phone camera can be used in the following two ways:
1. UIImagePickerController for taking photos and recording videos
* Advantages: easy to use and quite powerful
* Disadvantages: highly encapsulated, so some custom work cannot be implemented
2. The AVFoundation framework
* Advantages: very flexible, many ready-made input and output device classes, and a lot of low-level functionality is exposed to developers
* Disadvantages: you have to deal with the low-level layer yourself, so it is harder to learn and more complex to use

UIImagePickerController is usually enough and its features are indeed powerful, but it has one drawback: because it is so highly encapsulated, custom work becomes complicated. For example, if you want to build a camera interface similar to a face-filter app, it is difficult to do with UIImagePickerController, and you have to use the AVFoundation framework instead.

II. UIImagePickerController

UIImagePickerController inherits from UINavigationController and belongs to the UIKit framework. It can be used to select images from the photo library, take photos, and record videos.
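Besides driving the camera, the same controller can also pick an existing image from the photo library. Below is a minimal sketch of that case (not part of the original notes; self is assumed to be a view controller adopting UINavigationControllerDelegate and UIImagePickerControllerDelegate):

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary; // pick from the photo library instead of the camera
picker.delegate = self; // delivers the result through the same delegate methods listed below
[self presentViewController:picker animated:YES completion:nil];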

1. Common attributes:
@property (nonatomic) UIImagePickerControllerSourceType sourceType; /* source type */
typedef NS_ENUM(NSInteger, UIImagePickerControllerSourceType) {
    UIImagePickerControllerSourceTypePhotoLibrary,    // photo library
    UIImagePickerControllerSourceTypeCamera,          // camera
    UIImagePickerControllerSourceTypeSavedPhotosAlbum // saved photos album
};

/* Media types. By default this array contains only kUTTypeImage (photos). To record video you must set kUTTypeVideo (video without sound) or kUTTypeMovie (video with sound). */
@property (nonatomic, copy) NSArray<NSString *> *mediaTypes;

@property (nonatomic) NSTimeInterval videoMaximumDuration; // maximum recording duration, default 10 minutes

@property (nonatomic) UIImagePickerControllerQualityType videoQuality; // video quality
typedef NS_ENUM(NSInteger, UIImagePickerControllerQualityType) {
    UIImagePickerControllerQualityTypeHigh = 0,       // high quality
    UIImagePickerControllerQualityTypeMedium,         // medium quality, suitable for WiFi transfer
    UIImagePickerControllerQualityTypeLow,            // low quality, suitable for cellular transfer
    UIImagePickerControllerQualityType640x480,        // 640*480
    UIImagePickerControllerQualityTypeIFrame1280x720, // 1280*720
    UIImagePickerControllerQualityTypeIFrame960x540   // 960*540
};

@property (nonatomic) BOOL showsCameraControls;              /* whether to show the built-in camera controls, default YES */
@property (nonatomic, strong) UIView *cameraOverlayView;     /* view overlaid on the camera preview */
@property (nonatomic) CGAffineTransform cameraViewTransform; /* transform applied to the camera preview */

@property (nonatomic) UIImagePickerControllerCameraCaptureMode cameraCaptureMode; /* camera capture mode */
typedef NS_ENUM(NSInteger, UIImagePickerControllerCameraCaptureMode) {
    UIImagePickerControllerCameraCaptureModePhoto, // photo mode
    UIImagePickerControllerCameraCaptureModeVideo  // video recording mode
};

@property (nonatomic) UIImagePickerControllerCameraDevice cameraDevice; /* camera device */
typedef NS_ENUM(NSInteger, UIImagePickerControllerCameraDevice) {
    UIImagePickerControllerCameraDeviceRear, // rear camera
    UIImagePickerControllerCameraDeviceFront // front camera
};

@property (nonatomic) UIImagePickerControllerCameraFlashMode cameraFlashMode; /* flash mode */
typedef NS_ENUM(NSInteger, UIImagePickerControllerCameraFlashMode) {
    UIImagePickerControllerCameraFlashModeOff  = -1, // flash off
    UIImagePickerControllerCameraFlashModeAuto = 0,  // automatic flash, default
    UIImagePickerControllerCameraFlashModeOn   = 1   // flash on
};
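For illustration only (this snippet is not in the original notes), assuming picker is a UIImagePickerController already set to the camera source, a few of these properties could be configured like this:

picker.cameraFlashMode = UIImagePickerControllerCameraFlashModeAuto; // let the system decide whether to fire the flash
picker.videoQuality = UIImagePickerControllerQualityTypeMedium;      // medium quality, reasonable for WiFi upload
picker.videoMaximumDuration = 30.0;                                  // cap recordings at 30 seconds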
  
2. Common object methods:
- (void)takePicture;       // take a photo
- (BOOL)startVideoCapture; // start recording a video
- (void)stopVideoCapture;  // stop recording a video
3. Delegate methods:
/* Called after the photo, video, or picked media has been obtained */
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *, id> *)info;
/* Called when the user cancels */
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker;
4. Global functions for saving to the photo album:
/* Save an image to the photo album */
void UIImageWriteToSavedPhotosAlbum(UIImage *image,         // the image to save
                                    id completionTarget,    // callback target
                                    SEL completionSelector, // callback selector
                                    void *contextInfo);     // callback context info
// The callback selector for saving an image generally looks like:
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo;

/* Check whether a video can be saved to the photo album */
BOOL UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(NSString *videoPath);

/* Save a video to the photo album */
void UISaveVideoAtPathToSavedPhotosAlbum(NSString *videoPath,    // path of the video file to save
                                         id completionTarget,    // callback target
                                         SEL completionSelector, // callback selector
                                         void *contextInfo);     // callback context info
// The callback selector for saving a video generally looks like:
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo;
5. Steps for using the camera:
* Create a UIImagePickerController object and set its source type to the camera.
* Specify the camera device, front or rear.
* Set the media types; the media type constants are defined in MobileCoreServices.framework.
* Set the camera capture mode (photo or video); the media types must be set before the capture mode.
* Present the UIImagePickerController, usually modally.
* Display or save the photo or video in the delegate methods.
6. The following is a complete example:
#import "ViewController.h"
#import <MobileCoreServices/MobileCoreServices.h>

@interface ViewController () <UINavigationControllerDelegate, UIImagePickerControllerDelegate>

@property (strong, nonatomic) UIImagePickerController *pickerController; // picker controller
@property (strong, nonatomic) IBOutlet UIImageView *showImageView;       // image view that displays the result

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // initialize the picker controller
    [self initPickerController];
}

/* Initialize the picker controller */
- (void)initPickerController {
    // create the picker controller
    UIImagePickerController *pickerController = [[UIImagePickerController alloc] init];
    // use the camera as the source
    pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
    // use the rear camera
    pickerController.cameraDevice = UIImagePickerControllerCameraDeviceRear;
    // allow editing after capture
    pickerController.allowsEditing = YES;
    // set the delegate
    pickerController.delegate = self;
    self.pickerController = pickerController;
}

#pragma mark - UI actions

/* Take a photo */
- (IBAction)imagePicker:(id)sender {
    // set the media type to image
    self.pickerController.mediaTypes = @[(NSString *)kUTTypeImage];
    // set the capture mode to photo
    self.pickerController.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
    // present the picker controller modally
    [self presentViewController:self.pickerController animated:YES completion:nil];
}

/* Record a video */
- (IBAction)videoPicker:(id)sender {
    // set the media type to movie
    self.pickerController.mediaTypes = @[(NSString *)kUTTypeMovie];
    // set the capture mode to video
    self.pickerController.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
    // set the video quality to high
    self.pickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
    // present the picker controller modally
    [self presentViewController:self.pickerController animated:YES completion:nil];
}

#pragma mark - delegate methods

/* Called after the photo or video has been obtained */
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *, id> *)info {
    // get the media type from the info dictionary
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        // a photo was taken: get the image
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        // save the image to the photo album
        UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    } else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        // a video was recorded: get its file URL
        NSURL *url = [info objectForKey:UIImagePickerControllerMediaURL];
        NSString *path = url.path;
        // check whether the video can be saved to the photo album
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
            // save the video to the photo album
            UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
    // dismiss the picker controller
    [self dismissViewControllerAnimated:YES completion:nil];
}

/* Called when the user cancels */
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    NSLog(@"cancel");
    // dismiss the picker controller
    [self dismissViewControllerAnimated:YES completion:nil];
}

#pragma mark - callbacks after the image or video has been saved

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Save image completed");
    self.showImageView.image = image;
    self.showImageView.contentMode = UIViewContentModeScaleAspectFit;
}

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Save video completed");
}

@end
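If you want to replace the system camera controls (the kind of custom work mentioned at the start), a rough sketch on top of the example above might look like the following; configureCustomCameraUI, shutterTapped: and the button layout are illustrative names and values, not part of the original demo:

/* Illustrative only: hide the built-in controls and supply a custom overlay */
- (void)configureCustomCameraUI {
    self.pickerController.showsCameraControls = NO; // hide the default shutter/flash UI
    UIButton *shutter = [UIButton buttonWithType:UIButtonTypeSystem];
    shutter.frame = CGRectMake(20, 20, 80, 44);
    [shutter setTitle:@"Shoot" forState:UIControlStateNormal];
    [shutter addTarget:self action:@selector(shutterTapped:) forControlEvents:UIControlEventTouchUpInside];
    UIView *customOverlay = [[UIView alloc] initWithFrame:self.view.bounds];
    [customOverlay addSubview:shutter];
    self.pickerController.cameraOverlayView = customOverlay; // shown on top of the camera preview
}

/* Action of the custom shutter button */
- (void)shutterTapped:(id)sender {
    [self.pickerController takePicture]; // delivers the photo through the same didFinishPickingMediaWithInfo: delegate method
}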
    
   
  



UIImagePickerController is very powerful, meets most common needs, and is easy to use.

III. AVFoundation Photos and Videos

First, get to know the classes related to taking photos and recording videos with AVFoundation:

AVCaptureSession: a media capture session that sends captured audio and video data from the input devices to the outputs. A session can have multiple inputs and outputs.

AVCaptureVideoPreviewLayer: the camera preview layer, a subclass of CALayer, used to see the photo or video being captured in real time.

AVCaptureDevice: an input device such as the camera or microphone; some properties of the physical device can be configured through it.

AVCaptureDeviceInput: a device input object that manages the input data.

AVCaptureOutput: a device output object that manages the output data. Its subclasses are usually used instead:

AVCaptureAudioDataOutput  // outputs audio data as NSData
AVCaptureStillImageOutput // outputs still image data as NSData
AVCaptureVideoDataOutput  // outputs video frame data as NSData
/* file output objects: the output data is written to files */
AVCaptureFileOutput {
    AVCaptureAudioFileOutput // outputs an audio file
    AVCaptureMovieFileOutput // outputs a movie file
}
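As a rough sketch of the per-frame case (this snippet is not in the original notes), AVCaptureVideoDataOutput hands each captured frame to a delegate, which is the kind of hook a face-filter camera builds on; the method and queue names here are illustrative:

/* Assumes self adopts AVCaptureVideoDataOutputSampleBufferDelegate */
- (void)addVideoDataOutputToSession:(AVCaptureSession *)session {
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // deliver frames to self on a serial background queue
    [videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
    if ([session canAddOutput:videoOutput]) {
        [session addOutput:videoOutput];
    }
}

/* Called for every captured frame */
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); // raw frame data
    // run face detection or apply filters to pixelBuffer here
}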

The general steps for taking a photo or recording a video are:
* Create an AVCaptureSession object.
* Use AVCaptureDevice to obtain the device (camera, microphone) you want to use.
* Create and initialize an AVCaptureDeviceInput with that AVCaptureDevice.
* Create and initialize the output object, choosing the AVCaptureOutput subclass that matches the data you want.
* Add the AVCaptureDeviceInput and the AVCaptureOutput to the AVCaptureSession.
* Create an AVCaptureVideoPreviewLayer for the session and add the layer to the view that will display the preview.
* Call the session's startRunning method to start capturing and stopRunning to stop, writing the captured audio or video data to the chosen destination.

The following is an example:
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (strong, nonatomic) AVCaptureSession *session;                // media capture session
@property (strong, nonatomic) AVCaptureDeviceInput *captureInput;       // input data object
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutput;   // output data object
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *captureLayer; // video preview layer
@property (strong, nonatomic) IBOutlet UIButton *captureBtn;            // take-photo button
@property (strong, nonatomic) IBOutlet UIButton *openCaptureBtn;        // open-camera button

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self initCapture];
    self.openCaptureBtn.hidden = NO;
    self.captureBtn.hidden = YES;
}

/* Initialize the capture pipeline */
- (void)initCapture {
    // 1. create the media capture session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    self.session = session;
    // if the 1280*720 preset is supported, use it
    if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        session.sessionPreset = AVCaptureSessionPreset1280x720;
    }

    // 2. get the rear camera device object
    AVCaptureDevice *device = nil;
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if (camera.position == AVCaptureDevicePositionBack) { // rear camera
            device = camera;
        }
    }
    if (!device) {
        NSLog(@"get rear camera error");
        return;
    }

    // 3. create the input data object
    NSError *error = nil;
    AVCaptureDeviceInput *captureInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"error creating input data object");
        return;
    }
    self.captureInput = captureInput;

    // 4. create the output data object
    AVCaptureStillImageOutput *imageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *setting = @{AVVideoCodecKey: AVVideoCodecJPEG};
    [imageOutput setOutputSettings:setting];
    self.imageOutput = imageOutput;

    // 5. add the input and output objects to the session
    if ([session canAddInput:captureInput]) {
        [session addInput:captureInput];
    }
    if ([session canAddOutput:imageOutput]) {
        [session addOutput:imageOutput];
    }

    // 6. create the video preview layer
    AVCaptureVideoPreviewLayer *videoLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    self.view.layer.masksToBounds = YES;
    videoLayer.frame = self.view.bounds;
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    // insert the layer below the take-photo button
    [self.view.layer insertSublayer:videoLayer below:self.captureBtn.layer];
    self.captureLayer = videoLayer;
}

#pragma mark - UI actions

/* Take-photo button tapped */
- (IBAction)takeCapture:(id)sender {
    // get the connection of the still image output
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    // capture the output data of the device through the connection
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        // get the output JPEG image data
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:imageData];
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); // save to the photo album
        self.captureLayer.hidden = YES;
        self.captureBtn.hidden = YES;
        self.openCaptureBtn.hidden = NO;
        [self.session stopRunning]; // stop capturing
    }];
}

/* Open-camera button tapped */
- (IBAction)openCapture:(id)sender {
    self.captureLayer.hidden = NO;
    self.captureBtn.hidden = NO;
    self.openCaptureBtn.hidden = YES;
    [self.session startRunning]; // start capturing
}

@end

Recording a video works in a similar way. The code below builds on the code above; the differences are that an audio input is added, the class of the output data object changes, and the recording delegate methods of the output object have to be handled.
1. Obtain the audio input data object and the movie file output data object:
// get the microphone device object
AVCaptureDevice *audioDevice = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio].firstObject;
if (!audioDevice) {
    NSLog(@"get microphone error");
    return;
}
// create the audio input data object
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:&error];
if (error) {
    NSLog(@"error creating input data object");
    return;
}
// create the movie file output object
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
self.movieOutput = movieOutput;
2. Add them to the capture session:
if ([session canAddInput:captureInput]) {
    [session addInput:captureInput];
    [session addInput:audioInput];
    // enable video stabilization
    AVCaptureConnection *connection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
    }
}
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}
3. In the record button's action, start or stop recording:
if (!self.movieOutput.isRecording) {
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingString:@"myMovie.mov"];
    NSURL *url = [NSURL fileURLWithPath:outputPath]; // note: this must be a file URL, not an ordinary URL
    // start recording; the delegate monitors the recording process,
    // and the recording is written to the file at the given URL
    [self.movieOutput startRecordingToOutputFileURL:url recordingDelegate:self];
} else {
    [self.movieOutput stopRecording]; // stop recording
}
4. Implement the AVCaptureFileOutputRecordingDelegate methods:
/* Called when recording starts */
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    NSLog(@"Start recording");
}

/* Called when recording finishes */
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"Recording completed");
    NSString *path = outputFileURL.path;
    // save the recorded video to the photo album
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
        UISaveVideoAtPathToSavedPhotosAlbum(path, nil, nil, nil);
    }
}
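For these snippets to compile on top of the earlier example, the view controller is assumed to keep a reference to the movie output and to adopt the recording delegate protocol, roughly like this:

@interface ViewController () <AVCaptureFileOutputRecordingDelegate>
@property (strong, nonatomic) AVCaptureMovieFileOutput *movieOutput; // movie file output used by the snippets above
@end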
IV. iOS Audio and Video Usage Summary


The table above does not cover everything; there is much more to learn in the AVFoundation framework. It is very powerful and worth studying further if you have the time.
iOS provides flexible and complete multimedia support. The table above is for reference only.

Code demo: see CaptureDemo in LearnDemo.
