OpenCV for iOS Study Notes (3) - Camera



Video capture and AR visualization are indispensable parts of any augmented reality application.


In the video capture phase, frames are received from the camera and then passed to the processing pipeline for simple operations (such as color conversion). Since the processing of each frame is critical to an AR application, the efficiency of this stage is crucial. The best way to achieve maximum performance is to access the frames obtained from the camera directly.
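The "simple operations such as color conversion" mentioned above can be performed directly on the raw BGRA bytes without any intermediate image objects. A minimal sketch in plain C (the function name bgraToGray and the integer Rec.601 approximation are illustrative, not from the original article):

```c
#include <stddef.h>

/* Hypothetical helper: converts `count` BGRA pixels to 8-bit grayscale.
 * Uses an integer approximation of the Rec.601 luma formula
 * Y = 0.299 R + 0.587 G + 0.114 B, with coefficients scaled by 256
 * to avoid floating-point math in the per-frame hot path. */
static void bgraToGray(const unsigned char *bgra, unsigned char *gray, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        const unsigned char *p = bgra + i * 4;  /* p[0]=B, p[1]=G, p[2]=R, p[3]=A */
        gray[i] = (unsigned char)((29 * p[0] + 150 * p[1] + 77 * p[2]) >> 8);
    }
}
```

Because the scaled coefficients sum to exactly 256, pure white maps to 255 and pure black to 0.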


For example, AVCaptureVideoPreviewLayer and UIGetScreenImage can only be used on iOS 3 and earlier versions.

Apple deprecated these approaches for two main reasons:


1. Frames are not retrieved directly from the camera. To obtain a bitmap, the program has to create an intermediate UIImage, assign the captured image to it, and return it as the result. This is quite wasteful for applications such as AR that process frames frequently!

2. To draw AR content, we have to add a transparent overlay view to render it. However, according to Apple's guidelines, we should avoid non-opaque layers, because they are difficult for a mobile graphics processor to render.

Fortunately, we also have an efficient, high-performance video capture method: AVFoundation.

AVCaptureDevice. An abstraction of the hardware device.

AVCaptureInput. The input device (or one of its subclasses), used to configure the ports of the abstract hardware device.

AVCaptureOutput. Represents the output data and manages the output to a movie file or still image.

AVCaptureSession. Serves as a bridge between input and output, coordinating the data transfer from input to output.

Key code for starting a camera:

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#include "BGRAVideoFrame.h"

@protocol VideoSourceDelegate <NSObject>
- (void)frameReady:(struct BGRAVideoFrame)frame;
@end

@interface VideoSource : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
{
}

@property (nonatomic, retain) AVCaptureSession *captureSession;
@property (nonatomic, retain) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, assign) id <VideoSourceDelegate> delegate;

- (bool)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition;
//- (CameraCalibration)getCalibration;
//- (CGSize)getFrameSize;

@end

#import "VideoSource.h"

@implementation VideoSource

@synthesize captureSession, deviceInput, delegate;

- (void)dealloc
{
    [captureSession release];
    [deviceInput release];
    self.delegate = nil;
    [super dealloc];
}

- (id)init
{
    if (self = [super init])
    {
        captureSession = [[AVCaptureSession alloc] init];
        if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])
        {
            [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
            NSLog(@"Set capture session preset AVCaptureSessionPreset640x480");
        }
        else if ([captureSession canSetSessionPreset:AVCaptureSessionPresetLow])
        {
            [captureSession setSessionPreset:AVCaptureSessionPresetLow];
            NSLog(@"Set capture session preset AVCaptureSessionPresetLow");
        }
    }
    return self;
}

// Called externally to start the camera
- (bool)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition
{
    AVCaptureDevice *device = [self cameraWithPosition:devicePosition];
    if (!device) return false;

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    self.deviceInput = input;

    if (!error) // No error occurred during initialization
    {
        if ([[self captureSession] canAddInput:self.deviceInput])
        {
            [[self captureSession] addInput:self.deviceInput];
        }
        else
        {
            NSLog(@"Couldn't add video input");
            return false;
        }
    }
    else
    {
        NSLog(@"Couldn't create video input");
        return false;
    }

    // Add the output
    [self addRawViewOutput];

    // Start video capture
    [captureSession startRunning];
    return true;
}

// Get the camera at the given position
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
    {
        if ([device position] == position)
        {
            return device;
        }
    }
    return nil;
}

// Add the output
- (void)addRawViewOutput
{
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];

    // Process only one frame at a time; otherwise frames queue up and stall the output
    output.alwaysDiscardsLateVideoFrames = YES;

    // Create the dispatch queue on which sample buffers are delivered
    dispatch_queue_t queue;
    queue = dispatch_queue_create("com.lanhaijiye", nil);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Request the BGRA pixel format
    NSString *keyString = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = [NSDictionary dictionaryWithObject:value forKey:keyString];
    [output setVideoSettings:settings];

    if ([self.captureSession canAddOutput:output])
    {
        [self.captureSession addOutput:output];
    }
    [output release];
}

//- (CameraCalibration)getCalibration
//{
//}
//- (CGSize)getFrameSize
//{
//}

#pragma mark - AVCaptureOutput delegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the pixel buffer before touching its base address
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(imageBuffer);

    struct BGRAVideoFrame frame = {width, height, stride, baseAddress};
    if (delegate && [delegate respondsToSelector:@selector(frameReady:)])
    {
        [delegate frameReady:frame];
    }

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}

@end

Start the camera:

VideoSource *source = [[VideoSource alloc] init];
if ([source startWithDevicePosition:AVCaptureDevicePositionFront])
{
    NSLog(@"Camera started successfully");
    [source setDelegate:self];
}

Camera output callback:

- (void)frameReady:(struct BGRAVideoFrame)frame
{
    NSLog(@"file:%s method:%@", __FILE__, NSStringFromSelector(_cmd));
}

Struct of each frame:

struct BGRAVideoFrame
{
    size_t width;
    size_t height;
    size_t stride;

    unsigned char *data;
};
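Note that stride is the number of bytes per row, which may exceed width * 4 because Core Video can pad rows; pixel (x, y) must therefore be addressed via the stride, not the width. A minimal sketch in plain C under that assumption (the helper name pixelAt is illustrative, not from the original):

```c
#include <stddef.h>

/* Mirrors the BGRAVideoFrame struct above. */
struct BGRAVideoFrame
{
    size_t width;
    size_t height;
    size_t stride;        /* bytes per row; may exceed width * 4 due to row padding */
    unsigned char *data;  /* BGRA bytes owned by the locked pixel buffer */
};

/* Hypothetical helper: pointer to the 4-byte BGRA pixel at column x, row y. */
static unsigned char *pixelAt(const struct BGRAVideoFrame *frame, size_t x, size_t y)
{
    return frame->data + y * frame->stride + x * 4;
}
```

Any per-pixel processing in frameReady: should iterate row by row with this addressing so padded frames are handled correctly.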

Note: the classes above require #import <AVFoundation/AVFoundation.h>


Link:

http://blog.163.com/chester_lp/blog/static/139794082012119112834437/
