Original source: OpenCV for iOS Study Notes (3) - Camera
Video capture and AR visualization are indispensable for an augmented reality application.
The video capture stage consists of receiving frames from the device camera, performing some simple processing on them (such as colour conversion), and passing them on to the processing pipeline. Because per-frame processing is critical in an AR application, the efficiency of this stage matters a great deal, and the best way to get maximum performance is to access the captured frames directly.
Older approaches, for example AVCaptureVideoPreviewLayer and UIGetScreenImage, could only be used on iOS 3 and earlier.
Apple moved away from them for two main reasons:
1. There is no direct access to the camera frames: to obtain a bitmap, the program has to create an intermediate UIImage, copy the image into it, and return that as the result. Paying this cost on every frame is unreasonable for an AR application that processes frames continuously.
2. To draw the AR scene we would have to add a transparent overlay view on top of the preview, but according to Apple's guidelines non-opaque layers should be avoided, because they are hard for mobile processors to render.
Of course, there is also an effective, high-performance way to capture video: AVFoundation. Its key classes are listed below; a minimal wiring sketch follows the list.
AVCaptureDevice: represents the abstract hardware device (the physical camera).
AVCaptureInput: represents the input device (via one of its concrete subclasses); it configures the ports of the abstract hardware device.
AVCaptureOutput: represents the output data and manages delivery to a movie file or to images.
AVCaptureSession: the bridge between inputs and outputs; it coordinates the transfer of data from input to output.
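To make the data path concrete before the full class, here is a minimal orientation sketch of how the four classes fit together. It is only a sketch under simplified assumptions (default back camera, no error handling, no delegate); the VideoSource class below is the complete version.

// Minimal sketch: device -> input -> session -> output
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// AVCaptureDevice: the physical camera
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// AVCaptureDeviceInput (an AVCaptureInput subclass): exposes the device to the session
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
[session addInput:input];

// AVCaptureVideoDataOutput (an AVCaptureOutput subclass): delivers raw frames
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

// AVCaptureSession coordinates moving data from the input to the output
[session startRunning];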
Key code for starting the camera:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#include "BGRAVideoFrame.h"

@protocol VideoSourceDelegate <NSObject>

- (void)frameReady:(struct BGRAVideoFrame)frame;

@end

@interface VideoSource : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
{
}

@property (nonatomic, retain) AVCaptureSession *captureSession;
@property (nonatomic, retain) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, assign) id<VideoSourceDelegate> delegate;

- (bool)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition;
//- (CameraCalibration) getCalibration;
//- (CGSize) getFrameSize;

@end
#import "VideoSource.h"

@implementation VideoSource

@synthesize captureSession, deviceInput, delegate;

- (void)dealloc
{
    [captureSession release];
    [deviceInput release];
    self.delegate = nil;
    [super dealloc];
}

- (id)init
{
    if (self = [super init])
    {
        captureSession = [[AVCaptureSession alloc] init];
        if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])
        {
            [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
            NSLog(@"Set capture session preset AVCaptureSessionPreset640x480");
        }
        else if ([captureSession canSetSessionPreset:AVCaptureSessionPresetLow])
        {
            [captureSession setSessionPreset:AVCaptureSessionPresetLow];
            NSLog(@"Set capture session preset AVCaptureSessionPresetLow");
        }
    }
    return self;
}

// Called from outside to start the camera
- (bool)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition
{
    AVCaptureDevice *device = [self cameraWithPosition:devicePosition];
    if (!device) return FALSE;

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    self.deviceInput = input;

    if (!error) // the input was created without errors
    {
        if ([[self captureSession] canAddInput:self.deviceInput])
        {
            [[self captureSession] addInput:self.deviceInput];
        }
        else
        {
            NSLog(@"Couldn't add video input");
            return FALSE;
        }
    }
    else
    {
        NSLog(@"Couldn't create video input");
        return FALSE;
    }

    // Add the output
    [self addRawViewOutput];

    // Start capturing video
    [captureSession startRunning];
    return TRUE;
}

// Find the camera at the requested position (front or back)
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
    {
        if ([device position] == position)
        {
            return device;
        }
    }
    return nil;
}

// Add the raw video data output
- (void)addRawViewOutput
{
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];

    // Process only one frame at a time, otherwise late frames pile up
    output.alwaysDiscardsLateVideoFrames = YES;

    // Create the dispatch queue on which sample buffers are delivered
    dispatch_queue_t queue;
    queue = dispatch_queue_create("com.lanhaijiye", nil);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Ask for BGRA pixel buffers
    NSString *keyString = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *setting = [NSDictionary dictionaryWithObject:value forKey:keyString];
    [output setVideoSettings:setting];

    if ([self.captureSession canAddOutput:output])
    {
        [self.captureSession addOutput:output];
    }
    [output release]; // the session retains the output (manual reference counting)
}

//- (CameraCalibration) getCalibration
//{
//
//}
//
//- (CGSize) getFrameSize
//{
//
//}

#pragma mark - AVCaptureOutput delegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the pixel buffer before touching its memory
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(imageBuffer);

    BGRAVideoFrame frame = {width, height, stride, baseAddress};

    if (delegate && [delegate respondsToSelector:@selector(frameReady:)])
    {
        [delegate frameReady:frame];
    }

    // Unlock the buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}

@end
Starting the camera:
VideoSource *source = [[VideoSource alloc] init];
// Set the delegate before starting so the first frames are not missed
[source setDelegate:self];
if ([source startWithDevicePosition:AVCaptureDevicePositionFront])
{
    NSLog(@"Camera started successfully");
}
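The object set as the delegate must adopt the VideoSourceDelegate protocol. A minimal sketch (the ViewController name here is just an example, not part of the original code):

#import <UIKit/UIKit.h>
#import "VideoSource.h"

// The class that receives frames declares conformance to VideoSourceDelegate
@interface ViewController : UIViewController <VideoSourceDelegate>
@end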
The camera output callback:
- (void)frameReady:(struct BGRAVideoFrame)frame
{
    NSLog(@"file:%s method:%@", __FILE__, NSStringFromSelector(_cmd));
}
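Note that captureOutput:didOutputSampleBuffer:fromConnection: is invoked on the serial dispatch queue created in addRawViewOutput, so frameReady: also runs off the main thread. A minimal sketch (an assumption about how the callback might be structured once real processing is added), keeping heavy work on the capture queue and dispatching only UI updates back to the main queue:

- (void)frameReady:(struct BGRAVideoFrame)frame
{
    // Heavy per-frame work (colour conversion, marker detection, ...) stays
    // here, on the background capture queue.

    dispatch_async(dispatch_get_main_queue(), ^{
        // Only UI updates (e.g. refreshing the AR overlay) belong on the main thread.
    });
}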
The structure describing each frame:
struct BGRAVideoFrame
{
    size_t width;
    size_t height;
    size_t stride;
    unsigned char *data;
};
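Since these are OpenCV notes, the natural next step is to hand the frame to OpenCV. Below is a minimal sketch; it assumes the file is compiled as Objective-C++ (.mm) and linked against opencv2, and the processFrame helper is hypothetical, not part of the original code:

#import <opencv2/opencv.hpp>
#include "BGRAVideoFrame.h"

static void processFrame(struct BGRAVideoFrame frame)
{
    // Wrap the BGRA pixel data without copying; 'stride' is the number of bytes per row
    cv::Mat bgra((int)frame.height, (int)frame.width, CV_8UC4, frame.data, frame.stride);

    // The "simple operation" mentioned earlier, e.g. colour conversion
    cv::Mat gray;
    cv::cvtColor(bgra, gray, cv::COLOR_BGRA2GRAY);

    // Further AR processing (marker detection, pose estimation, ...) would go here
}

Because captureOutput: unlocks the pixel buffer as soon as frameReady: returns, the data pointer is only valid inside the callback; clone the cv::Mat (for example gray.clone()) if the frame has to outlive it.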
Note: the code above requires #import <AVFoundation/AVFoundation.h>.
Link:
http://blog.163.com/chester_lp/blog/static/139794082012119112834437/