Scanning and Generating QR Codes in iOS



Lately people have kept asking me for a QR code demo, so I looked into it; I hope this helps!

Set the root view controller:

 self.window.rootViewController = [[UINavigationController alloc] initWithRootViewController:[SecondViewController new]];

 


Generating a QR code:

 

//  Created by 韓俊強 on 15/11/27.
//  Copyright (c) 2015 韓俊強. All rights reserved.
//

#import "SecondViewController.h"
#import <CoreImage/CoreImage.h>

@interface SecondViewController ()

@property (nonatomic, strong) UITextField *tfCode;
@property (nonatomic, strong) UIButton *btnGenerate;
@property (nonatomic, strong) UIImageView *imageView;

@end

@implementation SecondViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    CGSize windowSize = [UIScreen mainScreen].bounds.size;

    self.tfCode = [[UITextField alloc] initWithFrame:CGRectMake(10, 64, windowSize.width-100, 40)];
    [self.view addSubview:self.tfCode];
    self.tfCode.borderStyle = UITextBorderStyleRoundedRect;

    self.btnGenerate = [[UIButton alloc] initWithFrame:CGRectMake(windowSize.width-100, 64, 90, 40)];
    [self.view addSubview:self.btnGenerate];
    [self.btnGenerate addTarget:self action:@selector(actionGenerate) forControlEvents:UIControlEventTouchUpInside];
    self.btnGenerate.backgroundColor = [UIColor lightGrayColor];
    [self.btnGenerate setTitle:@"Generate" forState:UIControlStateNormal];
    [self.btnGenerate setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];

    self.imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
    [self.view addSubview:self.imageView];
    self.imageView.center = CGPointMake(windowSize.width/2, windowSize.height/2);

    self.tfCode.text = @"http://www.baidu.com";
}

- (void)actionGenerate
{
    NSString *text = self.tfCode.text;
    NSData *stringData = [text dataUsingEncoding:NSUTF8StringEncoding];

    // Generate the code
    CIFilter *qrFilter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
    [qrFilter setValue:stringData forKey:@"inputMessage"];
    [qrFilter setValue:@"M" forKey:@"inputCorrectionLevel"];

    UIColor *onColor = [UIColor blackColor];
    UIColor *offColor = [UIColor whiteColor];

    // Colorize
    CIFilter *colorFilter = [CIFilter filterWithName:@"CIFalseColor" keysAndValues:
                             @"inputImage", qrFilter.outputImage,
                             @"inputColor0", [CIColor colorWithCGColor:onColor.CGColor],
                             @"inputColor1", [CIColor colorWithCGColor:offColor.CGColor],
                             nil];
    CIImage *qrImage = colorFilter.outputImage;

    // Draw
    CGSize size = CGSizeMake(300, 300);
    CGImageRef cgImage = [[CIContext contextWithOptions:nil] createCGImage:qrImage fromRect:qrImage.extent];
    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetInterpolationQuality(context, kCGInterpolationNone);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextDrawImage(context, CGContextGetClipBoundingBox(context), cgImage);
    UIImage *codeImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRelease(cgImage);

    self.imageView.image = codeImage;
}

@end
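One detail worth calling out in the drawing step: interpolation is disabled (kCGInterpolationNone), so the tiny CIImage produced by the filter is enlarged by plain pixel repetition and the QR modules stay crisp instead of being blurred. A rough sketch of that nearest-neighbor idea (in Python for brevity; the helper name and the 2x2 bitmap are illustrative, not part of any iOS API):

```python
def upscale_nearest(bitmap, factor):
    """Scale a binary bitmap (list of rows) by integer pixel repetition.
    Every source pixel becomes a factor-x-factor block, so edges stay
    sharp and no intermediate gray values are introduced."""
    out = []
    for row in bitmap:
        stretched = [px for px in row for _ in range(factor)]
        out.extend([stretched] * factor)
    return out

module = [[1, 0],
          [0, 1]]
print(upscale_nearest(module, 2))
# -> [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```

Smoothing interpolation would instead blend neighboring modules at the boundaries, which can make the code harder for scanners to read.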

Scanning a QR code:

 

 

//  Created by 韓俊強 on 15/11/27.
//  Copyright (c) 2015 韓俊強. All rights reserved.
//

#import "RootViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface RootViewController () <AVCaptureMetadataOutputObjectsDelegate, UIAlertViewDelegate>

@property (nonatomic, strong) UIView *scanRectView;
// Capture device (camera)
@property (strong, nonatomic) AVCaptureDevice *device;
// Input
@property (strong, nonatomic) AVCaptureDeviceInput *input;
// Output
@property (strong, nonatomic) AVCaptureMetadataOutput *output;
// Session: the bridge connecting input and output
@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *preview;

@end

@implementation RootViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    CGSize windowSize = [UIScreen mainScreen].bounds.size;

    CGSize scanSize = CGSizeMake(windowSize.width*3/4, windowSize.width*3/4);
    CGRect scanRect = CGRectMake((windowSize.width-scanSize.width)/2, (windowSize.height-scanSize.height)/2, scanSize.width, scanSize.height);

    // rectOfInterest is normalized and axis-swapped: (y/H, x/W, h/H, w/W)
    scanRect = CGRectMake(scanRect.origin.y/windowSize.height, scanRect.origin.x/windowSize.width,
                          scanRect.size.height/windowSize.height, scanRect.size.width/windowSize.width);

    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    self.output = [[AVCaptureMetadataOutput alloc] init];
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:([UIScreen mainScreen].bounds.size.height < 500) ? AVCaptureSessionPreset640x480 : AVCaptureSessionPresetHigh];
    [self.session addInput:self.input];
    [self.session addOutput:self.output];
    self.output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
    self.output.rectOfInterest = scanRect;

    self.preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.preview.frame = [UIScreen mainScreen].bounds;
    [self.view.layer insertSublayer:self.preview atIndex:0];

    self.scanRectView = [UIView new];
    [self.view addSubview:self.scanRectView];
    self.scanRectView.frame = CGRectMake(0, 0, scanSize.width, scanSize.height);
    self.scanRectView.center = CGPointMake(CGRectGetMidX([UIScreen mainScreen].bounds), CGRectGetMidY([UIScreen mainScreen].bounds));
    self.scanRectView.layer.borderColor = [UIColor redColor].CGColor;
    self.scanRectView.layer.borderWidth = 1;

    // Start capturing
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    if (metadataObjects.count == 0) {
        return;
    }

    [self.session stopRunning];

    AVMetadataMachineReadableCodeObject *metadataObject = metadataObjects.firstObject;
    // Show the scanned string
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:metadataObject.stringValue message:@"" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [alert show];
}

- (void)alertView:(UIAlertView *)alertView willDismissWithButtonIndex:(NSInteger)buttonIndex
{
    [self.session startRunning];
}

@end

 

Final result: (a live scan can't be demonstrated in screenshots, so test it yourself on a real device!)


 

Before iOS 7, developers generally relied on third-party libraries for scanning, most commonly ZBarSDK. Since iOS 7, the system's AVMetadataObject family provides an interface for decoding QR codes. In my tests, the native API scans and decodes very efficiently, far faster than the third-party libraries.

 

1. Example usage

 

The official API is very simple to use; the code is as follows:

@interface ViewController () <AVCaptureMetadataOutputObjectsDelegate> // delegate that handles the captured data
{
    AVCaptureSession *session; // the bridge between input and output
}
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    // Get the camera device
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Create the input stream
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    // Create the output stream
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    // Set the delegate; deliver callbacks on the main queue
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    // Initialize the session
    session = [[AVCaptureSession alloc] init];
    // High-quality capture preset
    [session setSessionPreset:AVCaptureSessionPresetHigh];
    [session addInput:input];
    [session addOutput:output];
    // Supported symbologies (QR code plus common barcode formats)
    output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode128Code];

    AVCaptureVideoPreviewLayer *layer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    layer.frame = self.view.layer.bounds;
    [self.view.layer insertSublayer:layer atIndex:0];
    // Start capturing
    [session startRunning];
}

At this point the camera feed is already visible in the UI; implementing the delegate method is all that's left to complete QR code and barcode scanning:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    if (metadataObjects.count > 0) {
        //[session stopRunning];
        AVMetadataMachineReadableCodeObject *metadataObject = [metadataObjects objectAtIndex:0];
        // Log the scanned string
        NSLog(@"%@", metadataObject.stringValue);
    }
}
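A scanned stringValue is untrusted input, so in a real app you would typically validate it before acting on it, for example before opening it as a URL. A minimal sketch of such a check (in Python for brevity; the function name and the scheme whitelist are my assumptions, not part of the scanning API):

```python
from urllib.parse import urlparse

def is_safe_scan_url(value):
    """Accept only http(s) URLs that actually have a host;
    reject javascript:, file:, bare text, and the like."""
    parsed = urlparse(value)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

print(is_safe_scan_url("http://www.baidu.com"))  # the demo's sample text
# -> True
print(is_safe_scan_url("javascript:alert(1)"))
# -> False
```

The same scheme-and-host check is straightforward to express with NSURLComponents on iOS.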
2. Some optimizations

Testing the code above shows that the system's decoding is remarkably fast; the official iOS API really is powerful. Still, we can optimize further and raise the efficiency even more:

First, the AVCaptureMetadataOutput class has the following property (available since iOS 7.0):

@property(nonatomic) CGRect rectOfInterest;

This property tells the system which region of the frame to pay attention to. Most scanning apps draw a frame in the UI prompting you to place the code inside it; that is exactly what this property supports: it restricts processing to image data captured within the given region, which improves efficiency considerably. A few things to note when using it:

1. This CGRect is not an ordinary rect: each of its four values lies in the range 0-1 and expresses a ratio.

2. Testing shows that x actually corresponds to the vertical distance from the top-left corner, and y to the horizontal distance from the top-left corner.

3. Width and height are swapped in the same way.

4. For example, to limit processing to the bottom half of the screen, we set:

output.rectOfInterest = CGRectMake(0.5, 0, 0.5, 1);

Why Apple designed it this way, or whether I am simply using this parameter incorrectly, I would welcome a pointer from anyone who knows.
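The coordinate mapping described above can be sketched as a small helper. Python is used here only because the arithmetic is language-agnostic; the function name and the 375x667-point screen size are illustrative assumptions:

```python
def rect_of_interest(scan_rect, screen_size):
    """Map a UIKit rect (x, y, w, h in points, origin at top-left)
    to the normalized, axis-swapped tuple rectOfInterest expects."""
    x, y, w, h = scan_rect
    screen_w, screen_h = screen_size
    # First component is the vertical offset ratio, second the horizontal
    # offset ratio; the height and width ratios are swapped the same way.
    return (y / screen_h, x / screen_w, h / screen_h, w / screen_w)

# Bottom half of a 375x667-point screen:
print(rect_of_interest((0, 667 / 2, 375, 667 / 2), (375, 667)))
# -> (0.5, 0.0, 0.5, 1.0), matching CGRectMake(0.5, 0, 0.5, 1)
```

This is the same conversion the scanner example performs on scanRect in viewDidLoad before assigning it to output.rectOfInterest.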

