Scan QR code/generate QR code in iOS


Recently, several readers have asked whether I have any QR code demos. To meet that need, I put one together and hope it helps!

Specify the root view:

self.window.rootViewController = [[UINavigationController alloc] initWithRootViewController:[SecondViewController new]];
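For context, here is a minimal sketch of the app delegate this line lives in. It assumes the standard Xcode template's AppDelegate (with its window property); SecondViewController is the generator controller shown below.

#import "AppDelegate.h"
#import "SecondViewController.h"

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    self.window = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
    // Wrap the generator screen in a navigation controller and make it the root
    self.window.rootViewController = [[UINavigationController alloc]
        initWithRootViewController:[SecondViewController new]];
    [self.window makeKeyAndVisible];
    return YES;
}

@end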

 


Generate QR code:

 

//  Created by Han junqiang on 15/11/27.
//  Copyright (c) 2015 Han junqiang. All rights reserved.

#import "SecondViewController.h"
#import <CoreImage/CoreImage.h>

@interface SecondViewController ()

@property (nonatomic, strong) UITextField *tfCode;
@property (nonatomic, strong) UIButton *btnGenerate;
@property (nonatomic, strong) UIImageView *imageView;

@end

@implementation SecondViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    CGSize windowSize = [UIScreen mainScreen].bounds.size;

    // Text field holding the string to encode
    self.tfCode = [[UITextField alloc] initWithFrame:CGRectMake(10, 64, windowSize.width - 100, 40)];
    [self.view addSubview:self.tfCode];
    self.tfCode.borderStyle = UITextBorderStyleRoundedRect;

    // "Generate" button
    self.btnGenerate = [[UIButton alloc] initWithFrame:CGRectMake(windowSize.width - 100, 64, 90, 40)];
    [self.view addSubview:self.btnGenerate];
    [self.btnGenerate addTarget:self action:@selector(actionGenerate) forControlEvents:UIControlEventTouchUpInside];
    self.btnGenerate.backgroundColor = [UIColor lightGrayColor];
    [self.btnGenerate setTitle:@"Generate" forState:UIControlStateNormal];
    [self.btnGenerate setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];

    // Image view that displays the generated code
    self.imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
    [self.view addSubview:self.imageView];
    self.imageView.center = CGPointMake(windowSize.width / 2, windowSize.height / 2);

    self.tfCode.text = @"http://www.baidu.com";
}

- (void)actionGenerate {
    NSString *text = self.tfCode.text;
    NSData *stringData = [text dataUsingEncoding:NSUTF8StringEncoding];

    // Generate the QR code
    CIFilter *qrFilter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
    [qrFilter setValue:stringData forKey:@"inputMessage"];
    [qrFilter setValue:@"M" forKey:@"inputCorrectionLevel"];

    // Color it: black modules on a white background
    UIColor *onColor = [UIColor blackColor];
    UIColor *offColor = [UIColor whiteColor];
    CIFilter *colorFilter = [CIFilter filterWithName:@"CIFalseColor"
                                       keysAndValues:@"inputImage", qrFilter.outputImage,
                                                     @"inputColor0", [CIColor colorWithCGColor:onColor.CGColor],
                                                     @"inputColor1", [CIColor colorWithCGColor:offColor.CGColor],
                                                     nil];
    CIImage *qrImage = colorFilter.outputImage;

    // Draw it into a 300x300 bitmap with interpolation disabled so the modules stay sharp
    CGSize size = CGSizeMake(300, 300);
    CGImageRef cgImage = [[CIContext contextWithOptions:nil] createCGImage:qrImage fromRect:qrImage.extent];
    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetInterpolationQuality(context, kCGInterpolationNone);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextDrawImage(context, CGContextGetClipBoundingBox(context), cgImage);
    UIImage *codeImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGImageRelease(cgImage);

    self.imageView.image = codeImage;
}

@end
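The raw image produced by CIQRCodeGenerator is only a few dozen points on a side, which is why the code above redraws it into a 300x300 bitmap with interpolation turned off. A shorter alternative, sketched here assuming the same qrFilter as above, is to enlarge the CIImage with an affine transform; if the result ever looks soft, the bitmap-context approach above is the safer choice.

// Alternative sketch: scale the tiny generator output instead of redrawing it.
CIImage *rawImage = qrFilter.outputImage;
CGFloat scale = 300.0 / CGRectGetWidth(rawImage.extent);
CIImage *scaledImage = [rawImage imageByApplyingTransform:CGAffineTransformMakeScale(scale, scale)];
self.imageView.image = [UIImage imageWithCIImage:scaledImage];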

Scan QR code:

 

 

//  Created by Han junqiang on 15/11/27.
//  Copyright (c) 2015 Han junqiang. All rights reserved.

#import "RootViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface RootViewController () <AVCaptureMetadataOutputObjectsDelegate, UIAlertViewDelegate>

// Frame drawn around the scanning area
@property (nonatomic, strong) UIView *scanRectView;
// Hardware device (the camera)
@property (strong, nonatomic) AVCaptureDevice *device;
// Input device
@property (strong, nonatomic) AVCaptureDeviceInput *input;
// Output device
@property (strong, nonatomic) AVCaptureMetadataOutput *output;
// Bridge connecting the input and output devices
@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *preview;

@end

@implementation RootViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    CGSize windowSize = [UIScreen mainScreen].bounds.size;
    CGSize scanSize = CGSizeMake(windowSize.width * 3 / 4, windowSize.width * 3 / 4);
    CGRect scanRect = CGRectMake((windowSize.width - scanSize.width) / 2,
                                 (windowSize.height - scanSize.height) / 2,
                                 scanSize.width, scanSize.height);
    // rectOfInterest is normalized and rotated: x/y and width/height are swapped
    scanRect = CGRectMake(scanRect.origin.y / windowSize.height,
                          scanRect.origin.x / windowSize.width,
                          scanRect.size.height / windowSize.height,
                          scanRect.size.width / windowSize.width);

    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
    self.output = [[AVCaptureMetadataOutput alloc] init];
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:([UIScreen mainScreen].bounds.size.height < 500) ?
                                    AVCaptureSessionPreset640x480 : AVCaptureSessionPresetHigh];
    [self.session addInput:self.input];
    [self.session addOutput:self.output];
    self.output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
    self.output.rectOfInterest = scanRect;

    self.preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.preview.frame = [UIScreen mainScreen].bounds;
    [self.view.layer insertSublayer:self.preview atIndex:0];

    // Red border marking the scanning area
    self.scanRectView = [UIView new];
    [self.view addSubview:self.scanRectView];
    self.scanRectView.frame = CGRectMake(0, 0, scanSize.width, scanSize.height);
    self.scanRectView.center = CGPointMake(CGRectGetMidX([UIScreen mainScreen].bounds),
                                           CGRectGetMidY([UIScreen mainScreen].bounds));
    self.scanRectView.layer.borderColor = [UIColor redColor].CGColor;
    self.scanRectView.layer.borderWidth = 1;

    // Start capturing
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    if (metadataObjects.count == 0) {
        return;
    }
    [self.session stopRunning];
    AVMetadataMachineReadableCodeObject *metadataObject = metadataObjects.firstObject;
    // Show the scanned string
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:metadataObject.stringValue
                                                    message:@""
                                                   delegate:self
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
}

- (void)alertView:(UIAlertView *)alertView willDismissWithButtonIndex:(NSInteger)buttonIndex {
    // Resume scanning after the alert is dismissed
    [self.session startRunning];
}

@end
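One practical note that is not part of the original demo: on devices where the user has not yet granted camera access, it is safer to request permission before starting the session. A minimal sketch, assuming the session set up in viewDidLoad above:

// Ask for camera permission, then start scanning only if it was granted.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            [self.session startRunning];
        } else {
            NSLog(@"Camera access was denied; scanning is unavailable.");
        }
    });
}];

(Newer systems also require an NSCameraUsageDescription entry in Info.plist before the camera can be used at all.)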

 

Final effect (a screenshot cannot show the scanning in action, so please test it on a device yourself):


 

Before iOS 7, developers generally relied on third-party libraries for code scanning, most commonly ZBarSDK. Since iOS 7, the system provides an interface for parsing QR codes through AVMetadataObject and the related AVFoundation classes. In my tests, scanning and decoding with the native API is very fast, noticeably faster than the third-party libraries.

 

1. Example

 

The official interface is very simple and the code is as follows:

@interface ViewController () <AVCaptureMetadataOutputObjectsDelegate>
// The delegate used to process the captured metadata
{
    AVCaptureSession *session;  // Bridge between input and output
}
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    // Obtain the camera device
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Create an input stream
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    // Create an output stream
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    // Set the delegate, delivering callbacks on the main queue
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    // Initialize the session object
    session = [[AVCaptureSession alloc] init];
    // High capture quality
    [session setSessionPreset:AVCaptureSessionPresetHigh];

    [session addInput:input];
    [session addOutput:output];
    // Set the supported code types (the settings below cover QR codes and common barcodes)
    output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode,
                                   AVMetadataObjectTypeEAN13Code,
                                   AVMetadataObjectTypeEAN8Code,
                                   AVMetadataObjectTypeCode128Code];

    AVCaptureVideoPreviewLayer *layer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    layer.frame = self.view.layer.bounds;
    [self.view.layer insertSublayer:layer atIndex:0];

    // Start capturing
    [session startRunning];
}

With that in place, the camera feed appears in the UI. As long as the delegate method is implemented, we can read QR codes:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    if (metadataObjects.count > 0) {
        // [session stopRunning];
        AVMetadataMachineReadableCodeObject *metadataObject = [metadataObjects objectAtIndex:0];
        // Output the scanned string
        NSLog(@"%@", metadataObject.stringValue);
    }
}
2. Some Optimizations

Testing the code above shows that the system decodes codes quite efficiently; the API Apple provides really is powerful. Still, we can optimize it further and improve efficiency:

First, the AVCaptureMetadataOutput class has the following property (available since iOS 7.0):

@property (nonatomic) CGRect rectOfInterest;

This property tells the system which area of the frame to pay attention to. Most QR scanning UIs draw a box reminding you to place the code inside a certain region; that is exactly what this property is for: it restricts processing to the part of the captured image that falls inside the given range. As you would expect, setting it can improve the efficiency of our code considerably. Note:

1. The CGRect does not use the usual coordinate values: each component ranges from 0 to 1 and represents a proportion of the frame.

2. Testing shows that x corresponds to the vertical distance from the upper-left corner, and y to the horizontal distance from the upper-left corner.

3. Width and height are handled the same way: they are proportions, and they are swapped as well.

For example, to restrict the scan processing area to the lower half of the screen:

output.rectOfInterest = CGRectMake(0.5, 0, 0.5, 1);
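If working out the swapped, normalized rect by hand feels error-prone, AVCaptureVideoPreviewLayer also offers metadataOutputRectOfInterestForRect: (available since iOS 6.0), which converts a rect given in the preview layer's own coordinate space into the space rectOfInterest expects. A sketch, assuming preview and output are the preview layer and metadata output from the scanning example earlier:

// Hypothetical on-screen scan box, expressed in the preview layer's coordinates.
CGRect scanBox = CGRectMake(20, 100, 280, 280);
// Let the preview layer perform the rotation and normalization for us.
output.rectOfInterest = [preview metadataOutputRectOfInterestForRect:scanBox];

It is commonly reported that this conversion only returns correct values once the session is running, so call it after -startRunning (or from an observer of AVCaptureInputPortFormatDescriptionDidChangeNotification).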

I am not sure why Apple designed it this way, or whether I am simply using the parameter incorrectly; if you know, please leave a comment and point me in the right direction.

