In an earlier interview I was asked why I used ZBar for scanning instead of the more efficient native API. At the time I had only just gotten scanning working and was not chasing efficiency, so I did not really know how to answer. Looking back, the best answer is: we needed to scan QR codes from photos in the album, and the native implementation cannot do that.
ZBar advantage: it can recognize codes from album photos, but it is somewhat less efficient.
Native: efficient, but it cannot scan codes from album photos.
The code is posted further down, with a small optimization. The main point is that the scanning setup for the scan view controller is placed in
-(void)viewDidAppear:(BOOL)animated, so that entering the page for the first time does not feel laggy. The page background is set to black on entry; once the scan session has started, the background is animated to white, which gives the user an animation to watch during the wait.
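Here is a minimal sketch of that timing trick in isolation (the same idea appears in the full code further down): the view starts out black, and viewDidAppear: fades it to white right before the relatively slow capture-session setup.

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    // The background was set to black in viewDidLoad; fading it to white here
    // covers the time spent configuring the capture session.
    [UIView animateWithDuration:0.3 animations:^{
        self.view.backgroundColor = [UIColor whiteColor];
    }];

    // ... configure the AVCaptureSession and preview layer here, then call startRunning.
}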
iOS development: when setting rectOfInterest does not seem to work
The next thing to cover is limiting the scan area.
First, the basics: the rectOfInterest property of the AVCaptureMetadataOutput object.

output.rectOfInterest = CGRectMake(100 / height, (width / 2 - 110) / width, 220 / height, 220 / width); // width and height are the width and height of the AVCaptureVideoPreviewLayer

In general:

CGRectMake(y / deviceHeight, x / deviceWidth, height / deviceHeight, width / deviceWidth);

A word about this CGRectMake: x and y, and width and height, are all swapped. If the scan frame you draw on screen has origin (x, y) and size width by height, then deviceWidth and deviceHeight are the width and height of the AVCaptureVideoPreviewLayer. I tested these numbers more than once before writing this. Many posts write the denominators as if width and height were equal, but the proportions here are not equal, because screen widths vary from phone to phone. Okay, let's go straight to the code; anything unclear in the explanation should make sense when read together with it.
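To make the mapping above easier to reuse, here is a small helper, purely illustrative and not part of the original code, that converts a rect expressed in the preview layer's coordinates into the normalized, axis-swapped rect that rectOfInterest expects:

// Hypothetical helper: frameInPreview is the scan frame in the preview layer's
// coordinate system, previewSize is the preview layer's size (deviceWidth x deviceHeight).
static CGRect scanRectOfInterest(CGRect frameInPreview, CGSize previewSize)
{
    CGFloat deviceWidth  = previewSize.width;
    CGFloat deviceHeight = previewSize.height;
    return CGRectMake(CGRectGetMinY(frameInPreview)   / deviceHeight,  // y / deviceHeight
                      CGRectGetMinX(frameInPreview)   / deviceWidth,   // x / deviceWidth
                      CGRectGetHeight(frameInPreview) / deviceHeight,  // height / deviceHeight
                      CGRectGetWidth(frameInPreview)  / deviceWidth);  // width / deviceWidth
}

// Usage, matching the numbers in the code below (a 220 x 220 frame, 100 pt from the top):
// output.rectOfInterest = scanRectOfInterest(CGRectMake(width / 2 - 110, 100, 220, 220),
//                                            CGSizeMake(width, height));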
.h

#import <UIKit/UIKit.h>

@interface ScanCodeViewController : UIViewController

@property (nonatomic, strong) NSString *frameWhere;
// Block through which the scanned string is handed back to the presenting page
@property (nonatomic, strong) void (^getSysString)(NSString *);

@end
.m

#import "ScanCodeViewController.h"
#import <AVFoundation/AVFoundation.h>
// Controller for entering the device ID by hand (header name assumed to match the class)
#import "AddDeviceByHandViewController.h"

#define widthMainControl  [UIScreen mainScreen].bounds.size.width
#define heightMainControl [UIScreen mainScreen].bounds.size.height
#define COLORMAINBG [UIColor colorWithRed:253/255.0 green:199/255.0 blue:117/255.0 alpha:1]

@interface ScanCodeViewController () <AVCaptureMetadataOutputObjectsDelegate>
{
    AVCaptureSession *session; // bridge between the capture input and output
    int startY;
    int topHeight;
    BOOL onceScan;
}

@property (strong, nonatomic) UIImageView *lineImg;
@property (nonatomic, strong) NSTimer *scanLineTimer;

@end

@implementation ScanCodeViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.title = @"Scan";
    // Black background on entry; it is animated to white once scanning starts (see viewDidAppear:)
    self.view.backgroundColor = [UIColor blackColor];
    onceScan = YES;
    // Adjust the top offset of the scan frame for 3.5" and 4" screens
    int screenHeight = (int)heightMainControl;
    topHeight = 100;
    if (screenHeight == 480) {
        topHeight = 50;
    } else if (screenHeight == 568) {
        topHeight = 60;
    }
}
- (void)makeUI
{
    self.view.backgroundColor = [UIColor whiteColor];

    // Scan frame image: 220 x 220, horizontally centered, topHeight from the top
    UIImageView *kaungImg = [[UIImageView alloc] initWithFrame:CGRectMake(self.view.frame.size.width / 2 - 110, topHeight, 220, 220)];
    kaungImg.image = [UIImage imageNamed:@"sys-k"];
    [self.view addSubview:kaungImg];

    // The moving scan line
    self.lineImg = [[UIImageView alloc] initWithFrame:CGRectMake(self.view.frame.size.width / 2 - 100, topHeight + 10, 200, 7)];
    self.lineImg.image = [UIImage imageNamed:@"sys-ht"];
    [self.view addSubview:self.lineImg];

    // Four translucent masks covering everything outside the 220 x 220 scan frame
    // (the exact frames below are reconstructed; the original listing was garbled)
    for (int i = 0; i < 4; i++) {
        UIView *maskView = [[UIView alloc] initWithFrame:CGRectZero];
        if (i == 0) {          // left of the scan frame
            maskView.frame = CGRectMake(0, 0, widthMainControl / 2 - 110, heightMainControl);
        } else if (i == 1) {   // above the scan frame
            maskView.frame = CGRectMake(widthMainControl / 2 - 110, 0, 220, topHeight);
        } else if (i == 2) {   // right of the scan frame
            maskView.frame = CGRectMake(widthMainControl / 2 + 110, 0, widthMainControl / 2 - 110, heightMainControl);
        } else if (i == 3) {   // below the scan frame
            maskView.frame = CGRectMake(widthMainControl / 2 - 110, topHeight + 220, 220, heightMainControl - topHeight - 220);
        }
        maskView.backgroundColor = [UIColor blackColor];
        maskView.alpha = 0.4;
        [self.view addSubview:maskView];
    }

    // Button for entering the device ID manually
    UIButton *hangBtn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [hangBtn setTitle:@"Cancel" forState:UIControlStateNormal];
    hangBtn.frame = CGRectMake(self.view.frame.size.width / 2 - 110, topHeight + 270, 220, 40);
    [hangBtn addTarget:self action:@selector(pushToNextVC) forControlEvents:UIControlEventTouchUpInside];
    hangBtn.layer.cornerRadius = 5;
    [hangBtn setTitleColor:[UIColor whiteColor] forState:UIControlStateNormal];
    hangBtn.layer.masksToBounds = YES;
    hangBtn.backgroundColor = COLORMAINBG;
    [self.view addSubview:hangBtn];

    [self createTimer];
}
- (void)pushToNextVC
{
    [self dismissViewControllerAnimated:YES completion:nil];
    AddDeviceByHandViewController *addDevice = [[AddDeviceByHandViewController alloc] init];
    [self.navigationController pushViewController:addDevice animated:YES];
}
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    // Animate the background from black to white to cover the session start-up delay
    [UIView animateWithDuration:0.3 animations:^{
        self.view.backgroundColor = [UIColor whiteColor];
    }];

    // Get the camera device
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Create the input stream
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    // Create the output stream
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    // Set the delegate and deliver callbacks on the main queue
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    float width  = self.view.frame.size.width;   // width of the AVCaptureVideoPreviewLayer
    float height = self.view.frame.size.height;  // height of the AVCaptureVideoPreviewLayer
    // Limit the scan area: CGRectMake(y/deviceHeight, x/deviceWidth, height/deviceHeight, width/deviceWidth)
    output.rectOfInterest = CGRectMake(100 / height, (width / 2 - 110) / width, 220 / height, 220 / width);

    // Initialize the session that links input and output
    session = [[AVCaptureSession alloc] init];
    // High-quality capture rate
    [session setSessionPreset:AVCaptureSessionPresetHigh];
    [session addInput:input];
    [session addOutput:output];
    // Symbologies to recognize (QR code plus common barcodes)
    output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode,
                                   AVMetadataObjectTypeEAN13Code,
                                   AVMetadataObjectTypeEAN8Code,
                                   AVMetadataObjectTypeCode128Code];

    AVCaptureVideoPreviewLayer *layer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    layer.frame = self.view.layer.bounds;
    [self.view.layer insertSublayer:layer atIndex:0];
    // Start capturing
    [session startRunning];

    [self makeUI];
}
// Timer interval (s) between steps of the scan line as it moves from top to bottom
#define LINE_SCAN_TIME 0.01

- (void)createTimer {
    startY = 110;
    self.scanLineTimer = [NSTimer scheduledTimerWithTimeInterval:LINE_SCAN_TIME
                                                          target:self
                                                        selector:@selector(moveScanImg)
                                                        userInfo:nil
                                                         repeats:YES];
}
- (void)moveScanImg
{
    int startX = self.view.frame.size.width / 2 - 100;
    startY += 1;
    if (startY > topHeight + 210) {
        startY = topHeight + 10;
    }
    self.lineImg.frame = CGRectMake(startX, startY, 200, 7);
}
- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    [self.scanLineTimer invalidate];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    if (metadataObjects.count > 0) {
        [session stopRunning];
        AVMetadataMachineReadableCodeObject *metadataObject = [metadataObjects objectAtIndex:0];
        // Log the scanned string
        NSLog(@"result: %@", metadataObject.stringValue);
        // Hand the result back to the presenting page through the block, then dismiss
        [self dismissViewControllerAnimated:YES completion:^{
            self.getSysString(metadataObject.stringValue);
        }];
    }
}

@end
In the page that navigates to this scanner, the scanned data is received through the block:

// Creating and presenting the scanner modally is an assumption for illustration;
// the scanner dismisses itself once a code has been read.
ScanCodeViewController *pushScan = [[ScanCodeViewController alloc] init];
pushScan.getSysString = ^(NSString *str) {
    startConnectStr = str; // startConnectStr is a property of the presenting page in the original code
    if (str) {
        // str is the scanned result; do whatever post-scan work you need here
    }
};
[self presentViewController:pushScan animated:YES completion:nil];
That is the whole article. Thanks for reading.