App development process: image processing tools
First, here is a list of the methods provided by the tool class:
/** Obtain a blurred view based on the original view and blur style; it is automatically added as a subview of the original view (if you do not want it as a subview, call removeFromSuperview yourself) */
+ (UIView *)getBlurEffectViewWithOriginalView:(UIView *)originalView style:(ImageHelperBlurEffectStyle)style;

/** Obtain a new image based on the original image and blur style */
+ (UIImage *)getBlurEffectImageWithOriginalImage:(UIImage *)originalImage style:(ImageHelperBlurEffectStyle)style;

/** Obtain a new image by scaling the original image proportionally by a scale factor */
+ (UIImage *)getImageWithOriginalImage:(UIImage *)originalImage scale:(CGFloat)scale;

/** Obtain a new image by scaling the original image proportionally to fit within a maximum size */
+ (UIImage *)getImageWithOriginalImage:(UIImage *)originalImage scaleMaxSize:(CGSize)scaleMaxSize;

/** Obtain the new size produced by scaling the original image proportionally to fit within a maximum size */
+ (CGSize)getImageSizeWithOriginalImage:(UIImage *)originalImage scaleMaxSize:(CGSize)scaleMaxSize;

/** Obtain a new image by scaling the original image to completely fill the given size */
+ (UIImage *)getImageWithOriginalImage:(UIImage *)originalImage fillSize:(CGSize)fillSize;

/** Obtain a new image by cropping the given area from the original image */
+ (UIImage *)getImageWithOriginalImage:(UIImage *)originalImage cutFrame:(CGRect)cutFrame;

/** Obtain a solid-color image of unit size (1 x 1) from the given color */
+ (UIImage *)getImageWithColor:(UIColor *)color;

/** Obtain a snapshot of the given view */
+ (UIImage *)getSnapshotWithView:(UIView *)view;

/** Obtain a full-screen snapshot (excluding the status bar) */
+ (UIImage *)getFullScreenSnapshot;
Note:
1. Obtaining a frosted glass (Gaussian blur) effect image
Starting with iOS 8, the UIBlurEffect and UIVisualEffectView classes make it easy to generate a Gaussian blur view; you only need to add it as a subview of the target view to see the effect. On iOS 7 you have to implement the effect yourself, but Apple provided a UIImage+ImageEffects category at WWDC 2013 that generates Gaussian blur images. Add that category to the project's Categories directory and reference it in ImageHelper. Because UIBlurEffectStyle was only introduced in iOS 8, a custom enumeration type ImageHelperBlurEffectStyle is defined so that the same API is also available on iOS 7. The implementation code is as follows:
+ (UIView *)getBlurEffectViewWithOriginalView:(UIView *)originalView style:(ImageHelperBlurEffectStyle)style
{
    if (DeviceIOSVersionAbove(8)) {
        // iOS 8+: map the custom style to UIBlurEffectStyle and use UIVisualEffectView.
        UIBlurEffectStyle blurStyle;
        switch (style) {
            case ImageHelperBlurEffectStyleExtraLight: {
                blurStyle = UIBlurEffectStyleExtraLight;
                break;
            }
            case ImageHelperBlurEffectStyleLight: {
                blurStyle = UIBlurEffectStyleLight;
                break;
            }
            case ImageHelperBlurEffectStyleDark: {
                blurStyle = UIBlurEffectStyleDark;
                break;
            }
        }

        UIBlurEffect *effect = [UIBlurEffect effectWithStyle:blurStyle];
        UIVisualEffectView *effectView = [[UIVisualEffectView alloc] initWithEffect:effect];
        effectView.frame = originalView.bounds;
        [originalView addSubview:effectView];
        return effectView;
    } else {
        // iOS 7: snapshot the view, blur the snapshot, and show it in a UIImageView.
        UIImage *originalImage = [self getSnapshotWithView:originalView];
        UIImage *blurImage = [self getBlurEffectImageWithOriginalImage:originalImage style:style];
        UIImageView *effectView = [[UIImageView alloc] initWithFrame:originalView.bounds];
        [effectView setImage:blurImage];
        [originalView addSubview:effectView];
        return effectView;
    }
}

+ (UIImage *)getBlurEffectImageWithOriginalImage:(UIImage *)originalImage style:(ImageHelperBlurEffectStyle)style
{
    UIImage *newImage;
    switch (style) {
        case ImageHelperBlurEffectStyleExtraLight: {
            newImage = [originalImage applyExtraLightEffect];
            break;
        }
        case ImageHelperBlurEffectStyleLight: {
            newImage = [originalImage applyLightEffect];
            break;
        }
        case ImageHelperBlurEffectStyleDark: {
            newImage = [originalImage applyDarkEffect];
            break;
        }
    }
    return newImage;
}
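The snippet above relies on two definitions that are not shown here: the ImageHelperBlurEffectStyle enumeration and the DeviceIOSVersionAbove() macro. Below is a minimal sketch, assuming the constant names used in the switch statements; the actual definitions in the project may differ.

// Sketch of the custom blur style enumeration referenced above; the constant
// names are taken from the switch statements, but the real header may differ.
typedef NS_ENUM(NSInteger, ImageHelperBlurEffectStyle) {
    ImageHelperBlurEffectStyleExtraLight,
    ImageHelperBlurEffectStyleLight,
    ImageHelperBlurEffectStyleDark
};

// One possible definition of the DeviceIOSVersionAbove() macro; this is an
// assumption, and the project's own macro may be written differently.
#define DeviceIOSVersionAbove(v) \
    ([[[UIDevice currentDevice] systemVersion] floatValue] >= (v))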
2. A series of methods for proportional scaling and cropping of images. The basic idea is to specify the drawing size (the size of the bitmap) for the current image context, draw the image into the appropriate rectangle, and then read the final image back from the context. Sample code for cropping an image:
+ (UIImage *)getImageWithOriginalImage:(UIImage *)originalImage cutFrame:(CGRect)cutFrame
{
    CGSize newSize = cutFrame.size;
    UIGraphicsBeginImageContext(newSize);
    // Draw the original image at its natural size, offset by the negative crop
    // origin, so that only the cutFrame region falls inside the new bitmap.
    [originalImage drawInRect:CGRectMake(-cutFrame.origin.x,
                                         -cutFrame.origin.y,
                                         originalImage.size.width,
                                         originalImage.size.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
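The proportional-scaling methods declared in the interface follow the same pattern. As an illustration only, here is a sketch of the scaleMaxSize variant; it is consistent with the {90, 90} result in the test output further down, but the actual ImageHelper implementation may differ in detail.

// Sketch: compute the size that fits the original image inside scaleMaxSize
// while preserving its aspect ratio.
+ (CGSize)getImageSizeWithOriginalImage:(UIImage *)originalImage scaleMaxSize:(CGSize)scaleMaxSize
{
    CGSize originalSize = originalImage.size;
    CGFloat ratio = MIN(scaleMaxSize.width / originalSize.width,
                        scaleMaxSize.height / originalSize.height);
    return CGSizeMake(originalSize.width * ratio, originalSize.height * ratio);
}

// Sketch: draw the original image into a context of the scaled size.
+ (UIImage *)getImageWithOriginalImage:(UIImage *)originalImage scaleMaxSize:(CGSize)scaleMaxSize
{
    CGSize newSize = [self getImageSizeWithOriginalImage:originalImage scaleMaxSize:scaleMaxSize];
    UIGraphicsBeginImageContext(newSize);
    [originalImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}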
3. Screenshot methods
The APIs used are:
UIGraphicsBeginImageContextWithOptions(CGSize size, BOOL opaque, CGFloat scale): the first parameter specifies the drawing size of the image context, the second specifies whether the image is opaque, and the third is the scale factor; passing 0.0 uses the scale factor of the device's main screen.
CALayer's renderInContext:(CGContextRef)ctx method renders the layer tree into the given context, typically the current image context. The final image is then read back from that context.
+ (UIImage *)getSnapshotWithView:(UIView *)view
{
    // Passing 0.0 as the scale uses the scale factor of the device's main screen.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, YES, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

+ (UIImage *)getFullScreenSnapshot
{
    return [self getSnapshotWithView:[UIApplication sharedApplication].keyWindow];
}
The full-screen snapshot does not include the status bar: the status bar is not part of the application's key window, so it cannot be captured directly. It is possible to capture a system screenshot that includes it by using private APIs.
Test code:
UIImage *icon = LOADIMAGE(AppIcon);
UIImage *testImg;

testImg = [ImageHelper getImageWithOriginalImage:icon scale:2];
LOG(@"%@", testImg);

testImg = [ImageHelper getImageWithOriginalImage:icon scaleMaxSize:CGSizeMake(100, 90)];
LOG(@"%@", testImg);

testImg = [ImageHelper getImageWithOriginalImage:icon fillSize:CGSizeMake(100, 90)];
LOG(@"%@", testImg);

testImg = [ImageHelper getImageWithOriginalImage:icon cutFrame:CGRectMake(10, 10, 50, 50)];
LOG(@"%@", testImg);

testImg = [ImageHelper getImageWithColor:COLOR(255, 120, 100)];
LOG(@"%@", testImg);

testImg = [ImageHelper getSnapshotWithView:self.view];
LOG(@"%@", testImg);

testImg = [ImageHelper getFullScreenSnapshot];
LOG(@"%@", testImg);

testImg = [ImageHelper getBlurEffectImageWithOriginalImage:testImg style:ImageHelperBlurEffectStyleDark];
LOG(@"%@", testImg);

UIView *coverView = [ImageHelper getBlurEffectViewWithOriginalView:[UIApplication sharedApplication].keyWindow style:ImageHelperBlurEffectStyleDark];
2016-09-13 19:05:11.995 base[33087:2301853] <UIImage: 0x7ffaf97e8d00>, {120, 120}
2016-09-13 19:05:11.997 base[33087:2301853] <UIImage: 0x7ffaf97e9610>, {90, 90}
2016-09-13 19:05:11.999 base[33087:2301853] <UIImage: 0x7ffaf950a330>, {100, 90}
2016-09-13 19:05:12.001 base[33087:2301853] <UIImage: 0x7ffaf9463630>, {50, 50}
2016-09-13 19:05:12.002 base[33087:2301853] <UIImage: 0x7ffaf950a330>, {1, 1}
2016-09-13 19:05:12.007 base[33087:2301853] <UIImage: 0x7ffaf96004b0>, {375, 667}
2016-09-13 19:05:12.013 base[33087:2301853] <UIImage: 0x7ffaf950a330>, {375, 667}
2016-09-13 19:05:12.040 base[33087:2301853] <UIImage: 0x7ffaf9506f30>, {375, 667}
1. You can step through the code in the debugger to inspect the content of each testImg image.
2. After [ImageHelper getSnapshotWithView:self.view] generates a snapshot of self.view, its size is {375, 667}. However, if you add the following line to the viewDidLoad method:
self.edgesForExtendedLayout = UIRectEdgeNone;
then the snapshot size becomes {375, 603}. This is because the default value of edgesForExtendedLayout is UIRectEdgeAll, so the view normally extends under the bars; this is worth paying attention to when laying out the UI. It will be mentioned again in a later post about the UIViewController base class.
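For clarity, here is a minimal sketch of the change described above (the {375, 603} figure assumes a 64-point status bar plus navigation bar on a 375 x 667-point screen):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // With the default UIRectEdgeAll, self.view extends under the bars and its
    // snapshot measures {375, 667}; with UIRectEdgeNone it starts below the
    // navigation bar and the snapshot measures {375, 603}.
    self.edgesForExtendedLayout = UIRectEdgeNone;
}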
Base project updated: git@github.com:ALongWay/base.git