When working on a project, we often need to process images to optimize performance or to meet product requirements. The following are some common scenarios:
Stretch Image
If an image asset used in the project can be produced by stretching a smaller image, do so whenever possible. This has two obvious advantages: it reduces the size of the App installation package, and it reduces the memory used while the App is running. After all, an App's UI is largely built from a large number of carefully crafted images; if every one of them were shipped at the same size as the screen, both performance and install size would suffer.
For stretching images, the following method is available on iOS 5 and later:
- (UIImage *)resizableImageWithCapInsets:(UIEdgeInsets)capInsets
This method takes a single UIEdgeInsets parameter; its top, left, bottom, and right values specify the fixed cap areas at the top, left, bottom, and right edges of the image, and only the region inside the caps is stretched. The cap values are measured in points (pt): on a standard display 1 pt = 1 pixel, and on a Retina display 1 pt = 2 pixels. Note that if a cap value is not an integer, white lines may appear in the stretched image.
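For example, to stretch a chat-bubble background while keeping its corners intact, a minimal sketch could look like the following (the image name "bubble" and the 20 pt insets are only illustrative):

// Assumes a "bubble" image in the app bundle; the inset values are examples only.
UIImage *bubble = [UIImage imageNamed:@"bubble"];
// Keep 20 pt on each edge fixed; only the center region is stretched.
UIImage *stretchable = [bubble resizableImageWithCapInsets:UIEdgeInsetsMake(20, 20, 20, 20)];
UIImageView *bubbleView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 200, 80)];
bubbleView.image = stretchable;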
Create a thumbnail
If you have a large image but only want to display a thumbnail of a specified size, you can add the following method to a UIImage category and call it to create the thumbnail:
- (UIImage *)imageByScalingAndCroppingForSize:(CGSize)targetSize {
    UIImage *sourceImage = self;
    UIImage *newImage = nil;
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);

    if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;

        if (widthFactor > heightFactor) {
            scaleFactor = widthFactor; // scale so the width fits; the height will be cropped
        } else {
            scaleFactor = heightFactor; // scale so the height fits; the width will be cropped
        }

        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;

        // center the image
        if (widthFactor > heightFactor) {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        } else if (widthFactor < heightFactor) {
            thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        }
    }

    UIGraphicsBeginImageContext(targetSize); // this will crop

    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;

    [sourceImage drawInRect:thumbnailRect];

    newImage = UIGraphicsGetImageFromCurrentImageContext();
    if (newImage == nil) {
        NSLog(@"could not scale image");
    }

    // pop the context to get back to the default
    UIGraphicsEndImageContext();
    return newImage;
}
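A call to this category method is then straightforward; for instance, to produce a 100 × 100 pt thumbnail (the variable names here are only illustrative):

// `photo` is assumed to be an existing full-size UIImage.
UIImage *thumbnail = [photo imageByScalingAndCroppingForSize:CGSizeMake(100.0, 100.0)];
thumbnailImageView.image = thumbnail;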
Solve image rotation problems
In an iOS app, the user can take a photo with the system camera or pick one from the album. If such an image is uploaded directly, it may appear rotated when viewed on non-Mac systems, because we never applied the image's orientation metadata (imageOrientation) to the actual pixel data. You can solve this problem with the following category:
@interface UIImage (fixOrientation)

- (UIImage *)fixOrientation;

@end

@implementation UIImage (fixOrientation)

- (UIImage *)fixOrientation {
    // No-op if the orientation is already correct
    if (self.imageOrientation == UIImageOrientationUp) return self;

    // We need to calculate the proper transformation to make the image upright.
    // We do it in 2 steps: Rotate if Left/Right/Down, and then flip if Mirrored.
    CGAffineTransform transform = CGAffineTransformIdentity;

    switch (self.imageOrientation) {
        case UIImageOrientationDown:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.width, self.size.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;

        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.width, 0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;

        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, 0, self.size.height);
            transform = CGAffineTransformRotate(transform, -M_PI_2);
            break;

        default:
            break;
    }

    switch (self.imageOrientation) {
        case UIImageOrientationUpMirrored:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.width, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;

        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.height, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;

        default:
            break;
    }

    // Now we draw the underlying CGImage into a new context, applying the transform
    // calculated above.
    CGContextRef ctx = CGBitmapContextCreate(NULL, self.size.width, self.size.height,
                                             CGImageGetBitsPerComponent(self.CGImage), 0,
                                             CGImageGetColorSpace(self.CGImage),
                                             CGImageGetBitmapInfo(self.CGImage));
    CGContextConcatCTM(ctx, transform);
    switch (self.imageOrientation) {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            // The image is rotated by 90 degrees, so swap width and height when drawing
            CGContextDrawImage(ctx, CGRectMake(0, 0, self.size.height, self.size.width), self.CGImage);
            break;

        default:
            CGContextDrawImage(ctx, CGRectMake(0, 0, self.size.width, self.size.height), self.CGImage);
            break;
    }

    // And now we just create a new UIImage from the drawing context
    CGImageRef cgimg = CGBitmapContextCreateImage(ctx);
    UIImage *img = [UIImage imageWithCGImage:cgimg];
    CGContextRelease(ctx);
    CGImageRelease(cgimg);
    return img;
}

@end
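A typical place to call this is the image picker delegate, before displaying or uploading the picked photo. The sketch below is only illustrative; it assumes the fixOrientation category above has been imported:

// Hypothetical UIImagePickerControllerDelegate callback.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *picked = info[UIImagePickerControllerOriginalImage];
    UIImage *upright = [picked fixOrientation]; // bake the orientation into the pixel data
    // ... display or upload `upright` here ...
    [picker dismissViewControllerAnimated:YES completion:nil];
}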
Image Encoding and uploading
Sometimes we need to upload image data to the server as a string. If you convert a UIImage object to NSData and then directly to NSString, the resulting string will contain garbled characters, because the raw image bytes are not valid text. To avoid this, encode the NSData (for example with Base64) before converting it to NSString.
# Import "UIImage + Ext. h "# import" GTMBase64.h "@ interface UIImage (Ext)-(NSString *) convertToString; @ end @ implementation UIImage (Ext)-(NSString *) convertToString {if (! Self) {return nil;} NSData * imgData = UIImageJPEGRepresentation (self, 0.5); NSData * encode = [GTMBase64 encodeData: imgData]; // base64 encoded NSData (solves the garbled problem) NSString * imgStr = [[NSString alloc] initWithData: encode encoding: NSUTF8StringEncoding]; return imgStr;} @ end
Write images to disk
To save an image to the local disk, first convert the image object to an NSData object (for example with UIImagePNGRepresentation or UIImageJPEGRepresentation), then call the writeToFile:atomically: method on the data:
- (BOOL)writeToFile:(NSString *)path atomically:(BOOL)useAuxiliaryFile;
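A minimal sketch, assuming an existing UIImage named `image` and an illustrative file name "avatar.png" in the Documents directory, might look like this:

// `image` is assumed to be an existing UIImage; the file name is only an example.
NSData *imageData = UIImagePNGRepresentation(image); // or UIImageJPEGRepresentation(image, 0.8)
NSString *documentsDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
NSString *filePath = [documentsDir stringByAppendingPathComponent:@"avatar.png"];
BOOL success = [imageData writeToFile:filePath atomically:YES];
if (!success) {
    NSLog(@"could not write image to %@", filePath);
}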