"Reprinted from Http://www.cocoachina.com/ios/20141223/10731.html"
Although iOS has long supported blur effects in image processing, translucent blur has become ubiquitous, especially since iOS 7. This year's newest release, iOS 8, continues the same design, and even OS X 10.10 Yosemite has begun to use translucent blur extensively.
In iOS development we have many options for achieving a translucent blur effect; below are some common approaches and tools.
0. Core Image
As a leader in design and user experience, Apple naturally ships strong support for image effects and image processing: the Core Image API has been available on iOS since version 5.0, and it lives in CoreImage.framework.
On both iOS and OS X, Core Image provides a large number of filters, which are one of the library's core features. According to the official documentation, there are more than 120 filters on OS X and more than 90 on iOS.
Here is sample code for a Gaussian blur with Core Image:
// Create a Core Image context and load the source image.
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *image = [CIImage imageWithContentsOfURL:imageURL];
// Filters are looked up by string name; here, the Gaussian blur filter.
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:image forKey:kCIInputImageKey];
[filter setValue:@2.0f forKey:@"inputRadius"];
// Render the result back into a UIImage.
CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGImageRef outImage = [context createCGImage:result fromRect:[result extent]];
UIImage *blurImage = [UIImage imageWithCGImage:outImage];
CGImageRelease(outImage); // createCGImage: returns a +1 reference
As you can see, Core Image is quite flexible: filters are created by string name, for example the Gaussian blur filter is "CIGaussianBlur". Apple's documentation has the official list of filter names.
Besides the many filters, Core Image also provides classes such as CIDetector, which supports face detection, and Core Image support on OS X goes further still.
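As a side note, here is a minimal sketch of face detection with CIDetector; imageURL is assumed to point at a photo:

CIImage *photo = [CIImage imageWithContentsOfURL:imageURL];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
for (CIFaceFeature *face in [detector featuresInImage:photo]) {
    // Each feature carries the bounding rect of a detected face.
    NSLog(@"Face found at %@", NSStringFromCGRect(face.bounds));
}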
1. GPUImage
Beyond Apple's official offerings, third parties also provide image-processing tools in this area. A developer named Brad Larson wrote an open source library called GPUImage, which likewise provides a large number of filters.
To do the same Gaussian blur with GPUImage:
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 2.0;
UIImage *image = [UIImage imageNamed:@"xxx"];
UIImage *blurredImage = [blurFilter imageByFilteringImage:image];
At first glance, at least, the code is much simpler than with Core Image.
2. vImage
Having covered Core Image and GPUImage, in many cases that would already be enough. But let's look at one more option: vImage. vImage is another Apple-provided library, part of Accelerate.framework.
The Accelerate framework is essentially a library of vector and matrix operations for digital signal processing and image processing. We can regard an image as vector or matrix data, and since Accelerate provides an efficient API for exactly this kind of math, it is natural to use it for all sorts of image processing.
With vImage we can implement the blur directly from image-processing principles, or use an existing tool. UIImage+ImageEffects is a very good image-processing library built on vImage; as the name suggests, it is a category extension on UIImage, and it is widely used. See the sketch below.
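For illustration, here is a minimal sketch of a blur done directly with vImage, under some assumptions: the helper name ApplyVImageBlur is my own, the pixel-format handling is simplified, and a single box convolution only approximates a Gaussian (UIImage+ImageEffects runs several passes to get closer):

#import <UIKit/UIKit.h>
#import <Accelerate/Accelerate.h>

// A minimal sketch, not production code: box-blur an image with vImage.
static UIImage *ApplyVImageBlur(UIImage *image, uint32_t boxSize) {
    if (boxSize % 2 == 0) boxSize += 1; // vImage requires an odd kernel size
    CGRect rect = (CGRect){CGPointZero, image.size};

    // Render the source image into a bitmap context so we can read its pixels.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:rect];
    CGContextRef inContext = UIGraphicsGetCurrentContext();
    vImage_Buffer inBuffer = {
        .data     = CGBitmapContextGetData(inContext),
        .width    = CGBitmapContextGetWidth(inContext),
        .height   = CGBitmapContextGetHeight(inContext),
        .rowBytes = CGBitmapContextGetBytesPerRow(inContext)
    };

    // A second context receives the blurred output.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    CGContextRef outContext = UIGraphicsGetCurrentContext();
    vImage_Buffer outBuffer = {
        .data     = CGBitmapContextGetData(outContext),
        .width    = CGBitmapContextGetWidth(outContext),
        .height   = CGBitmapContextGetHeight(outContext),
        .rowBytes = CGBitmapContextGetBytesPerRow(outContext)
    };

    // Box-convolve each channel; kvImageEdgeExtend handles the borders.
    vImageBoxConvolve_ARGB8888(&inBuffer, &outBuffer, NULL, 0, 0,
                               boxSize, boxSize, NULL, kvImageEdgeExtend);

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext(); // pop the output context
    UIGraphicsEndImageContext(); // pop the input context
    return result;
}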
3. Performance and Selection
Now that we know three ways to achieve translucent blur, which one should we choose? That is the question.
In terms of supported system versions they are all similar, each working from around iOS 4 or iOS 5; for developers in the iOS 8 era, that compatibility is more than enough.
Core Image is Apple's own image-processing library, which is a point in its favor: if Apple optimizes the processing in a future release, our code benefits automatically. Its main drawback is that it is more cumbersome to use, and you need to know each filter's name.
GPUImage comes from a third party, but it is easier to use, and in many scenarios it is a better choice than Core Image.
Blurring an image is a computationally heavy operation, so the final decision often comes down to performance. On that point, I prefer vImage.
In the iOS app I developed I chose vImage, and the starting point was performance, though I cannot claim a very precise benchmark. When debugging on several mainstream devices, including the iPhone 5c and 5s, with a blur radius around 10 and animation running, vImage's processing time was noticeably shorter and the UI did not stutter.
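If you want to sanity-check this on your own devices, here is a rough way to time a single blur call (ApplyVImageBlur is the hypothetical helper sketched above; swap in whichever implementation you are testing):

#import <QuartzCore/QuartzCore.h>

// Crude wall-clock timing of one blur pass.
CFTimeInterval start = CACurrentMediaTime();
UIImage *blurred = ApplyVImageBlur(sourceImage, 10);
CFTimeInterval elapsed = CACurrentMediaTime() - start;
NSLog(@"Blurred %@ in %.1f ms", blurred, elapsed * 1000.0);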
The above is my roundup of ways to implement a translucent blur effect on iOS.
"Reprint" iOS development uses the translucent blur effect method to organize