Core Image and Filter Effects
Reference: http://blog.sina.com.cn/s/blog_92ac2c5b0101cm5b.html
Thanks to Exit East Gate.
For iOS developers, the word "Core" should be familiar. It usually marks a framework that is low level, or that comes straight from Apple, or both.
Core Image lets you easily apply filters to images, for example to adjust saturation, brightness, or contrast (very friendly territory for anyone who knows Photoshop).
It uses the GPU (or the CPU, depending on how the context is configured) to process image data and video frames very quickly, even in real time. Multiple Core Image filters can also be chained together to produce a combined effect in a single pass.
Only a subset of the Core Image filters available on the Mac can be used on iOS. However, as new filters become available, the API can be used to discover filters and their attributes at runtime.
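As a taste of what chaining looks like, one filter's outputImage can be fed in as the next filter's input image. This is only a sketch: the file name "image.png", the choice of CISepiaTone and CIVignette, and the parameter values are illustrative, not from the original code.

```objc
// Sketch: feed one filter's output into another filter's input.
NSString *path = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
CIImage *source = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:path]];

CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:source forKey:kCIInputImageKey];
[sepia setValue:@0.8 forKey:@"inputIntensity"];

// Chain: the sepia output becomes the vignette's input image.
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setValue:[sepia outputImage] forKey:kCIInputImageKey];
[vignette setValue:@1.0 forKey:@"inputIntensity"];
[vignette setValue:@2.0 forKey:@"inputRadius"];

// Both effects are applied when this image is finally rendered.
CIImage *combined = [vignette outputImage];
```

Because filters are lazy, nothing is actually computed until the combined image is rendered, which is what lets Core Image fuse the chain into one pass.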
We can use the following code to print the supported filters:
NSArray *filters = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSLog(@"%@", filters);
NSLog(@"A total of %lu CIFilter filter effects", (unsigned long)[filters count]);
My output is: 127
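The same discovery idea works per filter: you can ask any CIFilter for its attributes dictionary to see which input keys it supports and what ranges they accept. A small sketch (CISepiaTone is just the example filter used elsewhere in this article):

```objc
// Inspect a filter's attributes at runtime to discover its parameters.
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];

// The attributes dictionary describes each input key, including
// its type, default value, and minimum/maximum where applicable.
NSLog(@"CISepiaTone attributes: %@", [sepia attributes]);

// inputKeys lists just the parameter names the filter accepts,
// e.g. inputImage and inputIntensity for CISepiaTone.
NSLog(@"Input keys: %@", [sepia inputKeys]);
```

This is handy when a newly added filter has no written documentation yet: the attributes dictionary tells you how to drive it.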
CoreImage Overview
Before we start, let's talk about the most important classes in the Core Image framework:
CIContext
All image processing is done within a CIContext, which is similar to a Core Graphics or OpenGL context.
CIImage
This class stores image data. It can be created from a UIImage, an image file, or raw pixel data.
CIFilter
The filter class holds a dictionary that defines the attributes of the particular filter it represents. There are many kinds of filters, such as brightness filters, color-inversion filters, and cropping filters.
Basic Image Filter
As a first attempt, we'll simply run an image through a CIFilter and display the result on screen. Applying a CIFilter always involves the same four steps:
// 1 - Build a URL pointing at the image file
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
NSURL *fileNameAndPath = [NSURL fileURLWithPath:filePath];

// 2 - Create the CIImage
CIImage *beginImage = [CIImage imageWithContentsOfURL:fileNameAndPath];

// 3 - Create the filter and ask it for its output image
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, beginImage,
                                            @"inputIntensity", @0.8, nil];
CIImage *outputImage = [filter outputImage];

// 4 - Convert the result to a UIImage and display it
UIImage *newImage = [UIImage imageWithCIImage:outputImage];
self.imageView.image = newImage;
Let's take a look at what the code has done.
Source image:
Put it in context
Before moving on, there is a practical optimization worth knowing about. As mentioned earlier, a CIFilter needs a CIContext, yet the example above never creates one. That's because the UIImage method we called (imageWithCIImage:) handled this step for us automatically: it creates a CIContext and uses it to perform the filtering. This makes the Core Image API very easy to call.
The major problem, however, is that a new CIContext is created on every call, when a CIContext should be reused for better performance and efficiency. For example, if you use a slider to choose a filter parameter, a fresh CIContext would be created every time the parameter changes, which makes performance very poor.
Let's find a good way to solve this problem. Delete the code you added to viewDidLoad and replace it with the following code:
CIImage *beginImage = [CIImage imageWithContentsOfURL:fileNameAndPath];

// 1 - Create the CIContext explicitly so it can be reused
CIContext *context = [CIContext contextWithOptions:nil];

CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, beginImage,
                                            @"inputIntensity", @0.8, nil];
CIImage *outputImage = [filter outputImage];

// 2 - Render the output image through the context
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];

// 3 - Wrap the CGImage in a UIImage and display it
UIImage *newImage = [UIImage imageWithCGImage:cgimg];
self.imageView.image = newImage;

// 4 - Release the CGImage (it is not managed by ARC)
CGImageRelease(cgimg);
Let me explain this part of the code step by step.
In this example, creating the CIContext yourself doesn't look much different from letting UIImage create one for you. But once you start changing filter parameters dynamically, the performance difference becomes significant.
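To see why, here is a sketch of a slider callback that reuses one stored context. The `context` and `beginImage` properties and the slider action name are assumptions for illustration; they are not part of the original code, and the context would be created once, e.g. in viewDidLoad.

```objc
// Assumes self.context and self.beginImage were created once and stored.
- (IBAction)amountSliderValueChanged:(UISlider *)sender {
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:self.beginImage forKey:kCIInputImageKey];
    [filter setValue:@(sender.value) forKey:@"inputIntensity"];

    CIImage *outputImage = [filter outputImage];

    // Reuse the stored CIContext instead of creating a new one per change.
    CGImageRef cgimg = [self.context createCGImage:outputImage
                                          fromRect:[outputImage extent]];
    self.imageView.image = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
}
```

Only the cheap filter setup runs on each slider change; the expensive context creation happens once, which is what keeps the interaction smooth.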