iOS 8 Core Image in Swift: automatic image enhancement and the built-in filters



Compatible with Xcode 6 Beta 7.

Core Image is a powerful framework. It lets you apply various filters to process images simply, such as changing brightness, color, or exposure. It uses the GPU (or CPU) to process image data and video frames very quickly, even in real time, and it hides all the details of the underlying graphics processing behind a simple API. You don't need to worry about how OpenGL or OpenGL ES fully exploits the GPU, and you don't need to know what role GCD plays; Core Image handles all the details.

The Core Image framework provides us with these things:

  • Built-in image filters
  • Feature detection (such as face detection)
  • Automatic image enhancement
  • The ability to chain multiple filters into a custom filter


Automatic image improvement

First, let's take a simple example. Create a project from the Single View Application template. After the project is created, there is one AppDelegate, one ViewController, and a Main.storyboard, with a ViewController already set up in Main.storyboard. We place a UIImageView in this ViewController and adjust its frame:


In addition, because UIImageView stretches its image by default and we don't want the image deformed, set its ContentMode to Aspect Fit. Also, disable Auto Layout and Size Classes for now.

Finally, drag in two buttons: one to display the source image and one to automatically improve the image. The entire ViewController looks like this:


Next, add the corresponding IBAction method and IBOutlet properties to the ViewController:

class ViewController: UIViewController {

    @IBOutlet var imageView: UIImageView!

    lazy var originalImage: UIImage = {
        return UIImage(named: "Image")!
    }()

    ......

This gives us an imageView connected to the storyboard and an originalImage property that is loaded only once; this property will be used many times later. Here I use the same image for the entire project.

Then viewDidLoad looks like this:

override func viewDidLoad() {
    super.viewDidLoad()

    self.imageView.layer.shadowOpacity = 0.8
    self.imageView.layer.shadowColor = UIColor.blackColor().CGColor
    self.imageView.layer.shadowOffset = CGSize(width: 1, height: 1)

    self.imageView.image = originalImage
}

It does only two things: it adds a shadow border to the imageView to make it look better, and it assigns originalImage to the imageView.

Display the source image with one line of code:

@IBAction func showOriginalImage() {
    self.imageView.image = originalImage
}

Below is the code for automatic enhancement. I'll show the code first, then look at the visible effect:

@IBAction func autoAdjust() {
    var inputImage = CIImage(image: originalImage)
    let filters = inputImage.autoAdjustmentFilters() as [CIFilter]
    for filter: CIFilter in filters {
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        inputImage = filter.outputImage
    }
    self.imageView.image = UIImage(CIImage: inputImage)
}

After you connect IBAction and IBOutlet, you can see the following results after running:


Click "auto-improve" to see the effect. The operation may be slow because we haven't done any optimization yet. If the improvement isn't obvious, click "show original" a few times and compare.

Although there are some problems, they don't stop us from exploring further. The auto-enhancement code above explicitly uses two classes (why the word "explicitly"? I'll explain later): CIImage and CIFilter, where:

  • CIImage: a model object. It stores the data used to build an image; that data can come from image data, a file, or the output of a CIFilter.
  • CIFilter: a filter. Different CIFilter instances represent different filter effects, and different filters take different parameters, but each needs at least one input parameter and produces an output object.
In addition, Core Image's auto-enhancement intelligently analyzes an image's histogram, face regions, and metadata; you only need to pass in an image as the input parameter to get back a set of filters that can improve it. A CIImage instance can be obtained from a UIImage, and then two APIs, autoAdjustmentFilters and autoAdjustmentFiltersWithOptions:, return the array of filters that can improve the image. In most cases you will use the API that takes an options dictionary, because it lets you set:
  • The orientation of the image, which is especially important for CIRedEyeCorrection, CIFaceBalance, and similar filters, because Core Image needs to locate faces accurately.
  • Whether to apply only red-eye removal (set kCIImageAutoAdjustEnhance to false).
  • Whether to apply all filters except red-eye removal (set kCIImageAutoAdjustRedEye to false).
If you want to provide the options dictionary, use it like this (Objective-C):
NSDictionary *options = @{ CIDetectorImageOrientation :
    [[image properties] valueForKey:kCGImagePropertyOrientation] };
NSArray *adjustments = [myImage autoAdjustmentFiltersWithOptions:options];
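For completeness, here is a sketch of the same call in the Swift 1.x style this article uses. It assumes `image` is a CIImage whose properties() dictionary carries orientation metadata; the exact dictionary contents depend on how the image was loaded:

```swift
// A sketch, assuming `image` is a CIImage with orientation metadata.
// Passing the orientation lets face-dependent filters such as
// CIRedEyeCorrection and CIFaceBalance locate faces correctly.
let options = [CIDetectorImageOrientation:
    image.properties()[kCGImagePropertyOrientation]!]
let adjustments = image.autoAdjustmentFiltersWithOptions(options) as [CIFilter]
```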
In this example I won't pass an options dictionary, because image orientation isn't involved. Want to know which filters the auto-enhancement uses? Just print the filter objects. Generally, these five filters are used:
  • CIRedEyeCorrection: fixes various forms of red eye caused by the camera flash
  • CIFaceBalance: adjusts skin tone
  • CIVibrance: boosts saturation without distorting skin tone
  • CIToneCurve: improves contrast
  • CIHighlightShadowAdjust: improves shadow detail
In most cases these filters are enough. As mentioned earlier, different CIFilters take different parameters. To learn a CIFilter's specific parameters, call its inputKeys method to get the list of supported input parameters, call its outputKeys method to get the list of its output parameters (generally we only use outputImage), or call its attributes method to get all of its information, including its name, categories, input and output parameters, and each parameter's value range and default value. Call CIFilter's inputKeys method to see its input parameters:

for filter: CIFilter in filters {
    let inputKeys = filter.inputKeys()
    println(filter.name())
    println(inputKeys)
    ...
}

Print result:

CIFaceBalance
[inputImage, inputOrigI, inputOrigQ, inputStrength, inputWarmth]
CIVibrance
[inputImage, inputAmount]
CIToneCurve
[inputImage, inputPoint0, inputPoint1, inputPoint2, inputPoint3, inputPoint4]
CIHighlightShadowAdjust
[inputImage, inputRadius, inputShadowAmount, inputHighlightAmount]

Almost all filters have the inputImage input parameter, so we can set parameters directly with the keys the system presets (such as kCIInputImageKey); most common keys are already defined. If you find a key the system hasn't preset, you can use the key name string itself as the key, for example:

filter.setValue(inputImage, forKey: kCIInputImageKey)
// The two lines set the same key
filter.setValue(inputImage, forKey: "inputImage")

For the auto-enhancement feature we don't need to know many details; just setting inputImage is enough.

Now let's fill in the pitfalls.

The code above has two problems: first, auto-enhancement is noticeably slow every time it runs; second, the image is deformed after enhancement. Compare the source image with the improved image:


I set the contentMode of the UIImageView to Aspect Fit, which means the image should be scaled proportionally, not deformed. If you set the UIImageView's background color to red, the original image displays with red bars around it, but the improved image does not. Apple states that UIImage fully supports CIImage, and the documentation doesn't explain the cause of this problem. I referred to this post:

http://stackoverflow.com/questions/15878060/setting-uiimageview-content-mode-after-applying-a-cifilter

A UIImage obtained through UIImage(CIImage:) is not a standard CGImage-backed UIImage, so it doesn't follow the usual display rules. We therefore need to obtain a real UIImage another way; the solution is described below.

We used the word "explicitly" when introducing CIImage and CIFilter because these two classes appear directly in the code: CIImage provides the image information and CIFilter provides the filter. When we call CIFilter's outputImage method, Core Image does not render the image at that point; it only computes the parameters and stores the result of that computation in the CIImage object. A third object performs the rendering, and only when the image is about to be displayed. That object is CIContext.

CIContext is the key to how Core Image processes images. It is similar to Core Graphics's CGContext, but unlike CGContext, a CIContext can be reused; you don't have to create a new one each time, and one must exist whenever a CIImage is output. In short, Core Image uses a CIContext to render the objects generated by CIFilter.

The preceding example doesn't use CIContext, but Core Image implicitly uses one internally when UIImage(CIImage:) is called; the work we would otherwise do manually is done automatically. The problem is that every call to UIImage(CIImage:) creates a new CIContext object and then destroys it. Done once, that has little impact, but when filters are applied repeatedly it hurts performance badly. To avoid this, we reuse the CIContext object by giving ViewController a lazily loaded property:

lazy var context: CIContext = {
    return CIContext(options: nil)
}()

CIContext takes an options dictionary at initialization. You can pass kCIContextUseSoftwareRenderer to create a CPU-based CIContext; by default a GPU-based CIContext is created. The difference is that the GPU-based context renders faster, while the CPU-based context can handle larger images and can keep processing in the background. We pass nil to create a GPU-based CIContext.
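The two variants can be sketched side by side (same Swift 1.x style as the rest of the article):

```swift
// GPU-based context: the default, used throughout this article
let gpuContext = CIContext(options: nil)

// CPU-based context: slower rendering, but can process larger
// images and can continue working in the background
let cpuContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])
```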


With a reusable CIContext object, create the UIImage like this:

@IBAction func autoAdjust() {
    var inputImage = CIImage(image: originalImage)
    let filters = inputImage.autoAdjustmentFilters() as [CIFilter]
    for filter: CIFilter in filters {
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        inputImage = filter.outputImage
    }
    // self.imageView.image = UIImage(CIImage: inputImage)
    let cgImage = context.createCGImage(inputImage, fromRect: inputImage.extent())
    self.imageView.image = UIImage(CGImage: cgImage)
}

Although the first run of auto-enhancement is still a little slow (the CIContext object has to be created), repeated runs are much faster, and the second problem, contentMode, is solved as well. Unless you have a special reason, you should always create a CGImage this way and convert the CGImage to a UIImage.


Use various built-in filters

Using CIFilter's class method filterNamesInCategory(), you can get all the filters in a category:

func showFiltersInConsole() {
    let filterNames = CIFilter.filterNamesInCategory(kCICategoryBuiltIn)
    println(filterNames.count)
    println(filterNames)
    for filterName in filterNames {
        let filter = CIFilter(name: filterName as String)
        let attributes = filter.attributes()
        println(attributes)
    }
}

Passing kCICategoryBuiltIn as the category parameter outputs all of iOS 8 Core Image's filters:

There are 127 of them. Of course, not all filters are commonly used. The kCICategoryColorEffect key fetches some common filters, like those in the iOS 7 Camera app. These filters generally need no parameters set (though some also accept them); as before, you only need to set inputImage. Print some of the filters in this category:
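Listing just that category is a one-liner, in the same style as showFiltersInConsole above:

```swift
// Print only the color-effect filters, like the ones in the iOS 7 Camera app
let colorEffectNames = CIFilter.filterNamesInCategory(kCICategoryColorEffect)
println(colorEffectNames)
// Expect names such as CIPhotoEffectInstant, CIPhotoEffectNoir,
// CIPhotoEffectMono, CIPhotoEffectProcess, ...
```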
I selected one of these filters, the monochrome filter. Although it looks like a lot of content, it isn't complicated: CIAttributeFilterDisplayName is its display name, inputImage is its parameter, and each parameter's details are given in a dictionary, including its CIAttributeType and class (CIImage). Next come the filter's categories; a filter can belong to several categories at once. Finally comes the string you must provide when instantiating the filter, namely CIPhotoEffectMono. CIPhotoEffectInstant above and CIPhotoEffectNoir and CIPhotoEffectProcess below are all similar: only one inputImage parameter is required. Isn't that easy? These filters' parameters are preset internally and can be used directly. For convenience, we give ViewController a CIFilter property, instantiate the CIFilter in the individual action methods, and use one shared method to display the filtered image:

class ViewController: UIViewController {

    @IBOutlet var imageView: UIImageView!

    lazy var originalImage: UIImage = {
        return UIImage(named: "Image")!
    }()

    lazy var context: CIContext = {
        return CIContext(options: nil)
    }()

    var filter: CIFilter!

    ......

    // MARK: - Instant (nostalgia)
    @IBAction func photoEffectInstant() {
        filter = CIFilter(name: "CIPhotoEffectInstant")
        outputImage()
    }

    // MARK: - Noir (black and white)
    @IBAction func photoEffectNoir() {
        filter = CIFilter(name: "CIPhotoEffectNoir")
        outputImage()
    }

    // MARK: - Tonal
    @IBAction func photoEffectTonal() {
        filter = CIFilter(name: "CIPhotoEffectTonal")
        outputImage()
    }

    // MARK: - Transfer (aged look)
    @IBAction func photoEffectTransfer() {
        filter = CIFilter(name: "CIPhotoEffectTransfer")
        outputImage()
    }

    // MARK: - Mono (monochrome)
    @IBAction func photoEffectMono() {
        filter = CIFilter(name: "CIPhotoEffectMono")
        outputImage()
    }

    // MARK: - Fade
    @IBAction func photoEffectFade() {
        filter = CIFilter(name: "CIPhotoEffectFade")
        outputImage()
    }

    // MARK: - Process (print)
    @IBAction func photoEffectProcess() {
        filter = CIFilter(name: "CIPhotoEffectProcess")
        outputImage()
    }

    // MARK: - Chrome
    @IBAction func photoEffectChrome() {
        filter = CIFilter(name: "CIPhotoEffectChrome")
        outputImage()
    }

    func outputImage() {
        println(filter)
        let inputImage = CIImage(image: originalImage)
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        let outputImage = filter.outputImage
        let cgImage = context.createCGImage(outputImage, fromRect: outputImage.extent())
        self.imageView.image = UIImage(CGImage: cgImage)
    }
}

After writing these, bind the buttons' touch events on the UI. The UI looks like this:


Various filter effects after running:


The above is simple use of Core Image's built-in filters. If you don't need finer-grained control over the filters, this approach is enough.


The code is on GitHub, and I will keep it updated there.


UPDATE: // MARK: - plays the same role as Objective-C's #pragma mark -.


References:

https://developer.apple.com/library/mac/documentation/graphicsimaging/conceptual/CoreImaging/ci_intro/ci_intro.html



