Working with Images Using Core Image


In iOS and OS X, Core Image provides three classes for working with images:

* CIFilter is a mutable object that represents an effect. A filter object has at least one input parameter and produces an output image.

* CIImage is an immutable object that represents an image. You can synthesize the image data, load it from a file, or take it from the output of another CIFilter object.

* CIContext is the object through which Core Image draws the results produced by a filter. A Core Image context can be CPU-based or GPU-based.

(When using Core Image in your app, you need to link against the following framework: CoreImage.framework (iOS) or QuartzCore.framework (OS X), and import the corresponding header in your Xcode project: <CoreImage/CoreImage.h> (iOS) or <QuartzCore/CoreImage.h> (OS X).)

In iOS, the basic steps for applying a filter to an image are:

CIContext *context = [CIContext contextWithOptions:nil];                 // 1
CIImage *image = [CIImage imageWithContentsOfURL:myURL];                 // 2
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];             // 3
[filter setValue:image forKey:kCIInputImageKey];
[filter setValue:@0.8f forKey:kCIInputIntensityKey];
CIImage *result = [filter valueForKey:kCIOutputImageKey];                // 4
CGRect extent = [result extent];
CGImageRef cgImage = [context createCGImage:result fromRect:extent];     // 5

1. Create a CIContext object.

2. Create a CIImage object.

3. Create a filter and set values for its input parameters.

4. Get the output image.

5. Render the CIImage to a Core Graphics image that can be displayed or saved to a file.

About built-in filters:

Core Image ships with a large number of built-in filters. The Core Image Filter Reference lists the filters and their characteristics, notes whether each is available on iOS or OS X, and shows a picture processed by the filter. Because the list of built-in filters can change, Core Image provides methods to query the system for the available filters.

A filter category specifies the type of effect (blur, distortion, generator, and so forth) or its intended use (still images, video, nonsquare pixels, and so on). A filter can be a member of more than one category. A filter also has a display name, which is the name to show to users, and a filter name, which is the name you must use to access the filter programmatically.
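Querying the system for filters can be sketched with the filterNamesInCategory: class method, passing one of the category constants:

```objc
// List the names of all built-in filters, then only those in the
// distortion category.
NSArray *allNames = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSArray *distortionNames =
    [CIFilter filterNamesInCategory:kCICategoryDistortionEffect];
NSLog(@"built-in filters: %@", allNames);
NSLog(@"distortion filters: %@", distortionNames);
```

Each returned string (for example, @"CISepiaTone") is the filter name you pass to filterWithName: later.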

Most filters have one or more input parameters that let you control how the processing is done. Each input parameter has an attribute class that describes its data type, such as NSNumber. An input parameter can optionally have other attributes, such as its default value, the allowable minimum and maximum values, the display name of the parameter, and the other attributes described in the CIFilter Class Reference.

For example, the CIColorMonochrome filter has three input parameters: the image to be processed, a monochrome color, and the color intensity. You supply the image, and you have the option of setting the color and the intensity. Most filters provide default values for their non-image parameters. If you do not set your own values for those parameters, Core Image processes your image using the defaults.
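A filter's parameter attributes can be inspected at run time through its attributes dictionary; a minimal sketch:

```objc
CIFilter *mono = [CIFilter filterWithName:@"CIColorMonochrome"];
NSDictionary *attributes = [mono attributes];
// The entry for each input parameter describes its class, default value,
// and (where applicable) minimum and maximum values.
NSDictionary *intensityInfo = attributes[@"inputIntensity"];
NSLog(@"default intensity: %@", intensityInfo[kCIAttributeDefault]);
```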

The attributes of a filter are stored as key-value pairs.

Core Image uses key-value coding (KVC), which means you can set and get filter attribute values using the methods defined by the NSKeyValueCoding protocol.

1. Create a Core Image context:

A Core Image context represents the drawing destination, and this destination determines whether Core Image processes using the GPU or the CPU.

(iOS) If your app does not require real-time display, you can create a CIContext as follows:

CIContext *context = [CIContext contextWithOptions:nil];

This method can use either the CPU or the GPU for rendering. To specify which one, create an options dictionary and add the key kCIContextUseSoftwareRenderer with the appropriate Boolean value. CPU rendering is slower than GPU rendering, but with GPU rendering the processed result must be copied back to CPU memory and converted to another image type (such as a UIImage object) before it can be displayed.
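For example, CPU-based rendering could be requested explicitly like this (a minimal sketch):

```objc
// Request the software (CPU) renderer instead of the GPU.
NSDictionary *options = @{ kCIContextUseSoftwareRenderer : @YES };
CIContext *cpuContext = [CIContext contextWithOptions:options];
```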

If your app supports real-time image processing, you should create the CIContext from an EAGL context instead of using contextWithOptions: and specifying the GPU. The advantage is that the processed image stays on the GPU and never needs to be copied back to CPU memory. First, create an EAGL context:

EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

You should turn off color management by setting the working color space to null, because color management slows down performance.

NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };

CIContext *myContext = [CIContext contextWithEAGLContext:myEAGLContext options:options];

2. Create a CIImage object:

There are several ways to create a CIImage object, depending on where the image data comes from.
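A few common factory methods, as a sketch (myURL, myCGImage, and myJPEGData are assumed to already exist):

```objc
CIImage *fromURL  = [CIImage imageWithContentsOfURL:myURL];   // from a file
CIImage *fromCG   = [CIImage imageWithCGImage:myCGImage];     // from a Core Graphics image
CIImage *fromData = [CIImage imageWithData:myJPEGData];       // from image data in memory
```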

3. Create a CIFilter object and set its values
You create a filter of a specific kind with the filterWithName: method. The value of the name string must match the name of a built-in filter. You can get the list of filter names by querying the system for filters, or by consulting the Core Image Filter Reference.

In iOS, when you call filterWithName:, the filter's input values are set to their default values.
If you do not know a filter's input parameters, you can get an array of them with the inputKeys method. (Or you can consult the Core Image Filter Reference for the input parameters of most built-in filters.) Except for generator filters, all filters require an input image; some require two or more images or textures. You can change the default value of any input parameter with the setValue:forKey: method.
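Listing a filter's input parameters can be sketched like this:

```objc
CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
// Typically inputImage, inputColor, and inputIntensity for this filter.
NSLog(@"%@", [filter inputKeys]);
```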

For example:

hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
[hueAdjust setDefaults]; // necessary on OS X only

This filter has two input parameters: the input image and the input angle. The input angle of the hue adjustment filter refers to the position of the hue in the HSV and HLS color spaces. The value ranges from 0.0 to 2π radians. A value of 0 indicates red; green corresponds to 2/3π radians and blue to 4/3π radians.

[hueAdjust setValue:myCIImage forKey:kCIInputImageKey];
[hueAdjust setValue:@2.094f forKey:kCIInputAngleKey];

(Alternatively, you can create the filter and set its input values in a single call:

hueAdjust = [CIFilter filterWithName:@"CIHueAdjust" keysAndValues:
                kCIInputImageKey, myCIImage,
                kCIInputAngleKey, @2.094f,
                nil];
)



4. Get the output image:

You get the output image by retrieving the value of the outputImage key:

CIImage *result = [hueAdjust valueForKey:kCIOutputImageKey];

Core Image does not perform any image processing until you call a method that actually renders the image (see Rendering the Resulting Output Image). When you request the output image, Core Image assembles the calculations it needs to produce the output image and stores those calculations (that is, the image "recipe") in a CIImage object. The actual image is rendered (and hence the calculations performed) only when there is an explicit call to one of the image-drawing methods.

Deferring processing until render time makes Core Image fast and efficient. At render time, Core Image can see whether more than one filter needs to be applied to an image. If so, it automatically concatenates the multiple "recipes" into one operation, which means each pixel is processed only once rather than many times. Figure 1-2 illustrates a multiple-operation workflow that Core Image can make more efficient. The final image is a scaled-down version of the original. For a large image, applying the color adjustment before scaling down requires more processing power than scaling down first and then applying the color adjustment. By waiting until render time to apply filters, Core Image can determine that it is more efficient to perform these operations in the reverse order.
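Chaining filters can be sketched like this (myCIImage is assumed to exist): the output of one filter feeds the input of the next, and no pixels are processed until a context draws the final result.

```objc
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:myCIImage forKey:kCIInputImageKey];
[sepia setValue:@0.8f forKey:kCIInputIntensityKey];

CIFilter *hue = [CIFilter filterWithName:@"CIHueAdjust"];
[hue setValue:[sepia valueForKey:kCIOutputImageKey] forKey:kCIInputImageKey];
[hue setValue:@1.0f forKey:kCIInputAngleKey];

// Still only a "recipe" -- no rendering has happened yet.
CIImage *chained = [hue valueForKey:kCIOutputImageKey];
```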

5. Draw the output image:

[myContext drawImage:result inRect:destinationRect fromRect:contextRect];

6. Ensure thread safety:

CIContext and CIImage objects are immutable, which means they are thread-safe and can be shared between threads. Multiple threads can use the same CPU- or GPU-based CIContext object to draw CIImage objects. However, this is not the case for CIFilter objects, which are mutable. A CIFilter object cannot be shared safely among threads. If your program is multithreaded, each thread must create its own CIFilter objects; otherwise, your app could behave unexpectedly.
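As a sketch (sharedContext and sharedImage are assumed to be existing objects shared across threads):

```objc
// The CIContext and CIImage can be shared across threads; the CIFilter
// must be created on the thread that uses it.
dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    CIFilter *threadFilter = [CIFilter filterWithName:@"CISepiaTone"];
    [threadFilter setValue:sharedImage forKey:kCIInputImageKey];
    CIImage *output = [threadFilter valueForKey:kCIOutputImageKey];
    CGImageRef cg = [sharedContext createCGImage:output
                                        fromRect:[output extent]];
    // ... use cg, then call CGImageRelease(cg)
});
```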

