A. Blur using the vImage API
The vImage API, part of Accelerate.framework, is available starting in iOS 5.0; to use it you must add that framework to your project. The blur algorithm uses the vImageBoxConvolve_ARGB8888 function.
- (UIImage *)blurredImageWithRadius:(CGFloat)radius iterations:(NSUInteger)iterations tintColor:(UIColor *)tintColor
{
    // image must be nonzero size
    if (floorf(self.size.width) * floorf(self.size.height) <= 0.0f) return self;

    // boxSize must be an odd integer
    uint32_t boxSize = (uint32_t)(radius * self.scale);
    if (boxSize % 2 == 0) boxSize++;

    // create image buffers
    CGImageRef imageRef = self.CGImage;
    vImage_Buffer buffer1, buffer2;
    buffer1.width = buffer2.width = CGImageGetWidth(imageRef);
    buffer1.height = buffer2.height = CGImageGetHeight(imageRef);
    buffer1.rowBytes = buffer2.rowBytes = CGImageGetBytesPerRow(imageRef);
    size_t bytes = buffer1.rowBytes * buffer1.height;
    buffer1.data = malloc(bytes);
    buffer2.data = malloc(bytes);

    // create temp buffer
    void *tempBuffer = malloc((size_t)vImageBoxConvolve_ARGB8888(&buffer1, &buffer2, NULL, 0, 0, boxSize, boxSize,
                                                                 NULL, kvImageEdgeExtend + kvImageGetTempBufferSize));

    // copy image data
    CFDataRef dataSource = CGDataProviderCopyData(CGImageGetDataProvider(imageRef));
    memcpy(buffer1.data, CFDataGetBytePtr(dataSource), bytes);
    CFRelease(dataSource);

    for (NSUInteger i = 0; i < iterations; i++)
    {
        // perform blur
        vImageBoxConvolve_ARGB8888(&buffer1, &buffer2, tempBuffer, 0, 0, boxSize, boxSize, NULL, kvImageEdgeExtend);

        // swap buffers
        void *temp = buffer1.data;
        buffer1.data = buffer2.data;
        buffer2.data = temp;
    }

    // free buffers
    free(buffer2.data);
    free(tempBuffer);

    // create image context from buffer
    CGContextRef ctx = CGBitmapContextCreate(buffer1.data, buffer1.width, buffer1.height,
                                             8, buffer1.rowBytes, CGImageGetColorSpace(imageRef),
                                             CGImageGetBitmapInfo(imageRef));

    // apply tint
    if (tintColor && CGColorGetAlpha(tintColor.CGColor) > 0.0f)
    {
        CGContextSetFillColorWithColor(ctx, [tintColor colorWithAlphaComponent:0.25].CGColor);
        CGContextSetBlendMode(ctx, kCGBlendModePlusLighter);
        CGContextFillRect(ctx, CGRectMake(0, 0, buffer1.width, buffer1.height));
    }

    // create image from context
    imageRef = CGBitmapContextCreateImage(ctx);
    UIImage *image = [UIImage imageWithCGImage:imageRef scale:self.scale orientation:self.imageOrientation];
    CGImageRelease(imageRef);
    CGContextRelease(ctx);
    free(buffer1.data);
    return image;
}
B.
Use the filter effects Apple provides in Core Image. This approach is inefficient: the rendering conversion takes a relatively long time.
CPU rendering is slow and inefficient, so it is best performed on a background thread to avoid blocking the main thread.
- (UIImage *)blur
{
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *imageToBlur = [[CIImage alloc] initWithImage:_imgView.image];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, imageToBlur, nil];
    _outputCIImage = [filter outputImage];
    CGImageRef cgImage = [context createCGImage:_outputCIImage fromRect:_outputCIImage.extent];
    UIImage *img = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage: returns a +1 reference; release it to avoid a leak
    return img;
}
C. New functionality in iOS 8. It is especially convenient and can even support real-time blur; the drawback is that it can only be used on iOS 8 and above.
- (IBAction)ios8BlurAction:(id)sender
{
    UIBlurEffect *bEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleExtraLight];
    UIVisualEffectView *view = [[UIVisualEffectView alloc] initWithEffect:bEffect];
    view.frame = self.bounds;
    [self addSubview:view];
}
D. GPUImage
Http://www.cocoachina.com/industry/20140210/7793.html
Blurred image processing