As a rookie, I drafted a plan for this blog long ago but never knew where to start, and that state lasted nearly half a year... Recently I have felt a growing urge to record the things I develop and learn day to day, so I have decided to write them down here and share them with everyone!
Many iOS apps use a blurred, translucent effect. The way it is usually implemented is to first snapshot a specific region of the screen into a UIImage, process that UIImage to get the desired effect, and then add the processed UIImage to the current view.
So the premise of this translucent blur effect is a method that can do the snapshotting:
- (UIImage *)viewSnapshot:(UIView *)view withInRect:(CGRect)rect {
    // Render the view's layer into an image context
    UIGraphicsBeginImageContext(view.bounds.size);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Crop the snapshot to the requested rect
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, rect);
    image = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef); // release the Core Foundation image to avoid a leak
    return image;
}
The method takes two parameters: view is the view to snapshot, and rect is the region of that snapshot to keep.
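One caveat worth noting (my addition, not from the original post): UIGraphicsBeginImageContext renders at a scale of 1.0, so snapshots can look blurry on Retina screens, and CGImageCreateWithImageInRect works in pixels rather than points. A scale-aware variant of the method above might look like this sketch:

- (UIImage *)viewSnapshotRetina:(UIView *)view withInRect:(CGRect)rect {
    CGFloat scale = [UIScreen mainScreen].scale;
    // Render at the screen's native scale so the snapshot stays sharp
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, scale);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // CGImageCreateWithImageInRect expects pixels, so convert the rect from points
    CGRect pixelRect = CGRectMake(rect.origin.x * scale, rect.origin.y * scale,
                                  rect.size.width * scale, rect.size.height * scale);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef scale:scale orientation:UIImageOrientationUp];
    CGImageRelease(croppedRef);
    return cropped;
}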
With this method we can capture testView:
// The method already returns the snapshot; no separate alloc/init is needed
UIImage *shotImage = [self viewSnapshot:testView withInRect:testView.bounds];
In some cases, capturing a single view's image is not enough. If we want the entire screen, and the current page has a navigation bar, a snapshot of self.view will not include that bar. The following method solves this by rendering the whole window:
- (UIImage *)fullScreenShots {
    // Assumes the standard app delegate class, AppDelegate, which exposes the window
    AppDelegate *appDelegate = (AppDelegate *)[UIApplication sharedApplication].delegate;
    UIWindow *screenWindow = appDelegate.window;
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); // save the screenshot to the photo album
    // Return the image (the original declared void, but the GPUImage code below uses the result)
    return image;
}
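As a side note (my addition, not part of the original post): on iOS 7 and later, UIKit also offers drawViewHierarchyInRect:afterScreenUpdates:, which is usually faster than renderInContext:. A sketch of the same full-screen capture using it:

- (UIImage *)fullScreenShotsFast {
    UIWindow *screenWindow = [UIApplication sharedApplication].keyWindow;
    // Passing 0 as the scale means "use the device's screen scale"
    UIGraphicsBeginImageContextWithOptions(screenWindow.bounds.size, NO, 0);
    // Snapshot the live view hierarchy (iOS 7+)
    [screenWindow drawViewHierarchyInRect:screenWindow.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}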
The last line of that method saves the screenshot to the phone's local photo album (UIImageWriteToSavedPhotosAlbum also accepts a completion target and selector in place of the nils, in case you need to know whether the save succeeded). Once you have the image you want, you can process it. For the image processing here I use GPUImage, an open-source framework that is very popular on GitHub. I suggest using CocoaPods to bring the framework into the project.
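For reference, a minimal Podfile might look like the following (the target name MyApp is a placeholder for your own project):

platform :ios, '7.0'
target 'MyApp' do
  pod 'GPUImage'
end

After running pod install, remember to open the generated .xcworkspace rather than the original .xcodeproj.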
First, initialize a GPUImageView:
GPUImageView *testView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
testView.clipsToBounds = YES;
testView.layer.contentsGravity = kCAGravityTop;
Then initialize a filter:
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 5.0f;
Finally, initialize a GPUImagePicture and connect the pipeline:
GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[self fullScreenShots]];
// Pipeline: picture -> blurFilter -> testView
[picture addTarget:blurFilter];
[blurFilter addTarget:testView];
[picture processImageWithCompletionHandler:^{
    [blurFilter removeAllTargets]; // tear down the pipeline once processing finishes
}];
Once the code above has run, the blurred image appears on testView. Note that testView must also be added as a subview of the current view; see the sketch below for the full wiring.
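One detail the snippets above leave implicit (my note, not from the original post): testView has to be in the view hierarchy for the result to show, and the screenshot should be taken before the overlay goes on screen so the overlay itself is not captured. A sketch of the pieces wired together, with showBlurOverlay as an illustrative method name:

- (void)showBlurOverlay {
    // 1. Capture the screen first, before the overlay exists
    UIImage *screenshot = [self fullScreenShots];
    // 2. Create and add the GPUImageView that will display the blurred result
    GPUImageView *testView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    testView.clipsToBounds = YES;
    testView.layer.contentsGravity = kCAGravityTop;
    [self.view addSubview:testView];
    // 3. Gaussian blur feeding the screenshot into the view
    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    blurFilter.blurRadiusInPixels = 5.0f;
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:screenshot];
    [picture addTarget:blurFilter];
    [blurFilter addTarget:testView];
    [picture processImageWithCompletionHandler:^{
        [blurFilter removeAllTargets];
    }];
}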
This article is from the blog "iOS development, recorded drop by drop". If you repost it, please be sure to keep this source: http://zhanghx.blog.51cto.com/10654329/1688954