Some apps detect the user's screenshot gesture (Home + Power) and, right after the user taps the Attach button, ask whether they want to send the screenshot they just took — a very nice user experience. I wanted to implement the same feature.
Before iOS 7, taking a screenshot caused the system to cancel all touch events on the screen (via touchesCancelled:withEvent:). An app could detect that call, load the most recent photo from the library, and check whether it looked like a screenshot. After iOS 7, however, screenshots no longer cancel touches, so apps like Snapchat and Facebook Poke, which relied on that system behavior, were broken as soon as iOS 7 shipped.
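For reference, the pre-iOS 7 trick looked roughly like this. This is a minimal sketch, assuming a custom UIWindow subclass; the class name and the comments about the follow-up photo-library check are my illustration, not code from the original apps:

```objc
// Hypothetical UIWindow subclass. Before iOS 7, taking a screenshot
// cancelled all in-flight touches, so this override fired on every
// screenshot (among other cancellation causes).
@interface ScreenshotSniffingWindow : UIWindow
@end

@implementation ScreenshotSniffingWindow

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    // Touches are also cancelled by incoming calls, system alerts, etc.,
    // so the app still had to load the newest photo from the library and
    // verify it matched the screen's dimensions before treating this as
    // a screenshot.
    NSLog(@"Touches cancelled -- possibly a screenshot on iOS 6 and earlier");
}

@end
```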
Without any new API, an app could still poll the photo library in the background, checking whether the newest photo has the characteristics of a screenshot. That works, but it is a clumsy approach: it requires the app to have photo library access, and the background polling loop wastes system resources.
Of course, when Apple closes one door it usually opens another; it won't leave you completely stranded.
iOS 7 introduces a new notification: UIApplicationUserDidTakeScreenshotNotification. Just subscribe to it in the usual way and you will know when a screenshot happens.
Note: UIApplicationUserDidTakeScreenshotNotification is posted after the screenshot has been taken. There is no way to be notified before it happens.
Hopefully Apple will add a UIApplicationUserWillTakeScreenshotNotification in iOS 8. (Shipping only a "did" with no "will" is obviously not Apple's style...)
Here is a small demo that detects the user's screenshot, captures the same image, and displays it in the lower-right corner.
(It needs to run on a real device — at least, I don't know how to simulate the screenshot gesture (Home + Power) in the Simulator. If you know how, please let me know.)
Source code on GitHub: colin1994/takescreenshottest
One. Register the notification:
```objc
// Register for the screenshot notification
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(userDidTakeScreenshot:)
                                             name:UIApplicationUserDidTakeScreenshotNotification
                                           object:nil];
```
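Standard NSNotificationCenter hygiene also applies here: unregister when the observer goes away. The dealloc below is my addition, not part of the demo project:

```objc
- (void)dealloc
{
    // Stop observing before this object is deallocated; otherwise the
    // notification center may message a dead object (pre-iOS 9 behavior).
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:UIApplicationUserDidTakeScreenshotNotification
                                                  object:nil];
}
```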
Two. Respond to the screenshot:
Implement the handler registered above — userDidTakeScreenshot:.
```objc
// Screenshot handler
- (void)userDidTakeScreenshot:(NSNotification *)notification
{
    NSLog(@"Screenshot detected");

    // Take our own screenshot programmatically, mimicking what the user
    // just captured
    UIImage *image_ = [self imageWithScreenshot];

    // Display it
    UIImageView *imgvPhoto = [[UIImageView alloc] initWithImage:image_];
    imgvPhoto.frame = CGRectMake(self.window.frame.size.width / 2,
                                 self.window.frame.size.height / 2,
                                 self.window.frame.size.width / 2,
                                 self.window.frame.size.height / 2);

    // Add a white border
    CALayer *layer = [imgvPhoto layer];
    layer.borderColor = [[UIColor whiteColor] CGColor];
    layer.borderWidth = 5.0f;

    // Shadow on all four edges
    imgvPhoto.layer.shadowColor = [UIColor blackColor].CGColor;
    imgvPhoto.layer.shadowOffset = CGSizeMake(0, 0);
    imgvPhoto.layer.shadowOpacity = 0.5;
    imgvPhoto.layer.shadowRadius = 10.0;

    // Shadow on two edges
    imgvPhoto.layer.shadowColor = [UIColor blackColor].CGColor;
    imgvPhoto.layer.shadowOffset = CGSizeMake(4, 4);
    imgvPhoto.layer.shadowOpacity = 0.5;
    imgvPhoto.layer.shadowRadius = 2.0;

    [self.window addSubview:imgvPhoto];
}
```
In userDidTakeScreenshot: I do three things:
1. Log that a screenshot was detected.
2. Capture the screen by calling [self imageWithScreenshot]. imageWithScreenshot takes a programmatic screenshot, reproducing what the user just captured.
3. Display the captured image at quarter-screen size in the lower-right corner, with a white border and a shadow to make it stand out.
Three. Capture the screen:
```objc
/**
 *  Capture the current screen
 *
 *  @return NSData *
 */
- (NSData *)dataWithScreenshotInPNGFormat
{
    CGSize imageSize = CGSizeZero;
    UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
    if (UIInterfaceOrientationIsPortrait(orientation))
        imageSize = [UIScreen mainScreen].bounds.size;
    else
        imageSize = CGSizeMake([UIScreen mainScreen].bounds.size.height,
                               [UIScreen mainScreen].bounds.size.width);

    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
        CGContextSaveGState(context);
        CGContextTranslateCTM(context, window.center.x, window.center.y);
        CGContextConcatCTM(context, window.transform);
        CGContextTranslateCTM(context,
                              -window.bounds.size.width * window.layer.anchorPoint.x,
                              -window.bounds.size.height * window.layer.anchorPoint.y);

        if (orientation == UIInterfaceOrientationLandscapeLeft) {
            CGContextRotateCTM(context, M_PI_2);
            CGContextTranslateCTM(context, 0, -imageSize.width);
        } else if (orientation == UIInterfaceOrientationLandscapeRight) {
            CGContextRotateCTM(context, -M_PI_2);
            CGContextTranslateCTM(context, -imageSize.height, 0);
        } else if (orientation == UIInterfaceOrientationPortraitUpsideDown) {
            CGContextRotateCTM(context, M_PI);
            CGContextTranslateCTM(context, -imageSize.width, -imageSize.height);
        }

        if ([window respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
            [window drawViewHierarchyInRect:window.bounds afterScreenUpdates:YES];
        } else {
            [window.layer renderInContext:context];
        }

        CGContextRestoreGState(context);
    }

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImagePNGRepresentation(image);
}

/**
 *  Return the captured image
 *
 *  @return UIImage *
 */
- (UIImage *)imageWithScreenshot
{
    NSData *imageData = [self dataWithScreenshotInPNGFormat];
    return [UIImage imageWithData:imageData];
}
```