[iOS] Pitfalls of ALAsset (Assets Library Framework)
The Assets Library Framework can be used as a multi-selector for photos and videos on iOS; I will not cover the selection UI itself here. My current project is somewhat similar to Dropbox: you pick a photo on the device and the app uploads the file for you, and the Assets Library Framework is what it uses. An ALAsset can be thought of as a wrapper class around the file you selected. From it you get an ALAssetRepresentation object (via defaultRepresentation), which exposes the full-screen image, the full-resolution image, metadata, size, and other useful information.

Here is the problem. Edit a photo with the built-in Photos app on an iPhone/iPad and save it, then pick it again through the Assets Library Framework: the thumbnail shows the edited version, fine; the full-screen image (fullScreenImage) of the defaultRepresentation is also the edited version, fine; but the full-resolution image (fullResolutionImage) is the UNEDITED original! And if you upload the file at the URL the asset provides directly, most image-viewing software will display the original when it opens the file. (At least, the thumbnail generated by the server our project talks to is a thumbnail of the original, and downloading that file and showing it in an image view also gives the original.)

Go back to the Photos app, open the image, and enter editing mode: there is a button labeled "Revert to Original". The reason it can offer this is that Photos does not actually save a new image; it writes the edits into the metadata of the original image under an attribute called "AdjustmentXMP".

So how do we get the edited image? After some research I arrived at the following solution, which I hope helps anyone who runs into the same pitfall:

1. Determine whether the asset is a photo.
2. Get the asset's defaultRepresentation (rep).
3. Read the "AdjustmentXMP" entry (adj) from rep's metadata.
4. If adj exists, convert it into an array of CIFilter objects.
5. Get the original image via fullResolutionImage (img).
6. Apply the CIFilters to img one by one to produce the edited image.
7. Do whatever you want with the result; for example, our project saves the generated image as a temporary file and then uploads it.
The sample code is as follows (asset is the ALAsset object returned by the Assets Library Framework):

```objc
// Process an image that was edited with the iOS built-in Photos app
if ([[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypePhoto]) {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSString *adj = [rep metadata][@"AdjustmentXMP"];
    if (adj) {
        CGImageRef fullResImage = [rep fullResolutionImage];
        NSData *xmlData = [adj dataUsingEncoding:NSUTF8StringEncoding];
        CIImage *image = [CIImage imageWithCGImage:fullResImage];

        NSError *error = nil;
        NSArray *filters = [CIFilter filterArrayFromSerializedXMP:xmlData
                                                 inputImageExtent:[image extent]
                                                            error:&error];
        CIContext *context = [CIContext contextWithOptions:nil];
        if (filters && !error) {
            // Chain the filters: the output of each becomes the input of the next
            for (CIFilter *filter in filters) {
                [filter setValue:image forKey:kCIInputImageKey];
                image = [filter outputImage];
            }
            fullResImage = [context createCGImage:image fromRect:[image extent]];
            UIImage *result = [UIImage imageWithCGImage:fullResImage
                                                  scale:[rep scale]
                                            orientation:(UIImageOrientation)[rep orientation]];
            CGImageRelease(fullResImage); // we own the image returned by createCGImage:
            // do whatever you want with the result image then
        }
    }
}
```
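For step 7, here is a minimal sketch of saving the processed image to a temporary file before uploading. This is not the project's actual code: the file name, JPEG quality, and the upload step are placeholders I have made up for illustration; `result` is the UIImage produced by the snippet above.

```objc
// Hypothetical sketch: persist the processed UIImage to a temp file for upload.
// File name, quality, and the upload call are placeholders, not the project's code.
NSData *jpegData = UIImageJPEGRepresentation(result, 0.9);
NSString *tmpPath = [NSTemporaryDirectory()
    stringByAppendingPathComponent:@"edited-upload.jpg"];
NSError *writeError = nil;
if ([jpegData writeToFile:tmpPath options:NSDataWritingAtomic error:&writeError]) {
    // Hand tmpPath to your networking code here; once the upload finishes,
    // clean up the temporary file:
    // [[NSFileManager defaultManager] removeItemAtPath:tmpPath error:NULL];
} else {
    NSLog(@"Failed to write temp file: %@", writeError);
}
```

Writing atomically means an interrupted write never leaves a half-written file at tmpPath for the uploader to pick up.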