Structure
AFURLResponseSerialization is responsible for parsing response data from the network: it validates the data, then converts NSData into the corresponding object. Built-in serializers cover JSON, XML, plist, and image data, and users can easily subclass AFHTTPResponseSerializer to parse additional formats. AFNetworking's response-parsing mechanism is simple, centering on two methods:
1. -validateResponse:data:error:
The base class AFHTTPResponseSerializer's implementation of this method checks whether the HTTP status code and content type of the response are acceptable. The properties acceptableStatusCodes and acceptableContentTypes specify the valid status codes and content types; for example, AFJSONResponseSerializer sets acceptableContentTypes to @"application/json", @"text/json", and @"text/javascript". If the response's Content-Type is not one of these three, validation fails and a corresponding NSError object is returned. Most subclasses do not need to override this method; they only need to set acceptableStatusCodes and acceptableContentTypes appropriately.
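To illustrate, a custom subclass can usually rely entirely on the base class's validation just by configuring these two properties. This is a sketch, not the library's own code; the class name MYCSVResponseSerializer is hypothetical:

```objc
#import "AFURLResponseSerialization.h"

// Hypothetical serializer for CSV responses; validation is inherited.
@interface MYCSVResponseSerializer : AFHTTPResponseSerializer
@end

@implementation MYCSVResponseSerializer

- (instancetype)init {
    self = [super init];
    if (self) {
        // Only these content types pass -validateResponse:data:error:.
        self.acceptableContentTypes =
            [NSSet setWithObjects:@"text/csv", @"text/plain", nil];
        // Accept only 2xx status codes (the base class default, shown explicitly).
        self.acceptableStatusCodes =
            [NSIndexSet indexSetWithIndexesInRange:NSMakeRange(200, 100)];
    }
    return self;
}

@end
```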
2. -responseObjectForResponse:data:error:
This method parses the data, converting NSData into the corresponding object. The upper layer, AFURLConnectionOperation, calls it to obtain the converted object.
Before parsing the data, the -validateResponse:data:error: method described above is used to check whether the HTTP response is valid. Note that even if validation fails, parsing continues and an object is still produced, because the server's error message may be carried in the response body.
If validateResponse returns an error and parsing the data also produces an error, there are two NSError objects; which one should be returned to the upper layer? The approach here is to store the parsing NSError inside the userInfo dictionary of the validation NSError, as its underlying error. NSError provides the key NSUnderlyingErrorKey specifically for embedding one error inside another.
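The nesting can be sketched like this (a simplified version of the idea, not AFNetworking's exact code; the helper name is hypothetical):

```objc
#import <Foundation/Foundation.h>

// Given validationError from -validateResponse:data:error: and
// serializationError from the parser, embed the latter in the former.
static NSError *MYAttachUnderlyingError(NSError *validationError,
                                        NSError *serializationError) {
    if (!serializationError) return validationError;
    NSMutableDictionary *userInfo =
        [validationError.userInfo mutableCopy] ?: [NSMutableDictionary dictionary];
    userInfo[NSUnderlyingErrorKey] = serializationError;
    return [NSError errorWithDomain:validationError.domain
                               code:validationError.code
                           userInfo:userInfo];
}

// Upper layers can then recover the parse error:
// NSError *parseError = returnedError.userInfo[NSUnderlyingErrorKey];
```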
The rest are NSSecureCoding-related methods; if a subclass adds properties, it needs to implement the corresponding NSSecureCoding methods for them.
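The pattern looks like this (a sketch with a hypothetical subclass and a hypothetical added property):

```objc
#import "AFURLResponseSerialization.h"

@interface MYSerializer : AFHTTPResponseSerializer
// Hypothetical property added by the subclass.
@property (nonatomic, assign) BOOL removesEmptyStrings;
@end

@implementation MYSerializer

+ (BOOL)supportsSecureCoding {
    return YES;
}

- (instancetype)initWithCoder:(NSCoder *)decoder {
    // Let the base class decode its own properties first.
    self = [super initWithCoder:decoder];
    if (self) {
        _removesEmptyStrings = [decoder decodeBoolForKey:@"removesEmptyStrings"];
    }
    return self;
}

- (void)encodeWithCoder:(NSCoder *)coder {
    [super encodeWithCoder:coder];
    [coder encodeBool:self.removesEmptyStrings forKey:@"removesEmptyStrings"];
}

@end
```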
JSON parsing
AFJSONResponseSerializer uses the system's built-in NSJSONSerialization to parse JSON. NSJSONSerialization only supports UTF-8-encoded data (plus UTF-16LE and a few other rarely used encodings), so the returned data must first be converted to UTF-8. The serializer tries to decode the data into an NSString using the encoding declared in the HTTP response, or its own stringEncoding property, then re-encodes that NSString as UTF-8 NSData, and finally parses it into an object with NSJSONSerialization.
The procedure above is NSData -> NSString -> NSData -> NSObject. There is a problem here: if you can be sure the server returns UTF-8-encoded JSON, the NSData -> NSString -> NSData steps are meaningless; they add two extra rounds of decoding and encoding and waste performance. So if you are certain the server returns UTF-8 data, it is recommended to write your own JSON response serializer that skips these two steps.
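A minimal version of such a serializer might look like this (a sketch under the assumption that the server always sends UTF-8; error nesting and NSNull removal are omitted, and the class name is hypothetical):

```objc
#import "AFURLResponseSerialization.h"

@interface MYUTF8JSONResponseSerializer : AFHTTPResponseSerializer
@end

@implementation MYUTF8JSONResponseSerializer

- (id)responseObjectForResponse:(NSURLResponse *)response
                           data:(NSData *)data
                          error:(NSError *__autoreleasing *)error {
    // Still run the base class's status-code / content-type checks.
    [self validateResponse:(NSHTTPURLResponse *)response data:data error:error];
    if (data.length == 0) return nil;
    // Feed the raw bytes straight to NSJSONSerialization:
    // no NSData -> NSString -> NSData round trip.
    return [NSJSONSerialization JSONObjectWithData:data options:0 error:error];
}

@end
```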
In addition, AFJSONResponseSerializer includes a dedicated routine to remove NSNull: keys whose value is NSNull are stripped from the parsed object. This is quite considerate; if they were not removed, upper layers could easily check only whether a value is nil without also checking for NSNull, and the resulting bad call would crash the app.
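Conceptually the removal is a recursive walk over the parsed object. This is a simplified sketch of the idea, not AFNetworking's exact implementation:

```objc
#import <Foundation/Foundation.h>

// Recursively strip dictionary entries whose value is NSNull,
// descending into nested dictionaries and arrays.
static id MYRemoveNulls(id object) {
    if ([object isKindOfClass:[NSDictionary class]]) {
        NSMutableDictionary *result = [NSMutableDictionary dictionary];
        [(NSDictionary *)object enumerateKeysAndObjectsUsingBlock:
            ^(id key, id value, BOOL *stop) {
                if (![value isKindOfClass:[NSNull class]]) {
                    result[key] = MYRemoveNulls(value);
                }
            }];
        return result;
    }
    if ([object isKindOfClass:[NSArray class]]) {
        NSMutableArray *result = [NSMutableArray array];
        for (id value in (NSArray *)object) {
            [result addObject:MYRemoveNulls(value)];
        }
        return result;
    }
    return object;
}
```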
Image decompression
When we call UIImage's +imageWithData: method to turn data into a UIImage object, the resulting UIImage does not yet hold data ready for rendering to the screen. Network images in PNG and JPEG are compressed formats and must be decompressed into bitmaps before they can be rendered. If you do nothing and assign the UIImage to a UIImageView, then before rendering, the system checks whether the UIImage has been decompressed; if there is no bitmap data yet, it decompresses the image on the main thread and only then renders it. This decompression is fairly time-consuming and causes UI stutter when done on the main thread.
Besides parsing the response data into a UIImage, AFImageResponseSerializer also decompresses the image data, on AFNetworking's dedicated processing thread (see AFURLConnectionOperation). After this processing, the upper layer can render the returned UIImage on the main thread without any decompression step, reducing the main thread's load and the UI stutter problem.
The implementation is in AFInflatedImageFromResponseWithDataAtScale: it creates a bitmap canvas, draws the UIImage onto it, then captures the canvas as a new UIImage and returns it to the upper layer. Only JPEG and PNG are decompressed this way. If decompression fails, if the JPEG uses the CMYK color space, or if the image is too large (the decompressed bitmap takes 3-4 bytes per pixel, so a huge image could blow up memory), the original, undecompressed image is returned directly.
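The draw-to-canvas idea can be sketched as follows (a simplified version; the real AFInflatedImageFromResponseWithDataAtScale also handles the data argument, scale, CMYK detection, and size limits, and the function name here is hypothetical):

```objc
#import <UIKit/UIKit.h>

// Force-decompress an image by drawing it into an offscreen bitmap context,
// so the main thread can later render the result without decoding again.
static UIImage *MYInflatedImage(UIImage *image) {
    CGImageRef imageRef = image.CGImage;
    if (!imageRef) return image;

    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
        NULL, width, height, 8, 0, colorSpace,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colorSpace);
    // e.g. unsupported pixel format: fall back to the original image.
    if (!context) return image;

    // The actual decompression happens during this draw.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGImageRef inflatedRef = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *inflated = [UIImage imageWithCGImage:inflatedRef
                                            scale:image.scale
                                      orientation:image.imageOrientation];
    CGImageRelease(inflatedRef);
    return inflated;
}
```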
It also appears from the code that only iOS needs this manual decompression; macOS already has a packaged class, NSBitmapImageRep, that can do this.
Regarding image decompression, a few questions remain unclear to me:
1. I originally thought that +imageWithData: only held onto the data and did no decompression-related work, but later I saw from the call stack that it does perform some decompression; judging from the function names, it carries out Huffman decoding. I don't know how far it continues through the remaining JPEG decoding steps.
2. Manual decompression of the image is done on the CPU. If you skip it and hand the image to the layer, letting the system do it automatically, the decompression is done by the GPU. I don't know how large the speed difference between GPU and CPU decompression is; if the GPU is fast enough, decompressing even on the main thread might be acceptable, making manual-decompression optimizations unnecessary. But I have no way to measure GPU decompression speed.
P.S. On image decompression, there is a very good article: Avoiding Image Decompression Sickness.
Source file: AFURLResponseSerialization.m
AFNetworking 2.0 Source Code Analysis (Part 4)