iOS 32-bit and 16-bit MD5 Encryption
During iOS development we often use encryption or hashing to keep data safe. Common choices include Base64 and MD5. Base64 is reversible (strictly speaking it is an encoding, not encryption), while MD5 is for practical purposes irreversible. While developing an app, I came across a "sign" field in a request whose value was an MD5 hash, and the Android colleague next to me asked the PHP backend developer whether the MD5 was 32-bit or 16-bit. Since I had never paid attention to this before, I looked it up. Here is a short summary.

MD5 is the Message-Digest Algorithm 5, used to ensure the integrity and consistency of transmitted information. It is one of the most widely used hash algorithms in computing (also translated as "digest algorithm"), and mainstream programming languages generally ship an MD5 implementation. MD5 condenses input of any size into a fixed-length digest, for example before a private key signs it in digital-signature software; in other words, it converts a byte string of arbitrary length into a string of a fixed length. (Reference: Baidu Baike.)

Pay attention to that "fixed length". How long is it, exactly? I read a lot of material, including Wikipedia and several forums: MD5 always produces a fixed 128-bit digest, that is, 128 binary 0s and 1s. In actual application development it is usually output in hexadecimal notation, which makes it exactly 32 hexadecimal digits. To put it bluntly, a "32-bit" MD5 is simply 32 hex characters.

The iOS MD5 method:

    #import <CommonCrypto/CommonDigest.h>

    - (NSString *)md5:(NSString *)str {
        const char *cStr = [str UTF8String];
        unsigned char result[16];
        CC_MD5(cStr, (CC_LONG)strlen(cStr), result); // this is the MD5 call
        return [NSString stringWithFormat:
            @"%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x",
            result[0], result[1], result[2], result[3],
            result[4], result[5], result[6], result[7],
            result[8], result[9], result[10], result[11],
            result[12], result[13], result[14], result[15]];
    }

In this code, %02x is the format specifier: 'x' means hexadecimal output, and '02' means values shorter than two digits are padded with a leading 0. For example, the value 0xF prints as "0f", while 0x1F3 already has more than two digits and prints as "1f3".

That is usually where introductions stop, but I would like to say a bit more about the result array in the code. Why is it [16]? Because the MD5 algorithm ultimately produces 128 bits, and the smallest storage unit in a computer is the byte: one byte is 8 bits and corresponds to one char, so holding the digest requires 128 / 8 = 16 chars. Hence result[16].

So why must the output format be %02x and nothing else? For a related reason: since MD5 is conventionally printed in hexadecimal, the problem becomes representing those 128 zeros and ones in hex. Every 4 binary bits map to one hexadecimal digit, so 32 hex digits are needed. If the high bits of a byte are all 0, a plain %x would drop them: the byte 00001111 printed with %x comes out as "f", and the leading 0 is lost. Printed with %02x it comes out as "0f", which is the correct conversion. That is where char[16] and %02x come from.

The 16-bit MD5 is actually produced like this: if the generated 32-character MD5 string is 01234567abcdefababcdefab76543210, then the 16-character MD5 string is abcdefababcdefab. That is, it simply takes the middle 16 characters.
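The byte-count and formatting argument above can be sketched in Python as an illustrative aside (not the iOS code; Python's standard hashlib plays the role of CC_MD5 here):

```python
import hashlib

# MD5 always yields a 16-byte (128-bit) digest; formatting each byte
# with "%02x" gives the familiar 32-character hexadecimal string.
digest = hashlib.md5("hello".encode("utf-8")).digest()
print(len(digest))   # 16 bytes = 128 bits

hex32 = "".join("%02x" % b for b in digest)
print(len(hex32))    # 32 hex characters
print(hex32)         # 5d41402abc4b2a76b9719d911017c592

# Why "%02x" and not "%x": a byte like 00001111 must keep its leading zero.
print("%x" % 0x0f)   # "f"  -> the leading 0 is lost
print("%02x" % 0x0f) # "0f" -> the correct two-digit form
```

Note that the manually formatted string matches what hashlib's own hexdigest() would return, which confirms the 16-bytes-to-32-hex-digits correspondence.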
In fact, this truncation is not part of the MD5 algorithm itself; it is secondary processing applied to the result of the MD5 algorithm. The same goes for other variants, such as "64-bit" outputs or uppercase versus lowercase digests: they are all post-processing of the MD5 output. The algorithm itself always produces the same 128 binary bits. The above is my brief understanding of 16-bit versus 32-bit MD5.
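As a quick illustration of that secondary processing, here is a minimal Python sketch (again using hashlib as a stand-in for CC_MD5) that derives the 16-character digest by slicing the middle of the 32-character one:

```python
import hashlib

# The "16-bit" (16-character) MD5 is just the middle of the 32-character
# hex digest: drop the first 8 and last 8 characters, i.e. slice [8:24].
full = hashlib.md5(b"hello").hexdigest()
short = full[8:24]
print(full)   # 5d41402abc4b2a76b9719d911017c592
print(short)  # bc4b2a76b9719d91
```

Applied to the article's example string 01234567abcdefababcdefab76543210, the same slice yields abcdefababcdefab, matching the truncation described above.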