Compress an image so that its size is smaller than a specified number of KB.
The goal is a function that compresses an image until it is smaller than a specified number of KB, the kind of feature you see in tools such as PS and QQ.
So far I have only managed a makeshift way of controlling image size; the usual approach is to adjust the resolution and the quality.
Assume the target size after compression is 100 KB: we want the image quality to be as high as possible while the size stays no greater than 100 KB, and the higher the quality, the larger the file. However, there is no fixed formula such as y = nx relating size to quality. I tried saving a picture under Windows 7 ten times, at quality 10 up to 100, and the only conclusion I could draw is that higher quality means a larger file.
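For reference, that quality-versus-size experiment can be reproduced with something like the sketch below. This is my own illustration rather than code from the original post; the file name sample.jpg is only a placeholder.

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;

class QualityProbe
{
    static void Main()
    {
        using (var img = Image.FromFile("sample.jpg")) // placeholder input file
        {
            // Find the JPEG encoder once, then re-save the same image at quality 10..100
            var jpegCodec = ImageCodecInfo.GetImageEncoders()
                .First(c => c.FormatID == ImageFormat.Jpeg.Guid);

            for (long quality = 10; quality <= 100; quality += 10)
            {
                using (var parms = new EncoderParameters(1))
                using (var ms = new MemoryStream())
                {
                    parms.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, quality);
                    img.Save(ms, jpegCodec, parms);
                    Console.WriteLine("quality {0,3}: {1} KB", quality, ms.Length / 1024);
                }
            }
        }
    }
}

In my experience the output size grows with the quality setting but not linearly, which is exactly why a closed-form mapping from target size to quality is not available.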
Since there is no formula, the only option is to search for the quality parameter that best satisfies the 100 KB limit, and binary search is a natural fit.
// Requires: using System; using System.Drawing; using System.Drawing.Imaging; using System.IO;

/// <summary>
/// Compress the image to less than the specified number of KB
/// </summary>
/// <param name="img">source image</param>
/// <param name="format">image format</param>
/// <param name="targetLen">target size after compression, in KB</param>
/// <param name="srcLen">original size in bytes (optional)</param>
/// <returns>memory stream holding the compressed image</returns>
public static MemoryStream Zip(Image img, ImageFormat format, long targetLen, long srcLen = 0)
{
    // Tolerance below the target size, default 10 KB
    const long nearlyLen = 10240;

    // The memory stream that will be returned; it is also used to measure
    // the source size when the caller did not pass one in
    var ms = new MemoryStream();
    if (0 == srcLen)
    {
        img.Save(ms, format);
        srcLen = ms.Length;
    }

    // Convert the target from KB to bytes; if the target is no smaller than
    // the source image, return the source as-is
    targetLen *= 1024;
    if (targetLen >= srcLen)
    {
        ms.SetLength(0);
        ms.Position = 0;
        img.Save(ms, format);
        return ms;
    }

    // Lower bound of the acceptable size window
    var exitLen = targetLen - nearlyLen;

    // Initial quality guess proportional to the target/source ratio
    var quality = (long)Math.Floor(100.00 * targetLen / srcLen);
    var parms = new EncoderParameters(1);

    // Look up the encoder for the requested format
    ImageCodecInfo formatInfo = null;
    var encoders = ImageCodecInfo.GetImageEncoders();
    foreach (ImageCodecInfo icf in encoders)
    {
        if (icf.FormatID == format.Guid)
        {
            formatInfo = icf;
            break;
        }
    }

    // Binary search for the closest quality parameter
    long startQuality = quality;
    long endQuality = 100;
    quality = (startQuality + endQuality) / 2;
    while (true)
    {
        // Set the quality
        parms.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, quality);

        // Clear the memory stream and save the image with the current quality
        ms.SetLength(0);
        ms.Position = 0;
        img.Save(ms, formatInfo, parms);

        if (ms.Length >= exitLen && ms.Length <= targetLen)
        {
            // The compressed size is inside the target window: done
            break;
        }
        else if (startQuality >= endQuality)
        {
            // The interval has collapsed: give up with the current result
            break;
        }
        else if (ms.Length < exitLen)
        {
            // Too small: move the lower bound up
            startQuality = quality;
        }
        else
        {
            // Too large: move the upper bound down
            endQuality = quality;
        }

        // Recompute the quality; if it no longer changes, stop the search to
        // avoid looping forever on intervals such as {start:16, end:18} and {start:16, end:17}
        var newQuality = (startQuality + endQuality) / 2;
        if (newQuality == quality)
        {
            break;
        }
        quality = newQuality;
        Console.WriteLine("start:{0} end:{1} current:{2}", startQuality, endQuality, quality);
    }
    return ms;
}
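A call site might look like the following sketch. The containing class name ImageZipper and the file paths are placeholders of my own; the target size is given in KB because the method converts it to bytes internally.

// Hypothetical usage: ImageZipper and the file paths are placeholders.
using (var img = Image.FromFile("input.jpg"))
using (var compressed = ImageZipper.Zip(img, ImageFormat.Jpeg, 100))
{
    File.WriteAllBytes("output.jpg", compressed.ToArray());
}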
In testing, each image takes close to 1 second to process. That is good enough for the requirement, but I keep feeling there should be a better approach. If anyone has ideas, please share them so we can discuss!