Brief introduction:
GZipStream represents the gzip data format, which uses an industry-standard algorithm for lossless compression and decompression of files. The format includes a cyclic redundancy check (CRC) value for detecting data corruption. Gzip uses the same compression algorithm as the DeflateStream class, but the format can be extended to use other compression methods, and it can be implemented without infringing any patents.
In real use, transmitting large amounts of data over the network uncompressed is often intolerable; after compressing with GZipStream, my traffic immediately dropped by about 80%, mainly because ASCII text has a relatively high compression ratio.
GZipStream lives in the System.IO.Compression namespace.
Compression code
public byte[] Compress(byte[] data)
{
    MemoryStream baseStream = new MemoryStream();
    // leaveOpen: true so baseStream survives disposal of the GZipStream;
    // disposing the GZipStream flushes and finishes the gzip output
    using (GZipStream compressStream = new GZipStream(baseStream, CompressionMode.Compress, true))
    {
        compressStream.Write(data, 0, data.Length);
    }
    // ToArray returns only the bytes actually written;
    // GetBuffer would also include unused internal buffer capacity
    return baseStream.ToArray();
}
Decompression code
public System.IO.StringReader Decompress(byte[] data)
{
    MemoryStream stream = new MemoryStream(data);
    GZipStream zip = new GZipStream(stream, CompressionMode.Decompress);
    StreamReader reader = new StreamReader(zip);
    // Read the decompressed text and hand it back as a StringReader
    return new StringReader(reader.ReadToEnd());
}
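A minimal round-trip sketch of how the two helpers above might be used together. The class wrapper, sample text, and console output are illustrative; the Compress/Decompress signatures follow the snippets above.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class GzipDemo
{
    public static byte[] Compress(byte[] data)
    {
        MemoryStream baseStream = new MemoryStream();
        using (GZipStream zip = new GZipStream(baseStream, CompressionMode.Compress, true))
        {
            zip.Write(data, 0, data.Length);
        }
        return baseStream.ToArray();
    }

    public static StringReader Decompress(byte[] data)
    {
        GZipStream zip = new GZipStream(new MemoryStream(data), CompressionMode.Decompress);
        StreamReader reader = new StreamReader(zip);
        return new StringReader(reader.ReadToEnd());
    }

    static void Main()
    {
        // Highly repetitive text compresses extremely well
        string text = new string('a', 10000);
        byte[] packed = Compress(Encoding.UTF8.GetBytes(text));
        string restored = Decompress(packed).ReadToEnd();

        Console.WriteLine($"{text.Length} chars -> {packed.Length} bytes");
        Console.WriteLine(restored == text);  // round trip is lossless
    }
}
```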
In fact, only larger payloads show a significant compression ratio; compressing a very small buffer can actually produce output larger than the input, because the gzip header and CRC trailer add fixed overhead. As a rule of thumb, inputs over roughly 100 bytes compress usefully; below that, there is no need to compress.
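A quick sketch illustrating that overhead (exact byte counts vary by runtime and compression level; the GzipSize helper is illustrative):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class SmallPayloadDemo
{
    // Returns the gzip-compressed size of a string, in bytes
    public static int GzipSize(string s)
    {
        MemoryStream ms = new MemoryStream();
        using (GZipStream zip = new GZipStream(ms, CompressionMode.Compress, true))
        {
            byte[] bytes = Encoding.UTF8.GetBytes(s);
            zip.Write(bytes, 0, bytes.Length);
        }
        return (int)ms.Length;
    }

    static void Main()
    {
        // The gzip header and trailer alone cost about 18 bytes,
        // so a tiny input grows after "compression"...
        Console.WriteLine($"5-byte input -> {GzipSize("hello")} bytes");
        // ...while a large repetitive input shrinks dramatically
        Console.WriteLine($"10240-byte input -> {GzipSize(new string('a', 10240))} bytes");
    }
}
```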
In my own use, data that was 3 MB before compression came out at about 50 KB afterward, a very noticeable saving.