How to Use the zlib module for Data Compression in Python


This article describes how to use the zlib module to compress and decompress data in Python. It covers basic usage and is aimed at readers getting started with Python.

The Python standard library includes several modules for data compression and decompression, such as zipfile, gzip, and bz2. I introduced the zipfile module last time; today I will cover the zlib module.

zlib.compress(string[, level])
zlib.decompress(string[, wbits[, bufsize]])

zlib.compress compresses a data stream. The string parameter is the data to compress, and the level parameter is the compression level, ranging from 1 to 9. Compression speed is inversely related to the compression ratio: 1 is the fastest but compresses the least, while 9 is the slowest but compresses the most. zlib.decompress decompresses data. Its string parameter is the data to decompress; wbits sets the size of the history (window) buffer, and bufsize sets the initial size of the output buffer. The following example shows how to use these two methods:


# coding=gbk
import zlib, urllib

fp = urllib.urlopen('http://localhost/default.html')   # the page to fetch
data = fp.read()
fp.close()

# ---- Compress the data stream.
str1 = zlib.compress(data, zlib.Z_BEST_COMPRESSION)
str2 = zlib.decompress(str1)

print len(data)
print len(str1)
print len(str2)

# ---- Result
# 5783
# 1531
# 5783
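To make the speed/ratio trade-off of the level parameter more concrete, here is a minimal sketch (not from the original article) that compresses the same synthetic, repetitive string at level 1 and at level 9 and prints the resulting sizes. The sample string is purely illustrative, and the code follows the article's Python 2 style.

# A small level-comparison sketch; the sample string below is illustrative only.
import zlib

sample = 'Python zlib compression level demo. ' * 200

fast = zlib.compress(sample, 1)                        # level 1: fastest, lowest ratio
best = zlib.compress(sample, zlib.Z_BEST_COMPRESSION)  # level 9: slowest, highest ratio

print 'original size:', len(sample)
print 'level 1 size: ', len(fast)
print 'level 9 size: ', len(best)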

We can also use Compress/Decompress objects to compress and decompress data. zlib.compressobj([level]) and zlib.decompressobj([wbits]) create a Compress object and a Decompress object, respectively. Compressing and decompressing data through these objects works much like zlib.compress and zlib.decompress described above. The difference shows up when operating on large amounts of data. To compress a very large file (hundreds of MB) with zlib.compress, we would first have to read the entire file into memory and then compress it, which uses far too much memory. With a Compress object there is no need to read all of the data at once: we can read a portion of the file, compress it, write the compressed output to the target file, then read and compress the next portion, repeating this loop until the whole file has been compressed. The following example demonstrates the difference:


# coding=gbk
import zlib, urllib

fp = urllib.urlopen('http://localhost/default.html')   # the URL being accessed
data = fp.read()
fp.close()

# ---- Compress the data stream in one call
str1 = zlib.compress(data, zlib.Z_BEST_COMPRESSION)
str2 = zlib.decompress(str1)
print 'Raw data length:', len(data)
print '-' * 30
print 'zlib.compress:', len(str1)
print 'zlib.decompress:', len(str2)
print '-' * 30

# ---- Use Compress/Decompress objects to compress and decompress the data stream
com_obj = zlib.compressobj(zlib.Z_BEST_COMPRESSION)
decom_obj = zlib.decompressobj()

str_obj = com_obj.compress(data)
str_obj += com_obj.flush()
print 'compressobj.compress:', len(str_obj)

str_obj1 = decom_obj.decompress(str_obj)
str_obj1 += decom_obj.flush()
print 'decompressobj.decompress:', len(str_obj1)
print '-' * 30

# ---- Use Compress/Decompress objects to compress and decompress the data in blocks
com_obj1 = zlib.compressobj(zlib.Z_BEST_COMPRESSION)
decom_obj1 = zlib.decompressobj()
chunk_size = 30

# Split the raw data into blocks
str_chunks = [data[i * chunk_size:(i + 1) * chunk_size]
              for i in range((len(data) + chunk_size) / chunk_size)]
str_obj2 = ''
for chunk in str_chunks:
    str_obj2 += com_obj1.compress(chunk)
str_obj2 += com_obj1.flush()
print 'Block compression:', len(str_obj2)

# Decompress the compressed data in blocks
str_chunks = [str_obj2[i * chunk_size:(i + 1) * chunk_size]
              for i in range((len(str_obj2) + chunk_size) / chunk_size)]
str_obj2 = ''
for chunk in str_chunks:
    str_obj2 += decom_obj1.decompress(chunk)
str_obj2 += decom_obj1.flush()
print 'Block decompression:', len(str_obj2)

# ---- Result ------------------------
Raw data length: 5783
------------------------------
zlib.compress: 1531
zlib.decompress: 5783
------------------------------
compressobj.compress: 1531
decompressobj.decompress: 5783
------------------------------
Block compression: 1531
Block decompression: 5783
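The block-by-block pattern above extends naturally to the large-file scenario described earlier, where each piece of compressed output is written to disk as soon as it is produced instead of being held in memory. The following is a minimal sketch of that approach, again in the article's Python 2 style; the file names input.bin and output.zlib are hypothetical.

import zlib

# Hypothetical file names, used only for illustration.
SOURCE_FILE = 'input.bin'
TARGET_FILE = 'output.zlib'
CHUNK_SIZE = 64 * 1024                      # read 64 KB of raw data at a time

com_obj = zlib.compressobj(zlib.Z_BEST_COMPRESSION)
fin = open(SOURCE_FILE, 'rb')
fout = open(TARGET_FILE, 'wb')
while True:
    chunk = fin.read(CHUNK_SIZE)
    if not chunk:                           # end of file reached
        break
    fout.write(com_obj.compress(chunk))     # write compressed output as it is produced
fout.write(com_obj.flush())                 # flush any data still buffered in the object
fin.close()
fout.close()

Only one chunk of raw data is held in memory at any time, which is the point of using a Compress object for very large files.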

The Python manual describes the zlib module in detail. For more specific applications, refer to the Python documentation.
