LZO usage and introduction
Summary
LZO is a lossless compression library written in ANSI C. It provides very fast compression and decompression; decompression requires no extra memory. Even when data is compressed slowly for a higher compression ratio, it can still be decompressed very quickly. LZO is distributed under the terms of the GNU GPL.
Introduction
LZO is very well suited to real-time data compression and decompression; that is, it favors processing speed over compression ratio.
LZO is written in ANSI C, and the compressed data format is designed to be portable across platforms.
LZO has the following features:
- Decompression is simple and very fast;
- No extra memory is required for decompression;
- Compression is reasonably fast;
- Compression requires only 64 KiB of memory;
- The compression ratio can be adjusted as needed without affecting decompression speed; raising the compression ratio naturally slows down compression;
- Several compression levels and many algorithm variants are provided;
- A compression level that needs only 8 KiB of memory is also provided;
- The algorithms are thread safe;
- The compression is lossless;
- LZO supports overlapping compression and in-place decompression.
Design Criteria
LZO was designed with processing speed in mind, decompression speed above compression speed; real-time decompression should be possible for virtually any application. The LZO1X decompressor also has an implementation optimized for i386 assembly.
In fact, the compressed data format defined by each algorithm has been verified by decompressing the data and checking the result against manually compressed test data.
Performance
The results below were measured on an ancient Pentium 133 machine, using the Calgary Corpus test suite with a block size of 256 KiB.
LZOxx-N denotes the algorithm name, where N is the compression level. Levels 1-9 use 64 KiB of memory and are mainly aimed at fast compression. Level 99 uses 256 KiB of memory and gives a higher compression ratio while still compressing quite fast. Level 999 is optimized purely for compression ratio: compression is very slow and uses a lot of memory, so this level is mainly intended for generating pre-compressed data.
The C version of LZO1X-1 compresses about 4-5 times faster than the fastest zlib compression level, and it also compares favorably with algorithms such as LZRW1-A and LZV in compression ratio, compression time, and decompression time.
Algorithm     Length  CxB  ComLen  %Remn  Bits   Com K/s   Dec K/s
---------     ------  ---  ------  -----  ----   -------   -------

memcpy()      224401    1  224401  100.0  8.00  60956.83  59124.58

LZO1-1        224401    1  117362   53.1  4.25   4665.24  13341.98
LZO1-99       224401    1  101560   46.7  3.73   1373.29  13823.40

LZO1A-1       224401    1  115174   51.7  4.14   4937.83  14410.35
LZO1A-99      224401    1   99958   45.5  3.64   1362.72  14734.17

LZO1B-1       224401    1  109590   49.6  3.97   4565.53  15438.34
LZO1B-2       224401    1  106235   48.4  3.88   4297.33  15492.79
LZO1B-3       224401    1  104395   47.8  3.83   4018.21  15373.52
LZO1B-4       224401    1  104828   47.4  3.79   3024.48  15100.11
LZO1B-5       224401    1  102724   46.7  3.73   2827.82  15427.62
LZO1B-6       224401    1  101210   46.0  3.68   2615.96  15325.68
LZO1B-7       224401    1  101388   46.0  3.68   2430.89  15361.47
LZO1B-8       224401    1   99453   45.2  3.62   2183.87  15402.77
LZO1B-9       224401    1   99118   45.0  3.60   1677.06  15069.60
LZO1B-99      224401    1   95399   43.6  3.48   1286.87  15656.11
LZO1B-999     224401    1   83934   39.1  3.13    232.40  16445.05

LZO1C-1       224401    1  111735   50.4  4.03   4883.08  15570.91
LZO1C-2       224401    1  108652   49.3  3.94   4424.24  15733.14
LZO1C-3       224401    1  106810   48.7  3.89   4127.65  15645.69
LZO1C-4       224401    1  105717   47.7  3.82   3007.92  15346.44
LZO1C-5       224401    1  103605   47.0  3.76   2829.15  15153.88
LZO1C-6       224401    1  102585   46.5  3.72   2631.37  15257.58
LZO1C-7       224401    1  101937   46.2  3.70   2378.57  15492.49
LZO1C-8       224401    1  100779   45.6  3.65   2171.93  15386.07
LZO1C-9       224401    1  100255   45.4  3.63   1691.44  15194.68
LZO1C-99      224401    1   97252   44.1  3.53   1462.88  15341.37
LZO1C-999     224401    1   87740   40.2  3.21    306.44  16411.94

LZO1F-1       224401    1  113412   50.8  4.07   4755.97  16074.12
LZO1F-999     224401    1   89599   40.3  3.23    280.68  16553.90

LZO1X-1(11)   224401    1  118810   52.6  4.21   4544.42  15879.04
LZO1X-1(12)   224401    1  113675   50.6  4.05   4411.15  15721.59
LZO1X-1       224401    1  109323   49.4  3.95   4991.76  15584.89
LZO1X-1(15)   224401    1  108500   49.1  3.93   5077.50  15744.56
LZO1X-999     224401    1   82854   38.0  3.04    135.77  16548.48

LZO1Y-1       224401    1  110820   49.8  3.98   4952.52  15638.82
LZO1Y-999     224401    1   83614   38.2  3.05    135.07  16385.40

LZO1Z-999     224401    1   83034   38.0  3.04    133.31  10553.74

LZO2A-999     224401    1   87880   40.0  3.20    301.21   8115.75
Note:
- CxB is the number of blocks;
- K/s is the speed measured in kilobytes of uncompressed data per second;
- The optimized assembly decompressors are even faster;
Short Documentation
LZO is a block compression algorithm: it compresses and decompresses one block of data at a time. The block size must be the same for compression and decompression.
LZO compresses a block of data into matches (using a sliding dictionary) and runs of non-matching literals. It has special handling for long matches and long literal runs, so it gives very good results on highly redundant data and still behaves acceptably on data that cannot be compressed.
When processing incompressible data, LZO expands the input block by at most 64 bytes per 1024 bytes of input.
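To size an output buffer that covers this worst case, the LZO FAQ gives a simple bound for the LZO1X family. A minimal sketch (the helper name is just illustrative, not part of the library):

    #include <lzo/lzoconf.h>   /* for lzo_uint */

    /* Worst-case compressed size of one block of incompressible input:
     * about 64 extra bytes per 1024 input bytes plus a small constant,
     * following the bound quoted in the LZO FAQ. */
    static lzo_uint
    worst_case_compressed_size(lzo_uint input_block_size)
    {
        return input_block_size + (input_block_size / 16) + 64 + 3;
    }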
I have verified LZO with memory checking tools such as Valgrind, exercised it with large amounts of data under many different parameter settings, and checked for all kinds of potential problems. LZO currently has no known bugs.
Various Algorithms
Many algorithms are implemented here, but because I want to guarantee unlimited backward compatibility, none of the existing algorithms will be removed in the future.
Since the algorithms are built as relatively independent object files, statically linking against the LZO library adds only a small amount of code (a few KiB) to your program, and a module is pulled in only when the corresponding LZO function is actually used.
LZO1 and LZO1A were published in February/March 1996 in the newsgroups comp.compression and comp.compression.research; they are kept mainly for compatibility reasons. LZO2A decompresses quite slowly and does not provide a fast compressor.
In my experiments, LZO1B works well for large amounts of data or highly redundant data, LZO1F works well for small amounts of data and for binary data, and LZO1X is a good choice in almost any situation. LZO1Y and LZO1Z are very similar to LZO1X; on some files they achieve a better compression ratio.
As you can see, there are plenty of options to choose from.
Use the LZO Library
Regardless of how much data you want to process, the basic functions of the LZO library are very simple to use. Let's assume you want to compress your data with the LZO1X-1 algorithm.
Compression
1. Include <lzo/lzo1x.h>
2. Call lzo_init() at program startup
3. Compress your data with lzo1x_1_compress()
4. Compile and link your program against the LZO library
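A minimal sketch of these steps, loosely modeled on examples/simple.c from the source package; the block size, the zero-filled input buffer, and the error handling are illustrative assumptions:

    #include <stdio.h>
    #include <lzo/lzo1x.h>

    #define IN_LEN   (128 * 1024L)                     /* example block size (assumption) */
    #define OUT_LEN  (IN_LEN + IN_LEN / 16 + 64 + 3)   /* worst-case compressed size */

    static unsigned char in[IN_LEN];                   /* placeholder input data (all zeros) */
    static unsigned char out[OUT_LEN];                 /* buffer for the compressed block */

    /* Work memory for LZO1X-1; lzo_align_t keeps it properly aligned. */
    static lzo_align_t wrkmem[(LZO1X_1_MEM_COMPRESS + sizeof(lzo_align_t) - 1) / sizeof(lzo_align_t)];

    int main(void)
    {
        lzo_uint in_len = IN_LEN;
        lzo_uint out_len;

        /* Step 2: initialize the library (also checks for version mismatches). */
        if (lzo_init() != LZO_E_OK) {
            fprintf(stderr, "lzo_init() failed\n");
            return 1;
        }

        /* Step 3: compress one block of data with LZO1X-1. */
        if (lzo1x_1_compress(in, in_len, out, &out_len, wrkmem) != LZO_E_OK) {
            fprintf(stderr, "compression failed\n");
            return 1;
        }
        printf("compressed %lu bytes into %lu bytes\n",
               (unsigned long) in_len, (unsigned long) out_len);
        return 0;
    }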
Decompression
1. Include <lzo/lzo1x.h>
2. Call lzo_init() at program startup
3. Decompress your data with lzo1x_decompress()
4. Compile and link your program against the LZO library
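Continuing the compression sketch above inside the same main(), the compressed block can be restored as follows; the buffer and variable names are illustrative. For untrusted input you would normally prefer lzo1x_decompress_safe() instead:

    /* Restore the block just compressed. lzo1x_decompress() does not use
     * work memory, so NULL is passed for the wrkmem parameter. */
    static unsigned char restored[IN_LEN];
    lzo_uint new_len;

    if (lzo1x_decompress(out, out_len, restored, &new_len, NULL) != LZO_E_OK
            || new_len != in_len) {
        fprintf(stderr, "decompression failed\n");
        return 1;
    }
    printf("decompressed back to %lu bytes\n", (unsigned long) new_len);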
Complete sample code is provided in examples/simple.c in the source package. See LZO.FAQ for more information.
Original file:
- http://www.oberhumer.com/opensource/lzo/lzodoc.php
Original LZO FAQ file:
- http://www.oberhumer.com/opensource/lzo/lzofaq.php