Homework 2 (Huang)


Reference: Introduction to Data Compression (4th Edition), page 66.

3-2 Use the program Huff_enc to do the following (in each case, use the codebook generated from the image being compressed).

(a) Encode the Sena, Sensin, and Omaha images.

Answer:

Sena: before compression 64.0 KB (65,536 bytes); after compression 56.1 KB (57,503 bytes); compression ratio 88%

Sensin: before compression 64.0 KB (65,536 bytes); after compression 60.2 KB (61,649 bytes); compression ratio 94%

Omaha: before compression 64.0 KB (65,536 bytes); after compression 56.1 KB (57,503 bytes); compression ratio 89%
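
The ratios above are reported as compressed size divided by original size, in percent. A minimal Python sketch of that calculation, using the byte counts listed for Sena and Sensin:

# Compression ratio as reported above: compressed size / original size, in percent.
def compression_ratio(original_bytes: int, compressed_bytes: int) -> float:
    return 100.0 * compressed_bytes / original_bytes

for name, orig, comp in [("Sena", 65536, 57503), ("Sensin", 65536, 61649)]:
    print(f"{name}: {compression_ratio(orig, comp):.0f}%")  # Sena: 88%, Sensin: 94%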

3-4 A source emits letters from the alphabet A = {a1, a2, a3, a4, a5} with probabilities P(a1) = 0.15, P(a2) = 0.04, P(a3) = 0.26, P(a4) = 0.05, P(a5) = 0.50.

(a) Calculate the entropy of the source.

(b) Find a Huffman code for this source.

(c) Find the average codeword length and the redundancy of the code in (b).

Solution: (a) The entropy of the source is:

H = -P(a1) log2 P(a1) - P(a2) log2 P(a2) - P(a3) log2 P(a3) - P(a4) log2 P(a4) - P(a5) log2 P(a5)

= -0.15 log2(0.15) - 0.04 log2(0.04) - 0.26 log2(0.26) - 0.05 log2(0.05) - 0.5 log2(0.5)

= 0.4105 + 0.1858 + 0.5053 + 0.2161 + 0.5

= 1.818 bits/symbol
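
A quick numerical check of this entropy calculation, as a minimal Python sketch (the symbol names a1..a5 follow the problem statement):

import math

# Entropy of the source in 3-4(a): H = -sum(P(ai) * log2 P(ai)).
probs = {"a1": 0.15, "a2": 0.04, "a3": 0.26, "a4": 0.05, "a5": 0.50}
H = -sum(p * math.log2(p) for p in probs.values())
print(f"H = {H:.3f} bits/symbol")               # H = 1.818 bits/symbol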

(b) A Huffman code for this source:

a1: 001
a2: 0000
a3: 01
a4: 0001
a5: 1
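
Below is a minimal Python sketch of the Huffman construction for these probabilities. It is not the book's Huff_enc program, and a different tie-breaking order can swap 0s and 1s relative to the codewords above, but the codeword lengths (and therefore the average length) are the same.

import heapq
from itertools import count

# Minimal Huffman construction for the probabilities in 3-4.
probs = {"a1": 0.15, "a2": 0.04, "a3": 0.26, "a4": 0.05, "a5": 0.50}

tie = count()                                   # unique tie-breaker for the heap
heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
heapq.heapify(heap)

while len(heap) > 1:
    p0, _, code0 = heapq.heappop(heap)          # two least probable subtrees
    p1, _, code1 = heapq.heappop(heap)
    merged = {s: "0" + c for s, c in code0.items()}
    merged.update({s: "1" + c for s, c in code1.items()})
    heapq.heappush(heap, (p0 + p1, next(tie), merged))

for symbol, codeword in sorted(heap[0][2].items()):
    print(symbol, codeword)                     # lengths: a1=3, a2=4, a3=2, a4=4, a5=1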

                      

(c) The average codeword length is:

L = P(a1) l(a1) + P(a2) l(a2) + P(a3) l(a3) + P(a4) l(a4) + P(a5) l(a5)

= 0.15*3 + 0.04*4 + 0.26*2 + 0.05*4 + 0.5*1

= 1.83 bits/symbol

The redundancy is L - H = 1.83 - 1.818 = 0.012 bits/symbol.
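
As a check on (c), a short Python sketch recomputing the average length and redundancy from the probabilities and codeword lengths above:

import math

# Average codeword length and redundancy for the code in 3-4(b).
probs   = {"a1": 0.15, "a2": 0.04, "a3": 0.26, "a4": 0.05, "a5": 0.50}
lengths = {"a1": 3,    "a2": 4,    "a3": 2,    "a4": 4,    "a5": 1}

L = sum(probs[s] * lengths[s] for s in probs)           # 1.83 bits/symbol
H = -sum(p * math.log2(p) for p in probs.values())      # 1.818 bits/symbol
print(f"L = {L:.2f} bits/symbol, redundancy = {L - H:.3f} bits/symbol")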

2-6 The data set accompanying the book contains several image and speech files.

(a) Write a program to compute the first-order entropy of some of the image and speech files.

(b) Pick one of the image files and compute its second-order entropy. Explain any differences between the first-order and second-order entropies.

(c) For the image file used in (b), compute the entropy of the differences between neighboring pixels. Explain your findings.

Solution:

(a)

Filename     First-order entropy   Second-order entropy   Difference entropy
sena.img     6.834299              3.625204               3.856899
sensin.img   7.317944              4.301673               4.541547
omaha.img    6.942426              4.488626               6.286834
earth.img    4.770801              2.568358               3.962697
gabe.raw     7.116338              6.654578               8.978236
berk.raw     7.151537              6.705169               8.976150

(b) The second-order entropy is smaller than the first-order entropy: neighboring samples are correlated, so modeling them in pairs captures part of that dependence and lowers the per-sample entropy estimate.

(c) The entropy of the neighboring-pixel differences is larger than the second-order entropy for every file. For the .img files it is still well below the first-order entropy, while for the .raw files it exceeds the first-order entropy as well.
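
A minimal Python sketch of the kind of program asked for in (a), assuming 256 x 256, 8-bit raw image files; the file name sena.img, the image dimensions, and the use of horizontal pairs and differences are assumptions, and the second-order figure is the pair entropy divided by two, which is the convention the table above appears to use.

import numpy as np

# Sketch of problem 2-6: first-order, second-order, and difference entropies.
# Assumes a 256x256, 8-bit raw image such as sena.img from the book's data set.
def entropy_from_counts(counts):
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

img = np.fromfile("sena.img", dtype=np.uint8).reshape(256, 256)

# First-order entropy: histogram of individual pixel values.
h1 = entropy_from_counts(np.bincount(img.ravel(), minlength=256))

# Second-order entropy: histogram of horizontally adjacent pixel pairs,
# expressed per pixel by dividing the pair entropy by two.
pairs = img[:, :-1].astype(np.int64) * 256 + img[:, 1:]
h2 = entropy_from_counts(np.bincount(pairs.ravel(), minlength=256 * 256)) / 2

# Entropy of the differences between horizontally neighboring pixels.
diffs = img[:, 1:].astype(np.int64) - img[:, :-1]
_, counts = np.unique(diffs, return_counts=True)
hd = entropy_from_counts(counts)

print(f"first-order {h1:.3f}   second-order {h2:.3f}   difference {hd:.3f}")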

