The Beauty of Mathematics, Chapter 6: The Measurement and Function of Information

1 Information Entropy

The amount of information in a message is directly related to its uncertainty: the amount of information equals the amount of uncertainty it removes.

How can information be measured quantitatively? Shannon's answer is information entropy, denoted by the symbol H and measured in bits. For a random variable X with distribution P(x):

$$H(X) = -\sum_{x} P(x)\,\log_2 P(x)$$
The greater the uncertainty of the variable, the greater the entropy.
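To make the definition concrete, here is a minimal Python sketch (not from the book; the `entropy` helper and the coin examples are illustrative) that computes a plug-in entropy estimate from samples:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Plug-in estimate of Shannon entropy H(X), in bits."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fair coin has the maximum uncertainty for two outcomes: 1 bit.
print(entropy(["heads", "tails"]))          # 1.0
# A heavily biased coin is far more predictable, so its entropy is lower.
print(entropy(["heads"] * 9 + ["tails"]))   # about 0.469
```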

If a book repeats a lot of content, its information content is small and its redundancy is large.

Redundancy varies greatly across languages, and Chinese has relatively low redundancy among them.
2 The Function of Information

Information and the elimination of uncertainty are intimately linked.

Information is the only way to eliminate a system's uncertainty (before any information is obtained, a system is like a black box; introducing information is how we come to understand the black box's internal structure).

The essence of Web search is also the process of using information to eliminate uncertainty.

The key to good search is the rational use of information, not playing with formulas and machine-learning algorithms.

Conditional entropy:

$$H(X \mid Y) = -\sum_{x,y} P(x,y)\,\log_2 P(x \mid y)$$

It can be proven that $H(X) \ge H(X \mid Y)$, which means that the uncertainty about X is reduced once the information of Y is known. In the statistical language model, if Y is taken to be the previous word, this proves mathematically that the bigram model has less uncertainty than the unigram model.

When does the equals sign above hold? Equality means that the added information does not reduce the uncertainty at all. This happens when the information we acquire has nothing to do with the thing being studied, that is, when X and Y are statistically independent.
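As a small illustration of the inequality $H(X) \ge H(X \mid Y)$, here is a Python sketch under toy assumptions: a made-up corpus and plug-in probability estimates, with the previous word playing the role of Y:

```python
from collections import Counter
from math import log2

def entropy(samples):
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def cond_entropy(pairs):
    """H(X|Y) = -sum over (y, x) of p(x, y) * log2 p(x | y)."""
    joint = Counter(pairs)
    y_marginal = Counter(y for y, _ in pairs)
    n = len(pairs)
    return -sum((c / n) * log2(c / y_marginal[y])
                for (y, x), c in joint.items())

words = "the cat sat on the mat and the dog sat on the rug".split()
pairs = list(zip(words[:-1], words[1:]))  # Y = previous word, X = current word

print(entropy(words[1:]))   # H(X)   ~ 2.86 bits: the next word on its own
print(cond_entropy(pairs))  # H(X|Y) ~ 0.67 bits: the previous word helps a lot
```

Even on this tiny corpus the conditional entropy drops well below the unigram entropy, mirroring the bigram-versus-unigram claim above.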

The role of information is to eliminate uncertainty, and a great many natural language processing problems come down to finding the relevant information.
3 Mutual Information

Shannon proposed the concept of "mutual information" as a quantitative measure of the "relevance" of two random events.

$$I(X;Y) = \sum_{x,y} P(x,y)\,\log_2 \frac{P(x,y)}{P(x)\,P(y)} = H(X) - H(X \mid Y)$$

The quantitative measure of the correlation of two events is thus the amount of information that knowing one of them (Y) provides toward eliminating the uncertainty of the other (X).
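A minimal Python sketch of the formula (the two joint distributions below are invented for illustration):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x, y) * log2(p(x, y) / (p(x) * p(y)))."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly correlated bits: knowing Y removes X's entire 1 bit of uncertainty.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))    # 1.0
# Independent bits: Y tells us nothing about X.
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                          (1, 0): 0.25, (1, 1): 0.25}))  # 0.0
```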
4 Relative Entropy

Relative entropy (Kullback-Leibler divergence) is also used to measure relatedness, but unlike the mutual information of two random variables, it measures the similarity of two positive-valued functions.

For two identical functions, the relative entropy is zero.

The greater the relative entropy, the greater the difference between the two functions; conversely, the smaller the relative entropy, the smaller the difference between the two functions.

For probability distributions or probability density functions, whose values are greater than zero, relative entropy measures the difference between two random distributions:

$$KL(f \,\|\, g) = \sum_{x} f(x)\,\log_2 \frac{f(x)}{g(x)}$$
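A minimal Python sketch of the definition (the two-point distributions `f` and `g` are illustrative):

```python
from math import log2

def kl_divergence(f, g):
    """Relative entropy KL(f || g) = sum over x of f(x) * log2(f(x) / g(x))."""
    return sum(p * log2(p / g[x]) for x, p in f.items() if p > 0)

f = {"a": 0.5, "b": 0.5}
g = {"a": 0.9, "b": 0.1}

print(kl_divergence(f, f))  # 0.0: identical distributions
print(kl_divergence(f, g))  # about 0.737: the distributions differ
print(kl_divergence(g, f))  # about 0.531: note that KL is not symmetric
```

The last two lines show that relative entropy is not a true distance: swapping the two distributions changes the value.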