What is Entropy? (Machine Learning)

Source: Internet
Author: User
http://blog.csdn.net/ppn029012/article/details/8652047

Why does the concept of entropy appear in so many different fields of science (physics, computer vision, information theory, and so on)?

Here I will talk about entropy in physics and how it connects to entropy in information theory. To help with understanding and application, a few interesting examples of applying entropy follow at the end.


Entropy first appeared in physics. The German physicist Rudolf Clausius proposed the concept of entropy to describe how uniformly energy is distributed in space: the more evenly the energy is distributed, the greater the entropy.

A drop of ink dropped into water spreads out into a glass of light blue solution; hot water left in the air passes its heat to the air until the temperatures become equal. Notice that these processes, in which the energy distribution changes, are irreversible (you cannot expect the blue molecules in the water to spontaneously gather back into a drop of ink, or the heat dispersed into the air to flow back and bring the water to a boil), so the entropy of these systems is slowly increasing.
The second law of thermodynamics describes the laws governing these irreversible processes: all spontaneous processes in nature are irreversible. The increase of entropy is therefore a very general concept, showing that the development of the universe has a direction, namely the direction of increasing entropy. The forces (laws) that drive things toward higher entropy are called entropy forces (entropic forces). A few more examples from everyday life:

One example of an entropic force is the headphone cable: we put it into a pocket, and the next time we take it out it is a tangled mess. The invisible "force" that tangles the headphone cable is an entropic force; the cable "prefers" to become more disordered. Another concrete example of an entropic force is elasticity: the restoring force of a spring is an entropic force, and Hooke's law is one expression of it. Gravity may also be a kind of entropic force (a hotly debated topic). The clarification of muddy water is another example [1]. The end point of cosmic development is that entropy reaches its maximum and all matter reaches thermal equilibrium; in such a universe there is no longer any energy available to sustain motion or life (the heat death of the universe).

The above defines entropy from the point of view of energy distribution. From a microscopic point of view, entropy represents the degree of disorder of the system (related to the number of its microscopic states; for example, the energy level E of a particle can serve as its state). When all the microscopic particles are in a single state, the degree of disorder is zero. When three particles are in states 1, 2 and 3 respectively, the entropy of this system is k·ln(3) (following Boltzmann's formula S = k·ln W, where W counts the microscopic states). In short, the more microscopic states there are, the greater the entropy.
So from the microscopic point of view, entropy expresses the degree of uncertainty about the state of the system. Shannon borrowed the concept of entropy when describing an information system: there, entropy represents the average amount of information in the system (the average degree of uncertainty). When I tell you something, it helps you eliminate some uncertainty, and the amount of uncertainty eliminated is the amount of information conveyed. According to the calculation in [2], the entropy of commonly used Chinese characters is greater than the entropy of the English alphabet, so a Chinese text of a given length carries more meaning than an English text of the same length.
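To make the "average amount of information" concrete, here is a minimal Python sketch (my own illustration, not from the original post) that computes the Shannon entropy H = -Σ p_i·log2(p_i) of the symbol frequencies in a string; the example strings are made up and are not real linguistic data.

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Average information per symbol, in bits: H = -sum(p_i * log2(p_i))."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A text drawn from a larger alphabet (more uncertainty per symbol)
# has higher entropy than one drawn from a small alphabet.
print(shannon_entropy("abababab"))   # 1.0 bit per symbol (2 equally likely symbols)
print(shannon_entropy("abcdefgh"))   # 3.0 bits per symbol (8 equally likely symbols)
```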
In addition to information theory, many places have borrowed this lovely concept.
1. (Information compression and coding) Huffman coding: design the code so that each encoded symbol carries as much information as possible (the entropy, i.e. the average information per encoded symbol, is as large as possible), which makes transmission most efficient. A minimal coding sketch appears after this list.
2. (Computer vision) The paper in [3] uses differences in entropy to detect salient points in an image (for example, a hand against a wall). The principle is that salient feature points exhibit strong uncertainty (high entropy), so finding high-entropy points is likely to find what you want (salient features).

3. (Natural language processing) In translation, a sentence J may have n possible translations (f1, f2, ..., fn), together with some knowledge Z (for example, in this novel the translations f3 and f4 are more likely). You now have to build a model describing the probabilities (the probability distribution) of these n translations, and the best model is the one that gives this distribution the largest entropy. Put academically: on the basis of what is known, make the least committal (maximum-entropy) inference about what is unknown. This is the idea of the maximum entropy model; a small numerical illustration follows.
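As a small numerical illustration of this principle (my own sketch; the four candidate translations and their probabilities are invented), the Python snippet below shows that, among distributions which all encode the same piece of knowledge Z, the one that spreads the remaining probability evenly has the highest entropy:

```python
import math

def entropy(p):
    """Shannon entropy in bits; terms with zero probability are skipped."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Four candidate translations f1..f4. Suppose the only knowledge Z is that
# f3 and f4 together receive probability 0.6 (numbers invented for illustration).
candidates = {
    "spread evenly within the constraint": [0.2, 0.2, 0.3, 0.3],
    "arbitrarily favour f3":               [0.2, 0.2, 0.5, 0.1],
    "arbitrarily favour f1 and f3":        [0.3, 0.1, 0.5, 0.1],
}
for name, p in candidates.items():
    print(f"{name}: H = {entropy(p):.3f} bits")
# The evenly spread distribution has the highest entropy: it encodes the
# constraint and nothing else, which is exactly the maximum-entropy idea.
```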

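Returning to item 1 above, here is a minimal Huffman coding sketch in Python (my own illustration, using the standard heapq module; the input string is arbitrary). Frequent symbols receive short codewords, so the average code length approaches the entropy of the source:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix code by repeatedly merging the two least frequent subtrees."""
    freq = Counter(text)
    # Each heap entry: (total frequency, unique tie-breaker, {symbol: code-so-far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                     # degenerate case: one distinct symbol
        return {sym: "0" for sym in heap[0][2]}
    counter = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code("aaaabbbccd")
print(code)            # frequent symbols get shorter codes, e.g. 'a' -> '0', 'b' -> '10'
encoded = "".join(code[s] for s in "aaaabbbccd")
print(len(encoded), "bits for 10 symbols")   # 19 bits; a fixed 2-bit code would need 20
```
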
An interesting question is whether these systems also show something like the universe's tendency toward increasing entropy.
Or, by observing information systems whose entropy keeps increasing (language systems, for example, where entropy keeps growing because people always want to express information faster and more concisely), can we infer that the change is driven by some irreversible force? What would that force be?

[1] http://www.douban.com/group/topic/11462628/
[2] http://kilem3.wordpress.com/2006/09/11/%E4%BF%A1%E6%81%AF%E5%ad%a6%e4%b8%8a%e7%9a%84%e7%86%b5/
[3] Spatiotemporal Localization and Categorization of Human Actions in Unsegmented Image Sequences
