Understanding probability from an entity-relationship point of view:
Each variable is associated with an event; the variable depends on the event for its existence, and the two entities are in a one-to-two relationship;
Each event is associated with a trial (experiment); the event depends on the trial for its existence, and the two entities are in a many-to-one relationship;
Let S be the set of values a variable can take. If a mapping is defined on S and that mapping satisfies the properties of a probability distribution, then the variable is called a random variable with a defined probability distribution;
A random variable and a probability distribution are two entities that exist independently of each other, but a random variable can obey a probability distribution, and the two entities are in a many-to-one relationship (a small code sketch of these relationships follows below);
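As an illustration only, here is a minimal Python sketch of these entities and their relationships; the class and attribute names (Trial, Event, RandomVariable, obey) are hypothetical and not taken from any library:

    class Trial:
        """A trial (experiment); many events can belong to one trial."""
        def __init__(self, name):
            self.name = name
            self.events = []              # event -> trial is many-to-one

    class Event:
        """An event; it exists only as part of some trial."""
        def __init__(self, description, trial):
            self.description = description
            self.trial = trial
            trial.events.append(self)

    class RandomVariable:
        """A variable whose value set S can carry a probability distribution."""
        def __init__(self, event, values):
            self.event = event            # the event this variable is associated with
            self.values = values          # the set S of possible values
            self.distribution = None      # variable -> distribution is many-to-one

        def obey(self, distribution):
            # The mapping on S must satisfy the properties of a probability distribution.
            assert all(p >= 0 for p in distribution.values())
            assert abs(sum(distribution.values()) - 1.0) < 1e-9
            self.distribution = distribution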
Any entity can be information:
Given an entity E, if the presence of E makes a random variable obey a different probability distribution, then E is called information associated with that random variable;
Each probability distribution determines a quantity called the information entropy of the distribution (see the formula below);
If a random variable x obeys a probability distribution p, and some information associated with x is given so that x comes to obey a new probability distribution, then the greater the difference between the information entropy of the new distribution and that of the old distribution, the greater the amount of information the entity contains.
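The formula referred to above is presumably the standard Shannon entropy. For a distribution p over the value set S:

    H(p) = - Σ_{s ∈ S} p(s) · log p(s)

Under the definition above, the amount of information an entity carries grows with the difference |H(p_new) - H(p_old)| between the entropies of the new and the old distribution.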
Further explanations:
In every measurement or trial a series of events occurs, and each occurrence of an event must belong to some trial. Events belonging to different trials may or may not be correlated. Given two trials, if every pair of events, one from each trial, is uncorrelated, then the two trials are independent trials.
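For example, take two trials: tossing one coin and tossing a second coin. If every pair of events, one from each trial, satisfies P(A and B) = P(A) · P(B), for instance P(first coin shows heads and second coin shows heads) = 0.5 × 0.5 = 0.25, then the two trials are independent.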
Example:
For example, suppose we want to explore the question of which direction the sun rises from. First of all there must be a trial: one day; an event: the sun rises in a certain direction; a variable: with values East, West, South, North; and a probability distribution: {East: 1, West: 0, South: 0, North: 0};
If we are now given some entity (an article, a string, a picture, a physics experiment, or any other entity) whose appearance makes the variable obey another probability distribution, {East: 0.25, West: 0.25, South: 0.25, North: 0.25}, then this entity is information about the variable, and it carries a very large amount of information, as the sketch below makes concrete.
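A minimal sketch of the comparison, assuming base-2 logarithms and treating 0 · log 0 as 0 (the function name entropy is my own):

    import math

    def entropy(dist):
        """Shannon entropy in bits; terms with probability 0 contribute 0."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    old = {"East": 1.0, "West": 0.0, "South": 0.0, "North": 0.0}
    new = {"East": 0.25, "West": 0.25, "South": 0.25, "North": 0.25}

    print(entropy(old))                       # 0.0 bits: the old outcome was certain
    print(entropy(new))                       # 2.0 bits: the new outcome is maximally uncertain
    print(abs(entropy(new) - entropy(old)))   # a difference of 2.0 bits, a large amount of information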
Understanding trials (measurements), events, random variables, values, probability distributions, information, and information entropy