Markov chain and its steady state


A Markov chain is defined as a sequence of random variables X_0, X_1, X_2, … with the property that

P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i)

It is easy to see from the definition that the next state of a Markov chain depends only on the current state. For example, when forecasting tomorrow's weather, we consider only today's weather conditions, regardless of yesterday's.
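This memorylessness is easy to see in code. Below is a minimal sketch of a two-state weather chain; the transition probabilities here are made up purely for illustration. Note that sampling the next state uses only the current state, never the history.

```python
import random

# Hypothetical transition probabilities (illustrative values only).
# Rows are the current state; entries give the probability of each next state.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample tomorrow's weather from today's alone -- the Markov property."""
    r = random.random()
    cumulative = 0.0
    for state, prob in P[current].items():
        cumulative += prob
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

# A short simulated trajectory: each step looks only at the previous state.
state = "sunny"
trajectory = [state]
for _ in range(7):
    state = next_state(state)
    trajectory.append(state)
print(trajectory)
```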

Here is a concrete example. Sociologists divide people into three classes by economic condition: lower, middle, and upper, which we label 1, 2, and 3. They have found that the most important factor determining a person's income class is the income class of their parents. If a person belongs to the lower class, the probability that their child belongs to the lower class is 0.65, to the middle class 0.28, and to the upper class 0.07; similar transition probabilities apply to the other classes. Together, the parent-to-child transition probabilities between income classes form a matrix.

We write this transition matrix as P; its first row, for lower-class parents, is (0.65, 0.28, 0.07).

Assume the class distribution of the first generation is some probability vector π_0. Iterating the transition over the first 10 generations, the class distribution settles down.

We can see that, under a fixed transition matrix, the state distribution eventually stabilizes. For the nth generation, the class distribution satisfies π_n = π_{n-1}P = π_0 P^n. From this expression, π is a row vector and P a matrix; once P has been raised to a sufficiently high power, the product no longer changes appreciably.
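This convergence is easy to check numerically. The text states only the first row of P (lower-class parents); the middle- and upper-class rows below are the values commonly quoted for this classic example, so treat them as an assumption. A minimal sketch:

```python
import numpy as np

# Transition matrix between income classes (lower, middle, upper).
# Row 1 is given in the text; rows 2-3 are the commonly quoted values
# for this classic example and are an assumption here.
P = np.array([
    [0.65, 0.28, 0.07],   # lower-class parents
    [0.15, 0.67, 0.18],   # middle-class parents (assumed)
    [0.12, 0.36, 0.52],   # upper-class parents (assumed)
])

# Two very different initial class distributions.
pi_a = np.array([1.0, 0.0, 0.0])
pi_b = np.array([0.1, 0.3, 0.6])

# Iterate pi_n = pi_{n-1} P; both starting points converge to the same vector.
for _ in range(100):
    pi_a = pi_a @ P
    pi_b = pi_b @ P

print(np.round(pi_a, 3))
print(np.round(pi_b, 3))
```

Both printouts agree: the limiting distribution is independent of the starting vector.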

When a distribution π is invariant under the transition matrix P, that is, π = πP, we call π the stationary distribution of the Markov chain. This behavior is captured by the following elegant theorem:

Theorem. If a non-periodic Markov chain has a transition matrix P under which any two states are connected, then lim_{n→∞} (P^n)_{ij} exists and is independent of i. Writing this limit as π(j), the vector π = (π(1), π(2), …) satisfies π = πP and Σ_j π(j) = 1, and it is the unique nonnegative solution of π = πP.

Let me explain the theorem above intuitively.

Conditions

(1) Non-periodic Markov chain: for the chain to converge, it must not be periodic. Without special construction, the chains we deal with in practice are mostly non-periodic, so we do not elaborate further here.

(2) There is a probability transition matrix P under which any two states are connected: here "connected" need not mean directly connected; it is enough that one state can reach another in a finite number of transitions. For example, for states a, b, c, if there are transitions a→b and b→c, then we consider a to reach c.

Conclusion

(1) No matter what the initial state is, after sufficiently many probability transitions the chain reaches the steady state π.

(2) When P is raised to a sufficiently high power, all rows of P^n become equal; each row converges to π. This follows because π_0 P^n converges to the same π for every initial probability vector π_0 (whose components sum to 1), which can only happen if every row of P^n approaches π.

(3) By the stationarity of the Markov chain, the probability of ending up in state j, starting from any state, converges to the fixed value π(j): lim_{n→∞} (P^n)_{ij} = π(j).

(4) Let π = (π(1), π(2), …, π(j), …). Then π is the stationary distribution of the Markov chain, and it is the unique nonnegative solution of the equation π = πP. Combined with the conclusions above, this is evident.
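Conclusion (4) also suggests a direct way to compute π: since π = πP, the stationary distribution is a left eigenvector of P for eigenvalue 1, so it can be found without repeated multiplication. A sketch, using an illustrative two-state matrix of my own choosing:

```python
import numpy as np

# An illustrative transition matrix (any non-periodic, connected chain works).
P = np.array([
    [0.7, 0.3],
    [0.2, 0.8],
])

# pi = pi P  is equivalent to  P^T pi^T = pi^T: take the eigenvector of P^T
# for eigenvalue 1 and normalize its components to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()

print(pi)  # the stationary distribution
```

For this matrix the balance condition 0.3·π(1) = 0.2·π(2) gives π = (0.4, 0.6).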

We will use a simpler example to illustrate the physical meaning of this theorem. In the process of urbanization, suppose a rural person becomes urban with probability 0.5 per generation, and an urban person becomes rural with probability 0.1.

                 Rural people   City people
  Rural people       0.5            0.5
  City people        0.1            0.9

Suppose that at first there are 100 rural people and 0 urban people. The numbers in each generation are then as follows.

  Generation   Rural   City   Rural→City   City→Rural
  1            100     0      50           0
  2            50      50     25           5
  3            30      70     15           7
  4            22      78     11           8
  5            19      81     10           8
  6            17      83     8            8
  7            17      83     8            8

It can be seen that the Markov chain reaches its steady state when the rate at which rural people become urban equals the rate at which urban people become rural. For the transition matrix P above, the stationary distribution is about 17% rural and 83% urban. If we could obtain the actual urbanization transition matrix P for China, we could compute the eventual urbanization rate (assuming P does not change); and given the current proportion of urbanized people, we could estimate how many more generations urbanization will continue.
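The generation table above can be reproduced (up to its rounding) with a short simulation. The long-run shares come out to 1/6 ≈ 16.7% rural and 5/6 ≈ 83.3% urban, which the text rounds to 17% and 83%. A sketch:

```python
import numpy as np

P = np.array([
    [0.5, 0.5],   # rural: stay rural 0.5, move to city 0.5
    [0.1, 0.9],   # city:  move to rural 0.1, stay city 0.9
])

pop = np.array([100.0, 0.0])  # generation 1: 100 rural, 0 urban
for gen in range(1, 11):
    rural_to_city = pop[0] * 0.5
    city_to_rural = pop[1] * 0.1
    print(f"gen {gen}: rural={pop[0]:.1f}, city={pop[1]:.1f}, "
          f"rural->city={rural_to_city:.1f}, city->rural={city_to_rural:.1f}")
    pop = pop @ P

# In the steady state the two flows balance: 0.5 * rural == 0.1 * city,
# giving rural : city = 1 : 5, i.e. 100/6 and 500/6 people.
```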
