Markov process:
Depending on whether the time and state parameters are continuous or discrete, Markov processes fall into three kinds: 1. discrete time and discrete state, called a Markov chain; 2. continuous time and continuous state, called a (continuous) Markov process; 3. continuous time and discrete state, called a continuous-time Markov chain.
The defining property of a Markov process is this: given that the state at time t0 is known, the state at a later time t (t > t0) depends only on the state at t0, and not on the history of the process before t0.
One point is worth stressing: the formula P(n) = P(1)^n computes the n-step transition probability *matrix* via matrix multiplication; it is not an entrywise statement about the n-step probability from one state to another. In other words, in general p_ij(n) != p_ij(1)^n.
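The distinction can be checked numerically. Below is a minimal sketch in Python/NumPy (the matrix used is a made-up 2-state chain, chosen only for illustration), contrasting the matrix power with the entrywise power:

```python
import numpy as np

# Hypothetical 2-state transition matrix, used only for illustration.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Correct: the 2-step matrix is the matrix product P @ P.
P2_matrix = np.linalg.matrix_power(P, 2)

# Incorrect: raising each entry to the 2nd power is NOT the 2-step matrix.
P2_elementwise = P ** 2

# p_01(2) = 0.9*0.1 + 0.1*0.5 = 0.14, summing over the intermediate state
print(P2_matrix[0, 1])
# 0.1^2 = 0.01 ignores the intermediate state and is not a 2-step probability
print(P2_elementwise[0, 1])
```

The matrix product sums over all intermediate states, which is exactly what the entrywise power leaves out.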
Markov chain:
Example: a particle performs a random walk on the five points 1, 2, 3, 4, 5, moving only at integer times n. At points 2, 3, 4 it moves left with probability 1/3 and right with probability 2/3; at point 5 it stays put with probability 1; at point 1 it moves to 2 with probability 1.
One-step transition probability matrix: p_ij(1) is the probability of reaching state j from state i in one step. For this example the matrix is as follows:
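The one-step matrix can be written out directly from the walk's rules. A minimal sketch in Python/NumPy (the post mentions MATLAB; NumPy is substituted here), with rows and columns indexed 0..4 for states 1..5:

```python
import numpy as np

# One-step transition matrix for the random walk on {1, 2, 3, 4, 5}.
P = np.array([
    [0,   1,   0,   0,   0  ],  # state 1: moves to 2 with probability 1
    [1/3, 0,   2/3, 0,   0  ],  # state 2: left 1/3, right 2/3
    [0,   1/3, 0,   2/3, 0  ],  # state 3: left 1/3, right 2/3
    [0,   0,   1/3, 0,   2/3],  # state 4: left 1/3, right 2/3
    [0,   0,   0,   0,   1  ],  # state 5: stays with probability 1 (absorbing)
])

# Sanity check: each row sums to 1, as required of a stochastic matrix.
print(P.sum(axis=1))
```

Each row lists where one step can lead from that state, so the row sums must equal 1.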
N-step transition matrix: p_ij(n) is the probability of reaching state j from state i in n steps; in matrix form, P(n) = P(1)^n.
Theoretical basis: the Chapman-Kolmogorov equation (C-K equation).
By the C-K equation, the two-step transition matrix of the example above is A*A, computed by matrix multiplication (verified in MATLAB).
Similarly, the n-step transition probability matrix satisfies
P(n) = P(n-1) P(1) = P(n-2) P(1) P(1) = ... = P(1)^n