Bernoulli Process and Poisson Process

Source: Internet
Author: User

Two important classes of random processes are arrival processes and Markov processes.

I. Arrival process: An arrival process models the times between consecutive arrivals (the interarrival times) as independent random variables. If time is discrete, the interarrival times follow the geometric distribution and the process is a Bernoulli process. If time is continuous, the interarrival times follow the exponential distribution and the process is a Poisson process.

II. Markov process: A Markov process describes how data evolves over time, with the future probabilistically related to the history. For example, the future price of a stock clearly depends on its past prices. The Markov process, however, assumes a special type of dependence: the future depends only on the present state and is independent of the past given the present.

1. Bernoulli Process

(1). A Bernoulli process is a sequence of independent Bernoulli random variables X1, X2, ..., Xn, where for any i, P(Xi = 1) = P(the i-th trial succeeds) = p and P(Xi = 0) = P(the i-th trial fails) = 1 - p.

In an arrival process, one is often interested in the total number of arrivals within a given period, or in the time of the first arrival. The random variables associated with the Bernoulli process and their distributions:

①. Total number of arrivals within a period: the total number K of successes in n consecutive independent trials follows the binomial distribution with parameters n and p.

②. First arrival time: the number of trials T up to and including the first success in independent repeated Bernoulli trials follows the geometric distribution with parameter p.
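As a sanity check, both distributions can be simulated directly. The sketch below (plain Python; the function names are ours, chosen for illustration) runs many batches of n Bernoulli(p) trials and compares the sample means against n*p and 1/p:

```python
import random

random.seed(0)

def bernoulli_trials(n, p):
    """One run of n independent Bernoulli(p) trials: 1 = arrival, 0 = none."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def first_arrival_time(trials):
    """1-based index of the first success; None if no arrival occurred."""
    for i, x in enumerate(trials, start=1):
        if x == 1:
            return i
    return None

n, p, runs = 20, 0.25, 100_000
totals = []   # total arrivals per run -> Binomial(n, p)
firsts = []   # first arrival time    -> Geometric(p), truncated at n
for _ in range(runs):
    trials = bernoulli_trials(n, p)
    totals.append(sum(trials))
    t = first_arrival_time(trials)
    if t is not None:
        firsts.append(t)

mean_total = sum(totals) / runs           # should be close to n*p = 5
mean_first = sum(firsts) / len(firsts)    # close to 1/p = 4 (slightly less, since runs with no success in n trials are dropped)
```

With p = 0.25 the first-arrival mean comes out just under 4, since the simulation only observes first arrivals that happen within the n-trial window.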

(2). Properties of the Bernoulli process: independence and memorylessness

①. Fresh start: starting from any moment, the future can be modeled by the same Bernoulli process, independently of the past. That is, for any given time n, the sequence X(n+1), X(n+2), ... (the future of the process) is also a Bernoulli process, and it is independent of X1, X2, ..., Xn (the past of the process).

②. Memorylessness: for any given time n, let T be the time of the first success after time n. Then the random variable T - n follows the geometric distribution with parameter p and is independent of the random variables X1, ..., Xn.

(3). Interarrival times

A very important random variable associated with the Bernoulli process is the time of the k-th success (or arrival), denoted Y(k). The related variable is the k-th interarrival time, denoted T(k): the time elapsed between the (k-1)-th arrival and the k-th arrival. They satisfy:

T1 = Y1, T2 = Y2 - Y1, ..., T(k) = Y(k) - Y(k-1).

①. The first success time T1 follows the geometric distribution with parameter p. After the success at time T1, the future is again a fresh Bernoulli process. By the fresh-start property, the number of trials needed for the next success has the same distribution as T1 (the number of trials is the time required for a success), and the earlier trials (through T1) are independent of the future trials (beginning at T1 + 1). Therefore, the random variables T1, T2, ..., T(k) are independent and share the same geometric distribution.

②. An equivalent description of the Bernoulli process:

I. Start with a sequence of independent geometric random variables T1, T2, ... with common parameter p; these are the interarrival times.

II. The times of the successes (or arrivals) are T1, T1 + T2, ..., T1 + T2 + ... + T(k), ....
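The construction above can be sketched in a few lines: draw geometric interarrival times and accumulate their partial sums into arrival times (a minimal illustration; the `geometric` helper is ours):

```python
import random

random.seed(1)

def geometric(p):
    """Sample a Geometric(p) value: number of trials up to and including the first success."""
    t = 1
    while random.random() >= p:
        t += 1
    return t

p, k = 0.3, 5
interarrivals = [geometric(p) for _ in range(k)]   # T1, ..., Tk
arrival_times = []                                 # Y1, ..., Yk as partial sums
total = 0
for t in interarrivals:
    total += t
    arrival_times.append(total)
```

Each Y(i) is the running sum T1 + ... + Ti, so the arrival times are strictly increasing, exactly as in the description above.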

(4). The k-th arrival time

①. The time of the k-th arrival equals the sum of the first k interarrival times:

Y(k) = T1 + T2 + ... + T(k)

where T1, ..., T(k) are independent and identically distributed, each following the geometric distribution with parameter p.

②. The expectation and variance of Y(k) are

E[Y(k)] = E[T1] + ... + E[T(k)] = k/p;

Var[Y(k)] = Var[T1] + ... + Var[T(k)] = k(1 - p)/p^2;

③. Y(k) follows the negative binomial distribution, also known as the Pascal distribution.

(That is, the sum of k independent random variables, each following the geometric distribution with parameter p, is a new random variable that follows the Pascal distribution.)
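This can be checked numerically: summing k i.i.d. Geometric(p) draws should give a sample mean near k/p and a sample variance near k(1 - p)/p^2, the Pascal (negative binomial) moments. A rough Monte Carlo sketch (parameter values are arbitrary):

```python
import random

random.seed(2)

def geometric(p):
    """Sample a Geometric(p) value: trials up to and including the first success."""
    t = 1
    while random.random() >= p:
        t += 1
    return t

p, k, runs = 0.4, 3, 200_000
# Each sample is Y(k) = T1 + ... + Tk, a Pascal-distributed variable.
samples = [sum(geometric(p) for _ in range(k)) for _ in range(runs)]

mean = sum(samples) / runs
var = sum((s - mean) ** 2 for s in samples) / runs

# Theory: E[Y(k)] = k/p = 7.5, Var[Y(k)] = k(1-p)/p^2 = 11.25
```

The empirical moments land within Monte Carlo error of the theoretical values, consistent with the Pascal distribution claim.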


2. Poisson Process

The Poisson process is the arrival process on a continuous timeline. It is generally used when continuous time cannot be discretized in an application.

Over a given period, the Bernoulli process cannot count events adequately: each discrete slot records at most one arrival, so it cannot capture the number of events in an arbitrary interval. For example, two or more accidents may well occur within a single minute...

Consider a continuous arrival process, where any real number t may be an arrival time, and define

P(k, τ) = P(exactly k arrivals occur within an interval of length τ);

(1). Definition of the Poisson process

An arrival process is called a Poisson process with intensity λ if it has the following properties:

①. (Time homogeneity) The probability P(k, τ) of k arrivals is the same for every interval of the same length τ;

②. (Independence) The number of arrivals within a given interval is independent of the arrival history in other intervals;

③. (Small-interval probabilities) The probabilities P(k, τ) satisfy:

P(0, τ) = 1 - λτ + o(τ),

P(1, τ) = λτ + o1(τ),

P(k, τ) = ok(τ), k = 2, 3, ...

Thus, for a small τ, the probability of exactly one arrival is approximately λτ plus a negligible term; the probability of no arrival is roughly 1 - λτ; and the probability of two or more arrivals is negligible compared with P(1, τ).
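These approximations can be checked against the Poisson pmf for the number of arrivals (stated in the next section); λ and τ below are arbitrary illustrative values:

```python
import math

lam = 2.0   # intensity λ of the process

def p_k(k, tau):
    """Poisson pmf: probability of exactly k arrivals in an interval of length tau."""
    return math.exp(-lam * tau) * (lam * tau) ** k / math.factorial(k)

tau = 1e-3
p0 = p_k(0, tau)   # ≈ 1 - lam*tau, error of order (lam*tau)^2
p1 = p_k(1, tau)   # ≈ lam*tau,     error of order (lam*tau)^2
p2 = p_k(2, tau)   # o(tau): negligible next to p1
```

Shrinking τ further makes the error terms vanish faster than τ itself, which is exactly what the o(τ) notation asserts.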

(2). Random variables associated with the Poisson process and their distributions

①. Number of arrivals within an interval: the total number of arrivals N(τ) in an interval of length τ follows the Poisson distribution with parameter λτ, where λ is the intensity of the process:

E[N(τ)] = λτ, Var[N(τ)] = λτ

②. The first arrival time T follows the exponential distribution with parameter λ:

E[T] = 1/λ, Var[T] = 1/λ^2

Comparison of the Bernoulli process and the Poisson process (here, "arrival" means the occurrence of an event): arrival times are discrete for the Bernoulli process and continuous for the Poisson process; the arrival rate is p per trial versus λ per unit time; the number of arrivals in an interval is binomial versus Poisson; and the interarrival times are geometric versus exponential.


(3). Independence properties of the Poisson process

①. For any given time t > 0, the process after time t is also a Poisson process and is independent of the history of the process up to and including time t;

②. For any given time t, let T' be the first arrival time after t. Then the random variable T' - t follows the exponential distribution with parameter λ and is independent of the history of the process up to and including time t.

(4). Interarrival times

Consider a Poisson process starting at 0. An important random variable associated with this process is the time of the k-th success (or arrival), denoted Y(k). Closely related to Y(k) is the k-th interarrival time, denoted T(k). They satisfy:

T1 = Y1, T2 = Y2 - Y1, ..., T(k) = Y(k) - Y(k-1), ...

where T1, T2, T3, ... are independent of each other.

An equivalent description of the Poisson process:

①. Start with a sequence of exponential random variables T1, T2, ..., independent of each other with common parameter λ; these are the interarrival times.

②. The arrival times of the process are T1, T1 + T2, T1 + T2 + T3, .... Such a random process is a Poisson process.

(5). The k-th arrival time

①. The time of the k-th arrival equals the sum of the first k interarrival times,

Y(k) = T1 + T2 + ... + T(k),

where T1, T2, ..., T(k) are i.i.d., each following the exponential distribution with parameter λ.

②. The expectation and variance of Y(k) are

E[Y(k)] = E[T1] + ... + E[T(k)] = k/λ

Var[Y(k)] = Var[T1] + ... + Var[T(k)] = k/λ^2

③. The distribution of Y(k) is the Erlang distribution of order k.
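The Erlang claim is consistent with the moments above; a quick Monte Carlo sketch sums k exponential interarrival times and checks the sample mean and variance against k/λ and k/λ^2 (parameter values are illustrative):

```python
import random

random.seed(4)

lam, k, runs = 2.0, 4, 200_000
# Each sample is Y(k) = T1 + ... + Tk with Ti ~ Exponential(lam),
# i.e. an Erlang variable of order k.
samples = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(runs)]

mean = sum(samples) / runs
var = sum((y - mean) ** 2 for y in samples) / runs

# Theory: E[Y(k)] = k/lam = 2.0, Var[Y(k)] = k/lam**2 = 1.0
```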
