§3.5 Distribution of Functions of a Two-Dimensional Random Variable
This section is very important: problems on it appear on virtually every probability and statistics examination. In particular, the distributions of Z = X + Y, Z = max(X, Y), and Z = min(X, Y) treated in this section are representative.
I. Probability distribution of a function of a discrete random variable (X, Y)
Example 1: the distribution law of (X, Y) is known to be:

X \ Y     -1      1       2
 -1       5/20    2/20    6/20
  2       3/20    3/20    1/20
Find the distribution laws of Z1 = X + Y and Z2 = max(X, Y).
The values of Z1 and Z2 for each possible outcome are:

P            5/20      2/20     6/20     3/20     3/20     1/20
(X, Y)       (-1,-1)   (-1,1)   (-1,2)   (2,-1)   (2,1)    (2,2)
Z1 value     -2        0        1        1        3        4
Z2 value     -1        1        2        2        2        2

Merging equal values (Z1 takes the value 1 for both (-1, 2) and (2, -1); Z2 takes the value 2 for four outcomes) gives the distribution laws:

Z1      -2      0       1       3       4
P       5/20    2/20    9/20    3/20    1/20

Z2      -1      1       2
P       5/20    2/20    13/20
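The table-merging procedure above can be carried out mechanically. Below is a small Python sketch (the names `joint` and `law` are ours, not from the textbook) that recomputes the distribution laws of Z1 and Z2 from the joint table:

```python
from collections import defaultdict
from fractions import Fraction

# Joint distribution law of (X, Y) from Example 1.
joint = {
    (-1, -1): Fraction(5, 20), (-1, 1): Fraction(2, 20), (-1, 2): Fraction(6, 20),
    (2, -1):  Fraction(3, 20), (2, 1):  Fraction(3, 20), (2, 2):  Fraction(1, 20),
}

def law(g):
    """Distribution law of Z = g(X, Y): add up the probabilities of all
    outcomes (x, y) that map to the same value of Z."""
    dist = defaultdict(Fraction)
    for (x, y), p in joint.items():
        dist[g(x, y)] += p
    return dict(sorted(dist.items()))

z1 = law(lambda x, y: x + y)      # Z1 = X + Y
z2 = law(lambda x, y: max(x, y))  # Z2 = max(X, Y)
print(z1)  # {-2: 5/20, 0: 2/20, 1: 9/20, 3: 3/20, 4: 1/20}
print(z2)  # {-1: 5/20, 1: 2/20, 2: 13/20}
```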
II. Probability distribution of a function of a continuous random variable (X, Y)
1. Given (X, Y) with joint density f(x, y), find the probability density of Z = g(X, Y).
- First find the distribution function: F_Z(z) = P(Z ≤ z) = P{g(X, Y) ≤ z} = ∬_{g(x,y)≤z} f(x, y) dx dy.
- Then differentiate: f_Z(z) = F'_Z(z).
2. Given (X, Y) with joint density f(x, y), find the probability density of Z = X + Y.
Theorem 3.4: If the joint probability density of (X, Y) is f(x, y), then the probability density of Z = X + Y is
f_Z(z) = ∫_{-∞}^{+∞} f(x, z − x) dx,
or f_Z(z) = ∫_{-∞}^{+∞} f(z − y, y) dy.
Proof: see the textbook, pp. 85-86.
Corollary (convolution formula): if X and Y are independent with probability densities f_X(x) and f_Y(y) respectively, then the probability density of Z = X + Y is
f_Z(z) = ∫_{-∞}^{+∞} f_X(x) f_Y(z − x) dx,   (3.36)
or f_Z(z) = ∫_{-∞}^{+∞} f_X(z − y) f_Y(y) dy.   (3.37)
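The convolution formula (3.36) can be checked numerically on a case with a known answer. The sketch below uses X, Y independent and uniform on (0, 1) (our own illustrative choice, not an example from the textbook); then Z = X + Y has the triangular density f_Z(z) = z for 0 < z ≤ 1 and f_Z(z) = 2 − z for 1 < z < 2:

```python
import numpy as np

def f_X(t):
    """Density of the uniform distribution on (0, 1)."""
    return np.where((t > 0) & (t < 1), 1.0, 0.0)

f_Y = f_X  # X and Y are identically distributed here

def f_Z(z, n=400_001):
    """Riemann-sum approximation of (3.36): integral of f_X(x) f_Y(z - x) dx."""
    x = np.linspace(-1.0, 3.0, n)  # grid comfortably covering the support
    dx = x[1] - x[0]
    return float(np.sum(f_X(x) * f_Y(z - x)) * dx)

for z, exact in [(0.5, 0.5), (1.0, 1.0), (1.5, 0.5)]:
    print(f"f_Z({z}) ~ {f_Z(z):.4f}   exact: {exact}")
```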
Example 1 (p. 86): suppose X and Y are two independent random variables, both distributed as N(0, 1), i.e. each with density φ(x) = (1/√(2π)) e^(−x²/2), −∞ < x < +∞.
Find the probability density of Z = X + Y.
In general, if X and Y are independent with X ~ N(μ1, σ1²) and Y ~ N(μ2, σ2²), then Z = X + Y still follows a normal distribution, and Z ~ N(μ1 + μ2, σ1² + σ2²). This conclusion extends to the sum of n independent normal random variables: if Xi ~ N(μi, σi²) for i = 1, 2, …, n, and they are mutually independent, then their sum still follows a normal distribution, with
X1 + X2 + … + Xn ~ N(μ1 + μ2 + … + μn, σ1² + σ2² + … + σn²).
More generally, it can be proved that any linear combination of finitely many independent normal random variables still follows a normal distribution.
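The additivity of the normal distribution can be sanity-checked by simulation. The parameters N(1, 2²) and N(−3, 1.5²) below are arbitrary choices of ours: the theorem predicts Z = X + Y ~ N(1 + (−3), 2² + 1.5²) = N(−2, 6.25):

```python
import random
import statistics

# Monte Carlo check: sample X ~ N(1, 4) and Y ~ N(-3, 2.25) independently,
# and compare the sample mean and variance of Z = X + Y with the
# theoretical values -2 and 6.25.
random.seed(42)
n = 200_000
z = [random.gauss(1, 2) + random.gauss(-3, 1.5) for _ in range(n)]

print(statistics.fmean(z))     # close to -2
print(statistics.variance(z))  # close to 6.25
```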
Example 2: p. 87.
Example 3: p. 88.
Example 3 shows that the sum of n independent variables, each following the distribution considered there, still follows a distribution of the same family.
Example (first edition): let the random variables X and Y be independent, with X ~ f(x) = …, Y ~ f(y) = …; find the distribution density function of Z = X + Y.
Example (Example 3.17 in the textbook): X and Y are known to be independent, both distributed as N(…); find the probability density of Z = X + Y.
Example (first edition; Example 3.18 in the textbook): X and Y are independent, with probability densities f_X(x) = …, f_Y(y) = …; find the probability density of Z = X + Y.
3. Probability distributions of M = max(X, Y) and N = min(X, Y)
Theorem 3.6: if X and Y are independent and their distribution functions are F_X(x) and F_Y(y), then
- the distribution function of M = max(X, Y) is F_M(z) = F_X(z) · F_Y(z);
- the distribution function of N = min(X, Y) is F_N(z) = 1 − (1 − F_X(z)) · (1 − F_Y(z)).
The conclusion here is very important.
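Theorem 3.6 can be verified numerically on a concrete case of our choosing: X ~ Exp(1) and Y ~ Exp(2), independent. A well-known consequence of the min formula is that min(X, Y) ~ Exp(1 + 2), since 1 − (1 − F_X(z))(1 − F_Y(z)) = 1 − e^(−z) e^(−2z):

```python
import math
import random

# Theoretical CDFs from Theorem 3.6 for X ~ Exp(1), Y ~ Exp(2).
F_X = lambda z: 1 - math.exp(-1.0 * z)
F_Y = lambda z: 1 - math.exp(-2.0 * z)
F_M = lambda z: F_X(z) * F_Y(z)                  # CDF of M = max(X, Y)
F_N = lambda z: 1 - (1 - F_X(z)) * (1 - F_Y(z))  # CDF of N = min(X, Y)

# Empirical CDFs from simulation, evaluated at z = 0.4.
random.seed(0)
pairs = [(random.expovariate(1), random.expovariate(2)) for _ in range(100_000)]
z = 0.4
emp_M = sum(max(x, y) <= z for x, y in pairs) / len(pairs)
emp_N = sum(min(x, y) <= z for x, y in pairs) / len(pairs)
print(emp_M, F_M(z))  # both close to 0.18
print(emp_N, F_N(z))  # both close to 0.70
```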
This can be extended to a more general situation: if X1, X2, …, Xn are mutually independent with distribution functions F1(x), …, Fn(x), then the distribution function of max(X1, …, Xn) is F1(z) F2(z) … Fn(z), and that of min(X1, …, Xn) is 1 − (1 − F1(z))(1 − F2(z)) … (1 − Fn(z)).
Theorem 3.7: let X1, X2, …, Xn be mutually independent.
- If g1(X1), g2(X2), …, gn(Xn) are functions of X1, X2, …, Xn respectively, then g1(X1), g2(X2), …, gn(Xn) are also mutually independent.
- If Z is a function of k of the random variables, Xi1, Xi2, …, Xik, and Y is a function of the other l random variables, Xj1, Xj2, …, Xjl (the two groups having no variable in common), then Z and Y are independent of each other.