Knowledge of mathematical statistics

1. Independence implies uncorrelatedness; the converse does not hold.
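A minimal numpy sketch of this point (the choice X ~ N(0, 1), Y = X^2 is my own example, not from the notes): the two variables are clearly dependent, yet their covariance is essentially zero, so uncorrelated does not imply independent (and, as item 3 notes, covariance only picks up linear relationships).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x ** 2                              # Y is a deterministic function of X -> dependent

# Covariance is ~0 because E(X^3) = 0 for a symmetric distribution
print(np.cov(x, y, ddof=0)[0, 1])       # close to 0 -> uncorrelated

# Yet knowing X pins Y down exactly; e.g. P(Y > 1 | |X| > 1) = 1,
# while the unconditional P(Y > 1) is only about 0.32
print((y > 1).mean())                   # ~0.32
print((y[np.abs(x) > 1] > 1).mean())    # 1.0
```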
2. Covariance: Cov(X, Y) = E(XY) - E(X)E(Y) = E[(X - E(X))(Y - E(Y))]
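A quick numerical check of the two equivalent forms of the covariance (a numpy sketch; the distributions below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(2.0, 500_000)
y = 0.5 * x + rng.normal(0.0, 1.0, 500_000)        # correlated with x by construction

cov1 = (x * y).mean() - x.mean() * y.mean()         # E(XY) - E(X)E(Y)
cov2 = ((x - x.mean()) * (y - y.mean())).mean()     # E[(X - E(X))(Y - E(Y))]
print(cov1, cov2, np.cov(x, y, ddof=0)[0, 1])       # all three agree
```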
3. Covariance only captures the linear relationship between variables (a nonlinear dependence, as in the sketch under item 1, can have zero covariance).
4. Correlation coefficient: Corr(X, Y) = Cov(X, Y) / sqrt(Var(X) Var(Y))
5. |Corr(X, Y)| <= 1, with equality if and only if X and Y have an exact linear relationship (note: the relationship must be linear).
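Items 4 and 5 in one numpy sketch (the data and coefficients are made up): the correlation coefficient divides the covariance by the square root of the product of the variances, it always lies in [-1, 1], and it reaches -1 or 1 exactly when Y is an exact linear function of X.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)

def corr(a, b):
    # Corr(A, B) = Cov(A, B) / sqrt(Var(A) * Var(B))
    return np.cov(a, b, ddof=0)[0, 1] / np.sqrt(a.var() * b.var())

y_noisy = 2 * x + rng.normal(size=x.size)   # linear trend plus noise: |corr| < 1
y_exact = -3 * x + 7                        # exact linear relation: corr = -1
print(corr(x, y_noisy))                     # ~0.89
print(corr(x, y_exact))                     # -1.0 up to floating point
```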
6. When (X, Y) follows a two-dimensional (bivariate) normal distribution, Cov(X, Y) = 0 is equivalent to independence.
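A simulation sketch of this equivalence (numpy; the covariance matrix and thresholds are my own choices): draw (X, Y) from a bivariate normal with zero covariance and check that joint probabilities factor into products of marginal probabilities, as they must for independent variables.

```python
import numpy as np

rng = np.random.default_rng(3)
mean = [0.0, 0.0]
cov = [[1.0, 0.0],      # zero off-diagonal entries: Cov(X, Y) = 0
       [0.0, 4.0]]
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

# For independent variables, P(X > a, Y > b) = P(X > a) * P(Y > b)
for a, b in [(0.0, 1.0), (1.0, -1.0)]:
    joint = ((x > a) & (y > b)).mean()
    product = (x > a).mean() * (y > b).mean()
    print(round(joint, 4), round(product, 4))   # the two values agree
```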
7. Law of large numbers: when the sample size is large enough, the sample mean converges to the population mean (equivalently, the observed frequency of an event converges to its probability).
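A minimal sketch of the law of large numbers (numpy; the event probability 0.3 is arbitrary): as the number of trials grows, the observed frequency settles near the true probability.

```python
import numpy as np

rng = np.random.default_rng(4)
p = 0.3                                    # true probability of the event
trials = rng.random(1_000_000) < p         # Bernoulli(p) sample

running_freq = np.cumsum(trials) / np.arange(1, trials.size + 1)
for n in (10, 100, 10_000, 1_000_000):
    print(n, running_freq[n - 1])          # frequency approaches 0.3 as n grows
```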
8. Central limit theorem: the sum of a large number of independent, identically distributed variables is approximately normally distributed, with mean nE(X) and variance nVar(X).
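A sketch of the central limit theorem (numpy; the exponential summands and the sizes are my own choices): sums of n i.i.d. variables have mean n*E(X) and variance n*Var(X), and after standardization their distribution is close to the standard normal.

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 500, 20_000                      # terms per sum, number of independent sums
# X ~ Exponential with scale 2: E(X) = 2, Var(X) = 4
sums = rng.exponential(2.0, size=(reps, n)).sum(axis=1)

print(sums.mean(), n * 2)                  # ~ n * E(X)
print(sums.var(), n * 4)                   # ~ n * Var(X)

# Standardize and compare a tail probability with P(Z > 1) = 0.1587 for Z ~ N(0, 1)
z = (sums - n * 2) / np.sqrt(n * 4)
print((z > 1).mean())                      # ~0.16
```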
9. Sample moments: the first-order raw (origin) moment of a random variable X is E(X), and its second central moment is Var(X). Note: the second central moment of the sample is M2 = S^2 (n - 1)/n, where S^2 denotes the sample variance.
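The relation M2 = S^2 (n - 1)/n checked with numpy (the data are arbitrary): np.var with ddof=0 gives the sample second central moment and ddof=1 gives the sample variance.

```python
import numpy as np

rng = np.random.default_rng(6)
data = rng.normal(10.0, 3.0, size=25)
n = data.size

m2 = np.var(data, ddof=0)        # second central moment of the sample (divide by n)
s2 = np.var(data, ddof=1)        # sample variance (divide by n - 1)
print(m2, s2 * (n - 1) / n)      # identical up to floating point
```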
10. Point estimation: estimating the distribution parameters with statistics; methods include moment estimation, maximum likelihood estimation, and Bayesian estimation.
11. Moment estimation (method of moments): estimate the population parameters by equating population moments with the corresponding sample moments (note: when estimating a variance, use the corrected sample second central moment, i.e. the sample variance). The parameters need not be the mean and variance themselves, e.g. the endpoints of a uniform distribution (see the sketch after the PS below).
PS: prefer low-order moments to high-order ones. For example, for the Poisson distribution the mean and the variance are the same parameter; in that case use the first-order raw moment (the sample mean).
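A sketch of item 11 and the PS (numpy; the true parameter values are made up): for Uniform(a, b) the two parameters are recovered by solving the mean and variance equations, while for Poisson the single parameter lambda is estimated with the first-order raw moment (the sample mean) rather than a higher-order moment.

```python
import numpy as np

rng = np.random.default_rng(7)

# Uniform(a, b): E(X) = (a + b) / 2, Var(X) = (b - a)^2 / 12
a_true, b_true = 2.0, 9.0
x = rng.uniform(a_true, b_true, 10_000)
mean, m2 = x.mean(), x.var(ddof=0)            # first raw moment, second central moment
half_width = np.sqrt(3.0 * m2)                # solve the two moment equations for a and b
print(mean - half_width, mean + half_width)   # close to 2.0 and 9.0

# Poisson(lam): mean and variance are the same parameter, so use the first raw moment
lam_true = 4.5
y = rng.poisson(lam_true, 10_000)
print(y.mean())                               # moment estimate of lam, close to 4.5
```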
12. Maximum likelihood estimation: estimate the distribution parameters by the parameter values that make the observed sample most likely; in many cases the result coincides with the moment estimate. (Maximum likelihood estimation requires knowing the parametric form of the distribution.)
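A closed-form sketch of maximum likelihood estimation (numpy; the distributions and parameter values are my own choices): for the exponential distribution the MLE 1/mean(x) coincides with the moment estimate, whereas for Uniform(0, theta) the MLE max(x) differs from the moment estimate 2*mean(x), so the two methods agree in many but not all cases.

```python
import numpy as np

rng = np.random.default_rng(8)

# Exponential(lam): the likelihood is maximized at lam = 1 / mean(x),
# which equals the first-moment estimate.
x = rng.exponential(scale=1 / 2.5, size=10_000)     # true lam = 2.5
print(1 / x.mean())                                  # MLE = moment estimate, ~2.5

# Uniform(0, theta): the likelihood is maximized at theta = max(x),
# while the moment estimate is 2 * mean(x); here the two differ.
u = rng.uniform(0.0, 6.0, size=10_000)               # true theta = 6.0
print(u.max(), 2 * u.mean())                         # MLE vs moment estimate, both ~6.0
```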