In mathematical statistics, the population X under study is generally not fully known. Even if we know from prior experience and data what type of distribution X follows, its numerical characteristics (such as the mathematical expectation, variance, and moments) are unknown. These unknown numerical characteristics, together with the unknown quantities contained in the distribution of the population X, are called unknown parameters. To estimate the true value of an unknown parameter, or an interval containing it, we draw a sample from the population X and construct statistics from the sample to estimate the parameter values or their ranges. This method is called parameter estimation. Point estimation estimates the true value of a population parameter by means of a statistic (called an estimator) constructed from the sample. For example, by Khinchin's law of large numbers, if $X_1, X_2, \ldots$ is a sequence of independent, identically distributed random variables whose expectation exists, written $E(X_i) = \mu$ $(i = 1, 2, \ldots)$, then for any $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mu \right| < \varepsilon \right) = 1 .$$

Hence the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ can be used as an approximation of $\mu$, and as $n$ increases, the error between $\bar{X}$ and $\mu$ tends to become smaller.
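As a quick illustration, the sketch below (not from the original text; the exponential population with mean $\mu = 2$ and the sample sizes are assumptions chosen purely for demonstration) computes the sample mean for increasingly large samples and shows the error shrinking.

```python
# A minimal sketch of Khinchin's law of large numbers: the sample mean x_bar
# approaches the true expectation mu as the sample size n grows.
# The exponential population with mean 2 and the sample sizes are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0  # true expectation of the assumed exponential population

for n in (10, 100, 10_000, 1_000_000):
    sample = rng.exponential(scale=mu, size=n)
    x_bar = sample.mean()  # the point estimate of mu from this sample
    print(f"n={n:>9}  x_bar={x_bar:.4f}  error={abs(x_bar - mu):.4f}")
```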
The statistic constructed from the sample is called the point estimator, and the value it takes on a particular sample is called the point estimate; estimators and estimates are collectively referred to as estimates. Consequently, different samples generally yield different estimates.
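For instance, the following minimal sketch (the normal population N(5, 2^2) and the sample size n = 50 are assumptions for illustration only) applies the same estimator, the sample mean, to several independent samples and obtains a slightly different estimate each time.

```python
# A minimal sketch: one point estimator (the sample mean), several samples,
# several different point estimates. The N(5, 2^2) population and n = 50
# are illustrative assumptions, not values from the original text.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 5.0, 2.0, 50

for trial in range(3):
    sample = rng.normal(loc=mu, scale=sigma, size=n)
    print(f"sample {trial + 1}: estimate of mu = {sample.mean():.4f}")
```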
Whether a statistic constructed from the sample can reasonably be used as an estimate of an unknown parameter requires some justification and theoretical basis.
Here are two common ways to construct statistics:
(1) Moment estimation method
(2) Maximum likelihood estimation method
Moment estimation method
The idea of the moment estimation method is to approximate the population moments of each order by the corresponding sample moments, and then solve the resulting equations for the unknown parameters.
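As a sketch of this idea (the Uniform(a, b) population and the concrete numbers below are assumptions for illustration), the first two population moments of U(a, b) are E[X] = (a + b)/2 and Var(X) = (b - a)^2/12; equating them to the sample mean and the second central sample moment and solving gives moment estimates of a and b.

```python
# A minimal sketch of the moment estimation method (illustrative assumptions:
# a Uniform(a, b) population with a = 1, b = 4 and a sample of size 10_000).
# Match the first two population moments to the sample moments, then solve
# the two equations for the unknown parameters a and b.
import numpy as np

rng = np.random.default_rng(2)
a_true, b_true = 1.0, 4.0
sample = rng.uniform(a_true, b_true, size=10_000)

x_bar = sample.mean()                # first sample moment  ~ E[X] = (a + b) / 2
m2 = np.mean((sample - x_bar) ** 2)  # second central sample moment ~ Var(X) = (b - a)^2 / 12

# Solving the moment equations:  a = x_bar - sqrt(3 * m2),  b = x_bar + sqrt(3 * m2)
a_hat = x_bar - np.sqrt(3.0 * m2)
b_hat = x_bar + np.sqrt(3.0 * m2)
print(f"a_hat = {a_hat:.3f}  (true a = {a_true})")
print(f"b_hat = {b_hat:.3f}  (true b = {b_true})")
```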