Having introduced one estimate of statistical parameters, the following describes another, the maximum likelihood estimate, and compares the two.
For a sample $X_1, X_2, \dots, X_n$, the observations are mutually independent. Using the relationship between the joint distribution and the marginal distributions together with the multiplication rule, the joint distribution of the sample is:

Discrete case:
$$P\{X_1 = x_1, \dots, X_n = x_n\} = \prod_{i=1}^{n} P\{X_i = x_i\}$$

Continuous case:
$$f(x_1, \dots, x_n) = \prod_{i=1}^{n} f(x_i)$$

Since each $X_i$ has the same distribution as the population, which depends on an unknown parameter $\theta$, the continuous case can be written as:
$$L(\theta) = L(x_1, \dots, x_n; \theta) = \prod_{i=1}^{n} f(x_i; \theta)$$
The expression above contains the unknown parameter $\theta$. Regarding $L$ as a function of the unknown parameter, with the sample values held fixed, we obtain $L(\theta)$, which is called the likelihood function of the sample.
The parameter value $\hat{\theta}$ that maximizes the likelihood function is called the maximum likelihood estimate of the parameter based on the sample.
As for how to find it, this is simply an extremum problem for a function of the parameters.
Since $L$ is a product of many factors, each positive and typically differentiable, we can generally use the following method: take the natural logarithm of $L$, then take partial derivatives with respect to the parameters and set them to 0; solving gives the parameter estimates:
$$\frac{\partial \ln L}{\partial \theta_j} = 0, \quad j = 1, \dots, k$$
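The log-likelihood procedure above can be sketched numerically. As a minimal illustration (the distribution and sample values here are assumptions, not from the text), take an exponential population with density $f(x;\lambda)=\lambda e^{-\lambda x}$: the log-likelihood is $\ln L(\lambda) = n\ln\lambda - \lambda\sum x_i$, and setting its derivative to 0 gives $\hat{\lambda} = n/\sum x_i = 1/\bar{x}$.

```python
import math

def log_likelihood(lam, xs):
    # ln L(λ) = n·ln(λ) − λ·Σx_i for an Exp(λ) sample (assumed illustration)
    return len(xs) * math.log(lam) - lam * sum(xs)

xs = [0.5, 1.2, 0.3, 2.0, 0.8]      # hypothetical sample values
lam_hat = len(xs) / sum(xs)          # solution of dlnL/dλ = n/λ − Σx_i = 0

# sanity check: the closed-form estimate beats nearby candidate values of λ
candidates = [0.5, 0.9 * lam_hat, lam_hat, 1.1 * lam_hat, 2.0]
best = max(candidates, key=lambda lam: log_likelihood(lam, xs))
print(lam_hat, best)  # best == lam_hat
```

The logarithm turns the product over the sample into a sum, which is why differentiating $\ln L$ is so much easier than differentiating $L$ itself.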
Example
EX1: Given the population's distribution, perform maximum likelihood estimation using a sample.
Setting the derivative of $\ln L$ to 0 and solving, the maximum likelihood estimate of the parameter is obtained.
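Since the specific population in EX1 is not given here, a hedged worked illustration assuming a normal population $N(\mu, \sigma^2)$ with both parameters unknown shows the same procedure:
$$\ln L(\mu, \sigma^2) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln\sigma^2 - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2$$
Setting $\partial \ln L / \partial \mu = 0$ and $\partial \ln L / \partial \sigma^2 = 0$ and solving yields:
$$\hat{\mu} = \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2$$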