A random sample is known to follow a certain probability distribution, but the values of its parameters are unknown. Parameter estimation observes the outcomes of several experiments and uses those results to derive approximate values of the parameters.
Maximum likelihood estimation is based on the idea of choosing the parameter value that makes the observed sample most probable: since we would not expect to have drawn a low-probability sample, we simply take that value as the estimate of the true parameter.
Of course, maximum likelihood estimation only gives a rough point estimate; interval estimation is needed to know its error.
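As a rough illustration (not from the original answer), the sketch below assumes a Bernoulli(p) sample: the sample proportion is the maximum likelihood point estimate, and an approximate 95% Wald interval quantifies its error. The distribution choice and the interval formula are illustrative assumptions.

```python
# Minimal sketch: MLE point estimate plus an interval estimate of its error,
# assuming Bernoulli(p) data (an illustrative assumption).
import numpy as np

rng = np.random.default_rng(0)
sample = rng.binomial(1, 0.3, size=200)   # simulated data with true p = 0.3

# Maximum likelihood estimate: the value of p that makes the observed sample
# most probable; for Bernoulli data this is the sample proportion.
p_hat = sample.mean()

# Interval estimate: an approximate 95% (Wald) confidence interval that
# quantifies the error of the point estimate.
se = np.sqrt(p_hat * (1 - p_hat) / len(sample))
lower, upper = p_hat - 1.96 * se, p_hat + 1.96 * se

print(f"MLE p_hat = {p_hat:.3f}, 95% CI = ({lower:.3f}, {upper:.3f})")
```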
Extended data:
According to the extreme value principle, if a maximum exists, the stationary point obtained from this system of equations (the likelihood equations, found by setting the partial derivatives of the log-likelihood with respect to the parameters to zero) is the maximum point, and the maximum likelihood estimates of the parameters follow from it. Maximum likelihood estimation generally falls into this situation, so the estimates can be obtained directly by the steps above.
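For concreteness, here is a minimal sketch assuming an Exponential(lambda) model (the model and the simulated data are illustrative assumptions, not part of the original text): setting the derivative of the log-likelihood to zero gives the stationary point lambda_hat = 1 / mean(x), which a numerical maximization confirms.

```python
# Minimal sketch of the "stationary point of the likelihood equations" step,
# assuming an Exponential(lambda) model (illustrative assumption).
# Log-likelihood: n*log(lambda) - lambda*sum(x); setting its derivative to zero
# gives the stationary point lambda_hat = n / sum(x) = 1 / mean(x).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
x = rng.exponential(scale=1 / 2.0, size=500)   # simulated data with true lambda = 2

# Closed-form solution of the likelihood equation.
lam_closed = 1 / x.mean()

# Numerical check: maximize the log-likelihood (minimize its negative).
def neg_loglik(lam):
    return -(len(x) * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_loglik, bounds=(1e-6, 100), method="bounded")

print(f"stationary-point MLE = {lam_closed:.3f}, numerical MLE = {res.x:.3f}")
```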
When constructing moment estimators of parameters, the method cannot be applied to distributions whose population moments do not exist, such as the Cauchy distribution. On the other hand, it uses only a few numerical characteristics of the population and does not use the form of the population distribution itself.
Therefore the moment estimator captures only part of the information in the population, so it often reflects the characteristics of the population distribution poorly, and its good properties can be guaranteed only when the sample size n is large. In theory, therefore, moment estimation is suited to large samples, as sketched below.
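The following sketch assumes a Gamma(k, theta) population (an illustrative choice, not from the original answer). It shows how matching the first two sample moments to the population moments yields moment estimates, and why the method breaks down for the Cauchy distribution, whose moments do not exist.

```python
# Minimal sketch of moment estimation, assuming a Gamma(k, theta) model
# (illustrative assumption). Matching the population moments
#   E[X] = k*theta,  Var[X] = k*theta**2
# to the sample mean and variance gives theta_hat = var/mean, k_hat = mean**2/var.
import numpy as np

rng = np.random.default_rng(2)
x = rng.gamma(shape=3.0, scale=2.0, size=1000)   # true k = 3, theta = 2

m, v = x.mean(), x.var(ddof=1)
theta_hat = v / m
k_hat = m * m / v
print(f"moment estimates: k_hat = {k_hat:.2f}, theta_hat = {theta_hat:.2f}")

# For a Cauchy distribution the population moments do not exist, so the
# sample mean does not stabilize and moment matching breaks down
# (informal illustration: means over different subsets disagree wildly).
c = rng.standard_cauchy(size=1000)
print(f"Cauchy sample means (unstable): {c[:200].mean():.2f} vs {c.mean():.2f}")
```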