Discuss the importance of consistency of estimators.
Significance: consistency means that as the number of samples n tends to infinity, the value of the estimator tends to the true value of the parameter being estimated.
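
In symbols, writing \(\hat{\theta}_n\) for the estimator computed from \(n\) samples and \(\theta\) for the true parameter (notation introduced here for illustration; it is not part of the original answer), consistency is convergence in probability:

\[
\lim_{n \to \infty} P\!\left( \left| \hat{\theta}_n - \theta \right| > \varepsilon \right) = 0 \quad \text{for every } \varepsilon > 0,
\qquad \text{abbreviated } \hat{\theta}_n \xrightarrow{P} \theta .
\]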

That is to say, if an estimator is consistent, then once the sample size is large enough its value is close to the true value of the parameter. Unbiasedness means that the mathematical expectation of the estimator equals the parameter being estimated, so an unbiased estimator has no systematic error. Consistency describes how an estimator behaves when the sample is large, and the estimators used in practice are usually consistent.
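
A standard concrete example is the sample mean: for independent observations with population mean \(\mu\), the sample mean \(\bar{X}_n\) satisfies \(E[\bar{X}_n] = \mu\) (unbiasedness) and, by the law of large numbers, converges to \(\mu\) (consistency). The short simulation below is only a sketch of this idea; the normal distribution and the parameter values are assumptions chosen for illustration.

import numpy as np

# Sketch: estimate the mean mu of a normal population with the sample mean.
rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0  # true mean and spread (illustrative values)

for n in (10, 100, 10_000, 1_000_000):
    sample = rng.normal(mu, sigma, size=n)
    estimate = sample.mean()  # the sample mean as an estimator of mu
    print(f"n = {n:>9}: estimate = {estimate:.4f}, error = {abs(estimate - mu):.4f}")

# As n grows, the estimate settles near the true value mu = 5.0,
# illustrating consistency.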

The mean squared error measures how far, on average, a collection of estimates lies from the single parameter being estimated. Imagine the following analogy: the "parameter" is the bull's-eye of a target, the "estimator" is the process of shooting arrows at the target, and each arrow is an "estimate" (one sample). A high mean squared error then means that the arrows land, on average, far from the bull's-eye, while a low mean squared error means that they land, on average, close to it.
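
In the same spirit (and as an illustration added here rather than part of the original answer), the mean squared error of an estimator \(\hat{\theta}\) of \(\theta\) is

\[
\operatorname{MSE}(\hat{\theta}) = E\!\left[ \left( \hat{\theta} - \theta \right)^2 \right],
\]

the expected squared distance between an arrow and the bull's-eye.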

The arrows may or may not be tightly clustered. For example, even if all the arrows hit the same point, if that point is far from the bull's-eye the mean squared error is still large. Note, however, that if the mean squared error is small, the arrows must be clustered (rather than scattered) and close to the target.
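
This observation is captured by the standard bias-variance decomposition of the mean squared error:

\[
\operatorname{MSE}(\hat{\theta}) = \operatorname{Var}(\hat{\theta}) + \bigl( \operatorname{Bias}(\hat{\theta}) \bigr)^2,
\qquad \operatorname{Bias}(\hat{\theta}) = E[\hat{\theta}] - \theta .
\]

Arrows that cluster tightly but far from the bull's-eye have small variance and large bias, so the mean squared error is still large; a small mean squared error forces both terms to be small, so the arrows must cluster and the cluster must sit near the bull's-eye.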