Consistent Estimation | Estimation of Population Parameters
Probability Theory Mastering

# Consistent Estimation

In statistics, a consistent estimator is an estimator that converges to the true value of the parameter as the sample size increases, meaning that the estimate becomes more and more accurate as more data is collected. Formally, an estimator $\hat\theta_n$ of a parameter $\theta$ is consistent if it converges to $\theta$ in probability:

$$\forall \varepsilon > 0: \quad \lim_{n \to \infty} P\big(|\hat\theta_n - \theta| > \varepsilon\big) = 0.$$

This definition may seem rather complicated, and in practice it is not always easy to verify the consistency of an estimator directly from it. That is why we will introduce a simpler applied criterion of consistency:

$$\lim_{n \to \infty} E[\hat\theta_n] = \theta \quad \text{and} \quad \lim_{n \to \infty} \operatorname{Var}(\hat\theta_n) = 0.$$

Thus, if our estimator is asymptotically unbiased (or simply unbiased) and its variance tends to zero as the sample size grows, then the estimator is consistent.
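This criterion is easy to check empirically. The sketch below (an illustrative addition, not part of the original text) draws many repeated samples of increasing size $n$ from a $\mathrm{Uniform}(0, 1)$ distribution and measures the variance of the sample-mean estimator across those samples; by the criterion, that variance should shrink toward zero as $n$ grows:

```python
import random
import statistics

def estimator_variance(n, trials=2000, seed=0):
    """Variance of the sample-mean estimator over repeated samples
    of size n drawn from Uniform(0, 1) (true mean = 0.5)."""
    rng = random.Random(seed)
    means = [statistics.fmean(rng.random() for _ in range(n))
             for _ in range(trials)]
    return statistics.pvariance(means)

# Variance of the estimator for growing sample sizes.
variances = [estimator_variance(n) for n in (10, 100, 1000)]

# Theory predicts Var(sample mean) = sigma^2 / n = (1/12) / n here,
# so a 10x larger sample should cut the variance roughly 10-fold.
assert variances[0] > variances[1] > variances[2]
print([round(v, 6) for v in variances])
```

The printed values decrease roughly tenfold per step, matching the $\sigma^2 / n$ prediction and illustrating the second condition of the criterion.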

Let's show that the sample mean and the adjusted (unbiased) sample variance are consistent estimators.

## Sample mean estimator

Consistency of the sample mean follows directly from the law of large numbers: the more terms we include when computing the mean, the closer the result tends to the mathematical expectation.
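The same conclusion follows from the applied criterion; here is a short worked check (an added step, using the standard i.i.d. assumptions). For a sample $X_1, \dots, X_n$ of independent observations with $E[X_i] = \mu$ and $\operatorname{Var}(X_i) = \sigma^2$:

$$E[\bar{X}_n] = \frac{1}{n}\sum_{i=1}^{n} E[X_i] = \mu, \qquad \operatorname{Var}(\bar{X}_n) = \frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(X_i) = \frac{\sigma^2}{n} \xrightarrow[n \to \infty]{} 0.$$

So the sample mean is unbiased and its variance vanishes as $n$ grows, hence by the criterion it is a consistent estimator of $\mu$.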