Law of Large Numbers | The Limit Theorems of Probability Theory
Probability Theory Mastering


# Law of Large Numbers

The Law of Large Numbers is a fundamental concept in probability theory and statistics that states that as the sample size increases, the average of the observed values will converge to the expected value or mean of the underlying distribution.

## Mathematical definition of the law

Let's provide some explanations of this law:

1. The first condition is that we have a sequence of random variables that are independent and identically distributed (i.i.d.). This means the variables are of the same type and follow the same distribution. For instance, N(1, 2) and N(1, 3) (normal distributions with mean 1 and variances 2 and 3) are not identically distributed: although both are Gaussian, their variances differ;
2. The second condition is that these values must have a finite expectation. This means the series or integral must converge to a specific number, as discussed in Chapter 2 of the first section;
3. The law of large numbers states that if the first two conditions are met, then as we take more variables, the average of these variables gets closer to the real expectation.
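The three conditions above can be written compactly. A standard statement of the (weak) law of large numbers is:

```latex
% Weak law of large numbers:
% X_1, X_2, \dots are i.i.d. with finite expectation \mathbb{E}[X_1].
\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i
\;\xrightarrow{\;p\;}\; \mathbb{E}[X_1]
\quad \text{as } n \to \infty
```

Here $\bar{X}_n$ is the average of the first $n$ variables, and the arrow marked with $p$ is the convergence discussed in the note below.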

**Note**

In the law's statement, you may see the letter 'p' above the arrow. It denotes convergence in probability, which describes how a sequence of random variables approaches a limit. For a practical understanding of the law of large numbers, you don't need the details of this mode of convergence, so we won't cover it in this course.

## The visualization of the law

To check whether the law of large numbers holds, run the code examples several times and observe whether the convergence stays consistent when the variables are summed in different orders. If the law holds, the average consistently tends toward the true expectation, regardless of the order in which the variables are summed.
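The original interactive code is not reproduced on this page, but the experiment can be sketched in a few lines of NumPy. The choice of the standard normal distribution (true mean 0) is an assumption for illustration; any distribution with a finite expectation behaves the same way:

```python
import numpy as np

rng = np.random.default_rng()

# Draw i.i.d. samples from N(0, 1); the true expectation is 0.
n = 10_000
samples = rng.standard_normal(n)

# Running average after 1, 2, ..., n terms.
running_mean = np.cumsum(samples) / np.arange(1, n + 1)

# The tail of this array hovers close to the true mean 0,
# and rerunning the cell gives a similar picture every time.
print(running_mean[-1])
```

Plotting `running_mean` against the number of terms (for example with `matplotlib.pyplot.plot`) gives the kind of convergence plot discussed in this section.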

As the plot of the running average shows, the more terms we take, the closer the estimate gets to the true expectation: the variance of the estimate decreases.

Let's now look at the data that was obtained from the Cauchy distribution and see if the law of large numbers will work for this distribution (don't forget to run the code several times and look at the results):
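As above, the original snippet is not preserved here; a sketch of the same experiment with Cauchy samples (using NumPy's `standard_cauchy`, centered at 0) could look like this:

```python
import numpy as np

rng = np.random.default_rng()

# Draw i.i.d. samples from the standard Cauchy distribution,
# which has no finite expectation.
n = 10_000
samples = rng.standard_cauchy(n)

# The same running average as before.
running_mean = np.cumsum(samples) / np.arange(1, n + 1)

# Unlike the normal case, this sequence does not settle down:
# occasional huge samples keep jolting the average, and each
# rerun produces a visibly different path.
print(running_mean[-1])
```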

In the first case, the plot always converges to zero regardless of the order of summation. The fluctuations around zero decrease as more terms are added.

However, in the second case, the plot doesn't converge and behaves unpredictably. This is because the Cauchy distribution lacks a finite mathematical expectation, violating the second condition of the law of large numbers.
