Useful Properties of the Gaussian Distribution | Additional Statements From The Probability Theory
# Useful Properties of the Gaussian Distribution

The Gaussian distribution (also called the normal distribution) is one of the most important distributions in probability theory and statistics. In this chapter we will look at some useful properties of this distribution, understand why it is so important, and see how it is applied in practice.

## Physical meaning of the Gaussian distribution

The Gaussian distribution describes a random variable that arises as the sum of many small independent factors.

For example, when weighing something, various factors like temperature, pressure, and measurement errors affect the result. Individually, these factors don't matter much, but together they have a significant impact. This is explained further in the chapter on the Central Limit Theorem.
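We can illustrate this with a small simulation (the weighing scenario, factor count, and disturbance range below are made-up values for illustration): the sum of many small independent disturbances looks approximately Gaussian even though each disturbance is uniform, not Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measurement: the true weight plus many small,
# independent disturbances (temperature, pressure, reading error, ...).
true_weight = 100.0
n_factors = 50          # number of small independent factors
n_measurements = 100_000

# Each factor is a small uniform disturbance in [-0.1, 0.1].
disturbances = rng.uniform(-0.1, 0.1, size=(n_measurements, n_factors))
measured = true_weight + disturbances.sum(axis=1)

# The aggregate of many small factors is approximately Gaussian:
# mean near the true weight, spread near sqrt(50 * 0.2**2 / 12) ≈ 0.408.
print(measured.mean())  # ≈ 100.0
print(measured.std())   # ≈ 0.408
```

Each individual factor contributes at most ±0.1, yet together they produce the familiar bell-shaped spread around the true value.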

Let's see how we will denote the Gaussian quantities in the future:
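A common convention, which we adopt here: a Gaussian random variable `X` with mean `μ` and variance `σ²` is written as follows, with the corresponding density function:

```latex
X \sim \mathcal{N}(\mu, \sigma^2),
\qquad
f(x) = \frac{1}{\sigma\sqrt{2\pi}}
\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right).
```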

## Linear transformations of Gaussian vectors

The Gaussian distribution is preserved under linear transformations: if we apply a linear transformation to a Gaussian random variable (or vector), the output is again Gaussian, but with different parameters.
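A quick sketch of this property in the one-dimensional case (the parameter values are arbitrary): if `X ~ N(μ, σ²)`, then `Y = aX + b` is again Gaussian with mean `aμ + b` and standard deviation `|a|σ`.

```python
import numpy as np

rng = np.random.default_rng(1)

# X ~ N(mu, sigma^2); a linear transform Y = a*X + b is again Gaussian
# with mean a*mu + b and standard deviation |a|*sigma.
mu, sigma = 2.0, 3.0
a, b = -0.5, 10.0

x = rng.normal(mu, sigma, size=1_000_000)
y = a * x + b

print(y.mean())  # ≈ a*mu + b = 9.0
print(y.std())   # ≈ |a|*sigma = 1.5
```

The same holds for Gaussian vectors: if `X` is Gaussian and `A` is a matrix, `AX + b` is Gaussian with mean `Aμ + b` and covariance `AΣAᵀ`.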

## Uncorrelated Gaussian variables are independent

We know that correlation captures only linear dependence between variables: as a result, variables can be dependent yet uncorrelated. But for jointly Gaussian variables, zero correlation implies independence, which is another very useful property of the Gaussian distribution.
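The reason can be seen directly from the bivariate Gaussian density: setting the correlation `ρ = 0` makes the joint density factor into the product of the two marginal densities, which is exactly the definition of independence.

```latex
f(x, y) =
\frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}
\exp\!\left(
  -\frac{1}{2(1-\rho^2)}
  \left[
    \frac{(x-\mu_X)^2}{\sigma_X^2}
    - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y}
    + \frac{(y-\mu_Y)^2}{\sigma_Y^2}
  \right]
\right)
\xrightarrow{\;\rho = 0\;}
f_X(x)\, f_Y(y).
```

Note the "jointly Gaussian" condition matters: two variables that are each Gaussian but not jointly Gaussian can be uncorrelated yet dependent.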

## 3-sigma rule

The 3-sigma rule, also known as the empirical rule or the 68-95-99.7 rule, is a statistical guideline that states that for a normal distribution:

• Approximately `68%` of the data falls within one standard deviation (`σ`) of the mean (`μ`);
• Approximately `95%` of the data falls within two standard deviations (`2σ`) of the mean (`μ`);
• Approximately `99.7%` of the data falls within three standard deviations (`3σ`) of the mean (`μ`).

This rule can be very useful for detecting outliers in data that follows a Gaussian distribution.
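The percentages above are easy to check empirically (the standard-normal parameters below are just a convenient choice), and the same computation doubles as a simple 3-sigma outlier detector:

```python
import numpy as np

rng = np.random.default_rng(2)

mu, sigma = 0.0, 1.0
x = rng.normal(mu, sigma, size=1_000_000)

# Empirical fraction of samples within k standard deviations of the mean.
fracs = {k: float(np.mean(np.abs(x - mu) <= k * sigma)) for k in (1, 2, 3)}
print(fracs)  # ≈ {1: 0.683, 2: 0.954, 3: 0.997}

# Outlier detection with the 3-sigma rule: flag points outside mu ± 3σ.
outlier_rate = float(np.mean(np.abs(x - mu) > 3 * sigma))
print(outlier_rate)  # ≈ 0.003
```

In practice, `μ` and `σ` are unknown and are replaced by the sample mean and sample standard deviation before flagging points outside `μ ± 3σ`.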

Suppose you have three interdependent Gaussian random variables. Can we say that the sum of these three random variables will also be Gaussian?