Finding the Parameters | Simple Linear Regression

Linear Regression with Python

Finding the Parameters

We now know that Linear Regression is just the line that best fits the data. But how can you tell which line is the right one?

You can calculate the difference between the predicted value and the actual target value for each data point in the training set. These differences are called residuals (or errors), and the goal is to make them as small as possible.

Ordinary Least Squares

The default approach is the Ordinary Least Squares (OLS) method: take each residual, square it (mainly to eliminate its sign), and sum all the squares. The result is called the SSR (Sum of Squared Residuals), and the task is to find the parameters that minimize it.
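The residuals and the SSR are straightforward to compute directly. A minimal sketch with NumPy, using made-up sample data and an arbitrary candidate line:

```python
import numpy as np

# Hypothetical training data: feature x and target y
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.1, 6.2, 7.9])

# An arbitrary candidate line: y_hat = slope * x + intercept
slope, intercept = 2.0, 0.1
y_pred = slope * x + intercept

residuals = y - y_pred          # y_true - y_predicted
ssr = np.sum(residuals ** 2)    # Sum of Squared Residuals
print(round(ssr, 4))            # ≈ 0.05 for this data
```

Trying different slope and intercept values and keeping the pair with the smallest SSR is exactly what "finding the best line" means.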

Normal Equation

Fortunately, we do not need to try every possible line and calculate the SSR for each one. Minimizing the SSR has a mathematical solution that is not very computationally expensive. This solution is called the Normal Equation.

This equation gives us the parameters of the line with the least SSR. Don't worry if the underlying math looks complex: you don't have to calculate the parameters by hand. Many libraries have already implemented linear regression.
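For reference, with a feature matrix X (a column of ones for the intercept next to the feature column) and a target vector y, the Normal Equation is β = (XᵀX)⁻¹ Xᵀ y. A minimal sketch with NumPy, using made-up sample data:

```python
import numpy as np

# Hypothetical training data
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.1, 6.2, 7.9])

# Design matrix: a column of ones (for the intercept) next to the feature
X = np.column_stack([np.ones_like(x), x])

# Normal Equation: beta = (X^T X)^{-1} X^T y
beta = np.linalg.inv(X.T @ X) @ X.T @ y
intercept, slope = beta
print(intercept, slope)  # ≈ 0.2 and ≈ 1.95 for this data
```

In practice, `np.linalg.lstsq` or a library implementation is preferred over forming the inverse explicitly, since it is numerically more stable; the explicit formula is shown here only to make the idea concrete.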

So hop into the following chapters. They will show you how to build a linear regression model using those libraries.

Quiz

1. Consider the image above. Which regression line is better?
2. y_true - y_predicted is called



Section 1. Chapter 2