Supervised Learning Essentials

Linear Regression with Two Features

So far, we have looked at linear regression with only one feature, which is called simple linear regression. But in reality, the target usually depends on multiple features. Linear regression with more than one feature is called multiple linear regression.

Two-Feature Linear Regression Equation

In our example with heights, adding the mother's height as a feature to the model would likely improve our predictions. But how do we add a new feature to the model? Linear regression is defined by an equation, so we just need to add a new term to that equation:

$y_{\text{pred}} = \beta_0 + \beta_1 x_1 + \beta_2 x_2$

Where:

  • $\beta_0, \beta_1, \beta_2$ – the model's parameters;
  • $y_{\text{pred}}$ – the prediction of the target;
  • $x_1$ – the first feature value;
  • $x_2$ – the second feature value.
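To make the equation concrete, here is a minimal sketch of fitting it with scikit-learn. The heights below are made-up values for the fathers'/mothers'/children's heights used as a running example; scikit-learn's `LinearRegression` estimates $\beta_0$ (the `intercept_`) and $\beta_1, \beta_2$ (the `coef_`) for us.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: each row is [father's height, mother's height] in cm,
# and y is the child's height in cm.
X = np.array([
    [180, 165],
    [175, 160],
    [170, 158],
    [185, 170],
    [178, 163],
])
y = np.array([178, 172, 169, 182, 175])

model = LinearRegression()
model.fit(X, y)

# beta_0 is the intercept; beta_1 and beta_2 are the two coefficients.
print("beta_0:", model.intercept_)
print("beta_1, beta_2:", model.coef_)

# A prediction is exactly beta_0 + beta_1*x1 + beta_2*x2 for the new point.
prediction = model.predict([[176, 162]])[0]
print("prediction:", prediction)
```

Note that `predict` simply evaluates the equation above: the intercept plus the dot product of the coefficients with the feature values.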

Visualization

When we discussed the simple regression model, we built a 2D plot where one axis was the feature and the other was the target. Now that we have two features, we need two axes for the features and a third for the target. So we are moving from a 2D space to a 3D one, which is much harder to visualize. The video shows a 3D scatterplot of the dataset in our example.

But now, our equation is not an equation of a line. It is an equation of a plane. Here is a scatterplot along with the predicted plane.

You may have noticed that mathematically our equation hasn't become much harder. But unfortunately, the visualization has.
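A plot like the one described above can be sketched with matplotlib's 3D toolkit. The parameter values and data points below are made up for illustration; the surface is just the equation $\beta_0 + \beta_1 x_1 + \beta_2 x_2$ evaluated over a grid of feature values.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical fitted parameters beta_0, beta_1, beta_2.
b0, b1, b2 = 20.0, 0.5, 0.4

# Hypothetical data: two features (parents' heights) and a noisy target.
rng = np.random.default_rng(0)
x1 = rng.uniform(160, 190, 30)
x2 = rng.uniform(150, 180, 30)
y = b0 + b1 * x1 + b2 * x2 + rng.normal(0, 2, 30)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(x1, x2, y)  # the 3D scatterplot of the data

# The predicted plane: the equation evaluated over a grid.
g1, g2 = np.meshgrid(np.linspace(160, 190, 10), np.linspace(150, 180, 10))
ax.plot_surface(g1, g2, b0 + b1 * g1 + b2 * g2, alpha=0.3)

ax.set_xlabel("x1")
ax.set_ylabel("x2")
ax.set_zlabel("y")
fig.savefig("plane.png")
```

The key point the plot makes visible: with one feature the predictions lie on a line, but with two features they form a plane in 3D space.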


Section 1. Chapter 5
