Quadratic Regression
The Problem with Linear Regression

Before defining Polynomial Regression, let's look at a case that the Linear Regression we learned earlier doesn't handle well.

Here you can see that our simple linear regression model performs poorly. That is because it tries to fit a straight line to the data points, while a parabola would clearly be a much better fit.
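This mismatch is easy to demonstrate numerically. The sketch below (with illustrative toy data, not the lesson's dataset) fits both a line and a parabola to parabola-shaped points and compares their mean squared errors:

```python
import numpy as np

# Parabola-shaped toy data with a little noise (illustrative values)
x = np.linspace(-3, 3, 50)
y = x ** 2 + np.random.default_rng(0).normal(0, 0.3, x.size)

# Fit a degree-1 polynomial (a line) and a degree-2 polynomial (a parabola)
line = np.polyfit(x, y, 1)
parab = np.polyfit(x, y, 2)

# Compare the mean squared error of each fit
mse_line = np.mean((np.polyval(line, x) - y) ** 2)
mse_parab = np.mean((np.polyval(parab, x) - y) ** 2)
print(mse_line, mse_parab)  # the parabola's error is far smaller
```

No matter how the line's slope and intercept are chosen, a straight line cannot follow the curvature, so its error stays large.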

Quadratic Regression Equation

To build a straight-line model, we used the equation of a line: y = ax + b. So to build a parabolic model, we need the equation of a parabola, which is the quadratic equation: y = ax² + bx + c. Renaming a, b, and c to β gives us the Quadratic Regression Equation:

y_{\text{pred}} = \beta_0 + \beta_1 x + \beta_2 x^2

Where:

  • \beta_0, \beta_1, \beta_2 – the model's parameters;
  • y_{\text{pred}} – the prediction of the target;
  • x – the feature value.

The model this equation describes is called Quadratic Regression. Like before, we only need to find the best parameters for our data points.
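Making a prediction with this equation is a single evaluation of the polynomial. A minimal sketch, with illustrative parameter values (not fitted to any real data):

```python
import numpy as np

def predict_quadratic(x, beta0, beta1, beta2):
    """Evaluate y_pred = beta0 + beta1*x + beta2*x^2 for each value in x."""
    x = np.asarray(x, dtype=float)
    return beta0 + beta1 * x + beta2 * x ** 2

# Illustrative parameters: beta0=1, beta1=2, beta2=3
y = predict_quadratic([0.0, 1.0, 2.0], beta0=1.0, beta1=2.0, beta2=3.0)
print(y)  # [ 1.  6. 17.]
```

Fitting the model then means choosing the three β values that minimize the prediction error on the training points.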

Normal Equation and X̃

As always, the Normal Equation handles finding the best parameters, but we need to define X̃ properly.

We already know how to build the X̃ matrix for Multiple Linear Regression. It turns out the X̃ matrix for Polynomial Regression is constructed similarly: we can treat x² as a second feature, so we add a corresponding new column to X̃ that holds the same values as the previous column, but squared.

The video below shows how to build the X̃.
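The construction can be sketched in a few lines of NumPy. The toy data here is illustrative; X̃ gets a column of ones (for β₀), the feature, and the feature squared, and the Normal Equation β = (X̃ᵀX̃)⁻¹X̃ᵀy yields the parameters:

```python
import numpy as np

# Toy data roughly following a parabola (illustrative values)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.5, 1.2, 3.1, 6.8])

# Build X-tilde: a column of ones, the feature, and its square
X = np.column_stack([np.ones_like(x), x, x ** 2])

# Normal Equation: beta = (X^T X)^(-1) X^T y
beta = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta)  # [beta0, beta1, beta2]
```

In practice `np.linalg.lstsq(X, y, rcond=None)` is preferred over the explicit inverse for numerical stability, but it solves the same least-squares problem.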


Section 1. Chapter 10

