Polynomial Regression
In the previous chapter, we explored Quadratic Regression, whose graph is a parabola. In the same way, we could add an $x^3$ term to the equation to get Cubic Regression, which has a more complex graph. We could also add an $x^4$ term, and so on.
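For instance, adding the cubic term gives an equation of the following form (written with the same $\beta$ coefficients used throughout this chapter):

$$y_{\text{pred}} = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 x^3$$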
The Degree of a Polynomial Regression
In general, such an equation is called a polynomial equation, and it is the equation of Polynomial Regression. The highest power of $x$ in the equation defines the degree of the Polynomial Regression; the cubic equation above, for example, has degree 3.
N-Degree Polynomial Regression
Considering $n$ to be a whole number greater than two, we can write down the equation of an n-degree Polynomial Regression (a short code sketch follows the definitions below):
$$y_{\text{pred}} = \beta_0 + \beta_1 x + \beta_2 x^2 + \dots + \beta_n x^n$$

Where:

- $\beta_0, \beta_1, \beta_2, \dots, \beta_n$ are the model's parameters;
- $y_{\text{pred}}$ is the prediction of the target;
- $x$ is the feature value;
- $n$ is the degree of the Polynomial Regression.
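In practice, you can fit such a model by generating the polynomial terms and then fitting a linear model on them. Here is a minimal sketch, assuming NumPy and scikit-learn are available; the synthetic dataset and the chosen degree are made up purely for illustration:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Synthetic single-feature dataset (made up for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))                              # feature values
y = 1 - 2 * X[:, 0] + 0.5 * X[:, 0] ** 3 + rng.normal(0, 1, 100)   # target values

n = 3  # degree of the Polynomial Regression (assumed here)

# Build the polynomial terms x, x^2, ..., x^n.
# include_bias=False because LinearRegression fits the intercept beta_0 itself.
poly = PolynomialFeatures(degree=n, include_bias=False)
X_poly = poly.fit_transform(X)

model = LinearRegression().fit(X_poly, y)
print("beta_0:", model.intercept_)        # intercept
print("beta_1..beta_n:", model.coef_)     # coefficients of x, x^2, ..., x^n
```

The fitted coefficients are the least-squares solution, so they match the parameters given by the Normal Equation described next.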
Normal Equation
As always, the parameters are found using the Normal Equation (a NumPy sketch of this computation follows the definitions below):
$$\vec{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_n \end{bmatrix} = \left(\tilde{X}^T \tilde{X}\right)^{-1} \tilde{X}^T y_{\text{true}}$$

Where:

- $\beta_0, \beta_1, \dots, \beta_n$ are the model's parameters;
- $X$ is an array of feature values from the training set;
- $X^k$ is the element-wise $k$-th power of the $X$ array;
- $\tilde{X}$ is the design matrix whose columns are the element-wise powers $X^0, X^1, \dots, X^n$;
- $y_{\text{true}}$ is an array of target values from the training set.
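Here is a minimal NumPy sketch of this formula; the helper function name and the tiny dataset are made up for illustration, and `np.linalg.solve` is used instead of explicitly inverting the matrix (it yields the same $\vec{\beta}$ more stably):

```python
import numpy as np

def fit_polynomial_normal_equation(x, y_true, n):
    """Fit an n-degree Polynomial Regression via the Normal Equation.

    x      : 1-D array of feature values from the training set
    y_true : 1-D array of target values from the training set
    n      : degree of the Polynomial Regression
    """
    # Design matrix X~: columns are the element-wise powers x^0, x^1, ..., x^n
    X_tilde = np.column_stack([x ** k for k in range(n + 1)])
    # beta = (X~^T X~)^(-1) X~^T y_true, computed by solving the linear system
    beta = np.linalg.solve(X_tilde.T @ X_tilde, X_tilde.T @ y_true)
    return beta  # beta[0] = beta_0, ..., beta[n] = beta_n

# Example usage with a small made-up dataset
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_true = np.array([1.1, 0.9, 3.2, 9.1, 17.0])
print(fit_polynomial_normal_equation(x, y_true, n=2))
```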
Polynomial Regression with Multiple Features
To create even more complex shapes, you can use Polynomial Regression with more than one feature. But even with two features, a 2-degree Polynomial Regression has quite a long equation, as shown below.
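For example, with two features $x_1$ and $x_2$, the 2-degree equation already contains squared and interaction terms (the coefficient numbering below is just one possible ordering):

$$y_{\text{pred}} = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_1^2 + \beta_4 x_2^2 + \beta_5 x_1 x_2$$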
Most of the time, you won't need such a complex model. Simpler models (like Multiple Linear Regression) usually describe the data well enough, and they are much easier to interpret and visualize, as well as less computationally expensive.