Polynomial Regression
In the previous chapter, we explored quadratic regression, whose graph is a parabola. In the same way, we could add an x³ term to the equation to get Cubic Regression, which has a more complex graph. We could also add x⁴, and so on.
The Degree of a Polynomial Regression
In general, such an equation is called a polynomial equation, and it is the equation of Polynomial Regression. The highest power of x in the equation defines the degree of the Polynomial Regression. For example, y = β₀ + β₁x + β₂x² + β₃x³ is a 3-degree Polynomial Regression.
N-Degree Polynomial Regression
Taking n to be a whole number greater than two, we can write down the equation of an n-degree Polynomial Regression:
y_pred = β₀ + β₁x + β₂x² + ⋯ + βₙxⁿ

Where:
- β₀, β₁, β₂, …, βₙ – the model's parameters;
- y_pred – the prediction of the target;
- x – the feature value;
- n – the degree of the Polynomial Regression.
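The equation above can be sketched in code. This is a minimal example using scikit-learn, where `PolynomialFeatures` builds the columns 1, x, x², …, xⁿ and a plain linear regression then fits the parameters β₀, …, βₙ; the cubic used to generate the toy data is an assumption for illustration.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Toy data generated from a known cubic: y = 1 + 2x - x^2 + 0.5x^3
x = np.linspace(-3, 3, 50).reshape(-1, 1)
y = (1 + 2 * x - x**2 + 0.5 * x**3).ravel()

# Build the polynomial feature matrix with columns [1, x, x^2, x^3] (degree n = 3)
poly = PolynomialFeatures(degree=3)
X_tilde = poly.fit_transform(x)

# Linear regression on the expanded features = 3-degree Polynomial Regression.
# The intercept is already the 1-column, so fit_intercept is disabled.
model = LinearRegression(fit_intercept=False)
model.fit(X_tilde, y)
print(np.round(model.coef_, 2))  # recovers the parameters [1, 2, -1, 0.5]
```

Because the toy data is noiseless, the fitted parameters match the generating polynomial exactly.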
Normal Equation
And as always, parameters are found using the Normal Equation:
β = [β₀, β₁, …, βₙ]ᵀ = (X̃ᵀX̃)⁻¹X̃ᵀy_true

Where:
- β₀, β₁, …, βₙ – the model's parameters;
- X̃ – the matrix whose columns are 1, X, X², …, Xⁿ;
- X – an array of feature values from the training set;
- Xᵏ – the element-wise k-th power of the X array;
- y_true – an array of target values from the training set.
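The Normal Equation above can be computed directly with NumPy. This is a small sketch for a 2-degree case; the training data is made up for illustration. In practice `np.linalg.lstsq` is numerically safer than forming the inverse explicitly, but the code below mirrors the formula term by term.

```python
import numpy as np

# Training data: feature values and targets generated from y = 2 + 3x + x^2
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y_true = 2 + 3 * x + x**2

# Build X̃ with columns [1, x, x^2] (degree n = 2)
n = 2
X_tilde = np.vander(x, N=n + 1, increasing=True)

# Normal Equation: β = (X̃ᵀ X̃)⁻¹ X̃ᵀ y_true
beta = np.linalg.inv(X_tilde.T @ X_tilde) @ X_tilde.T @ y_true
print(np.round(beta, 2))  # recovers the parameters [2, 3, 1]
```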
Polynomial Regression with Multiple Features
To create even more complex shapes, you can use Polynomial Regression with more than one feature. But even with two features, a 2-degree Polynomial Regression already has quite a long equation.
Most of the time, you won't need such a complex model. Simpler models (like Multiple Linear Regression) usually describe the data well enough, and they are easier to interpret and visualize, and less computationally expensive.