Decision Boundary | Logistic Regression
Classification with Python
Decision Boundary

Let's plot the results of Logistic Regression. Consider the following two-feature example:
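The example dataset itself is not reproduced on this page; as a stand-in, a simple two-feature, two-class dataset can be generated like this (the use of `make_blobs` and the variable names are illustrative, not the course's actual data):

```python
import numpy as np
from sklearn.datasets import make_blobs

# Illustrative two-feature dataset with two classes
# (the course's original data is not reproduced here)
X, y = make_blobs(n_samples=100, centers=2, n_features=2,
                  cluster_std=1.0, random_state=42)

print(X.shape)        # (100, 2)
print(np.unique(y))   # [0 1]
```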

Once we have built a Logistic Regression model, we can plot its decision boundary. It shows the region assigned to each class: any new instance that falls inside a region is predicted to belong to that class. For example, here is the decision boundary of Logistic Regression applied to the data above.
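The plot referenced above is not shown on this page. A common way to draw such a decision boundary is to evaluate the fitted model on a grid of points and color each class's region; the following is a minimal sketch of that approach (the dataset is an illustrative stand-in):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

# Illustrative data; the course's actual dataset is not reproduced here
X, y = make_blobs(n_samples=100, centers=2, random_state=42)
model = LogisticRegression().fit(X, y)

# Evaluate the model on a grid covering the feature space
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.linspace(x_min, x_max, 200),
                     np.linspace(y_min, y_max, 200))
Z = model.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

# Color each class's region and overlay the training points
plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")
plt.xlabel("feature 1")
plt.ylabel("feature 2")
plt.show()
```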

We can see that the line perfectly separates the two classes here. When that happens, the dataset is called linearly separable. However, that is not always the case. What if the dataset looked like this:

Above is the decision boundary for a slightly different dataset. Here the data is not linearly separable, so the predictions made by Logistic Regression are imperfect.
Unfortunately, by default, Logistic Regression cannot produce more complex decision boundaries, so this is the best prediction we can get.
But remember that Logistic Regression is derived from Linear Regression, which has a solution to the problem of the model being too simple: Polynomial Regression. We can use its equation for calculating z to get a more complex decision boundary shape!

Just like in Polynomial Regression, we only need to apply a PolynomialFeatures transformer to our X. Here is the syntax:
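The original code block is not shown on this page. A minimal sketch of the idea, expanding X with `PolynomialFeatures` before fitting Logistic Regression (the dataset and variable names are illustrative):

```python
from sklearn.datasets import make_circles
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

# An illustrative dataset that is not linearly separable
X, y = make_circles(n_samples=200, noise=0.1, factor=0.4, random_state=42)

# Expand X with second-degree polynomial terms:
# 1, x1, x2, x1^2, x1*x2, x2^2
X_poly = PolynomialFeatures(degree=2).fit_transform(X)

# Fit Logistic Regression on the expanded features;
# the decision boundary in the original feature space is now a curve
model = LogisticRegression().fit(X_poly, y)
print(model.score(X_poly, y))
```

With the degree-2 expansion the model can represent circular boundaries, which a plain Logistic Regression on `X` cannot.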

It uses the equation of a second-degree polynomial regression. You can get even more complex decision boundaries by increasing the degree, but as shown in the next chapter, the model may then suffer from overfitting.

Section 2. Chapter 4