Multi-Class Classification
Multi-class classification with k-NN is as easy as binary classification. We just pick the class that prevails in the neighborhood.
The KNeighborsClassifier automatically performs multi-class classification if y contains more than two classes, so you do not need to change anything. The only thing that changes is the y variable fed to the .fit() method.
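To see this in action, here is a minimal sketch with a made-up toy dataset (not the one used in this chapter): fitting on a three-class y requires no extra arguments.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Toy data: two features, three classes (0, 1, 2) - values invented for illustration
X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0],
              [6.0, 9.0], [9.0, 1.0], [8.5, 0.5]])
y = np.array([0, 0, 1, 1, 2, 2])

# Nothing special is needed for more than two classes:
# the classifier handles them automatically
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)
print(knn.predict([[1.2, 1.9], [8.8, 0.7]]))  # e.g. [0 2]
```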
Now you will perform multi-class classification with k-NN yourself.
Consider the following dataset:
```python
import pandas as pd

df = pd.read_csv('https://codefinity-content-media.s3.eu-west-1.amazonaws.com/b71ff7ac-3932-41d2-a4d8-060e24b00129/starwars_multiple.csv')
print(df.head())
```
It is the same as in the previous chapter's example, but now the target can take three values:
- 0: "Hated it" (rating is less than 3/5);
- 1: "Meh" (rating between 3/5 and 4/5);
- 2: "Liked it" (rating is 4/5 or higher).
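If you were deriving this target from raw ratings yourself, the mapping could look like the sketch below; the rating values are made up for illustration.

```python
import pandas as pd

# Hypothetical 5-star ratings (invented values, not from the dataset above)
ratings = pd.Series([2.5, 3.0, 4.5, 3.9, 5.0])

def to_class(rating):
    if rating < 3:      # "Hated it"
        return 0
    elif rating < 4:    # "Meh"
        return 1
    else:               # "Liked it"
        return 2

y = ratings.apply(to_class)
print(y.tolist())  # [0, 1, 2, 1, 2]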
Before moving on, recall the classes you will use: the classifier itself and the scaler from the previous chapter. And now, let's move to classification!
Task
Perform a classification using the KNeighborsClassifier with n_neighbors equal to 13.

- Import the KNeighborsClassifier.
- Use the appropriate class to scale the data.
- Scale the data using .fit_transform() for the training data and .transform() for the new instances.
- Create the KNeighborsClassifier object and feed X_scaled and y to it.
- Predict the classes for the new instances (X_new_scaled).
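For reference, a possible solution could look like the sketch below. It assumes the training features X, the target y, and the new instances X_new are already available, and that StandardScaler is the scaling class from the previous chapter; neither detail is spelled out in the task above.

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Assumed to exist already: X (training features), y (target with classes 0, 1, 2),
# and X_new (new instances to classify)

# Scale the data: fit on the training data, then apply the same scaling to new instances
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)
X_new_scaled = scaler.transform(X_new)

# Create the classifier with 13 neighbors and feed it the scaled training data
knn = KNeighborsClassifier(n_neighbors=13)
knn.fit(X_scaled, y)

# Predict the classes for the new instances
y_pred = knn.predict(X_new_scaled)
print(y_pred)
```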