What is k-NN | k-NN Classifier
Classification with Python

What is k-NN


Let's start our classification adventure with the simplest task - binary classification. Suppose we want to classify sweets as cookies/not cookies based on a single feature: their weight.

A simple way to predict the class of a new instance is to look at its closest neighbor. In our example, we find the sweet whose weight is most similar to that of the new instance.

That is the idea behind k-Nearest Neighbors (k-NN) - we just look at the neighbors. The k-NN algorithm assumes that similar things exist in close proximity; in other words, similar things are near each other. The k in k-NN stands for the number of neighbors we consider when making a prediction.

In the example above, we considered only 1 neighbor, so it was 1-Nearest Neighbor. But usually, k is set to a bigger number, as looking only at one neighbor can be unreliable:
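The 1-NN case can be sketched in a few lines of plain Python. The weights and labels below are made-up illustration data, not values from the lesson:

```python
# Minimal 1-NN sketch. Each training sweet is (weight_in_grams, label);
# the numbers are hypothetical examples.
training = [(12.0, "cookie"), (15.0, "cookie"),
            (30.0, "not cookie"), (45.0, "not cookie")]

def predict_1nn(weight):
    # Find the training sweet whose weight is closest to the new instance
    # and return its label.
    nearest = min(training, key=lambda sweet: abs(sweet[0] - weight))
    return nearest[1]

print(predict_1nn(14.0))  # nearest sweet weighs 15.0 g -> "cookie"
```

A single neighbor decides everything here, which is exactly why one noisy training point can flip the prediction.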

If k (number of neighbors) is greater than one, we choose the most frequent class in the neighborhood as a prediction. Here is an example of predicting two new instances with k=3:

As you can see, changing k may lead to different predictions.
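The majority-vote step can be sketched as follows; again, the data points are hypothetical, chosen so that k=1 and k=3 disagree:

```python
from collections import Counter

# Hypothetical training data: (weight_in_grams, label).
training = [(12.0, "cookie"), (14.0, "not cookie"), (16.0, "cookie"),
            (30.0, "not cookie"), (35.0, "not cookie")]

def predict_knn(weight, k):
    # Take the k training instances closest to the new instance...
    neighbors = sorted(training, key=lambda sweet: abs(sweet[0] - weight))[:k]
    # ...and let the most frequent class among them win the vote.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

print(predict_knn(13.5, k=1))  # closest neighbor is 14.0 -> "not cookie"
print(predict_knn(13.5, k=3))  # neighbors 14.0, 12.0, 16.0 -> "cookie"
```

With k=1 the single nearest point (a "not cookie" at 14.0 g) decides the class; with k=3 the two surrounding cookies outvote it, so the same instance gets a different prediction.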

Note

Occasionally, k-NN produces a tie when multiple classes appear equally among the nearest neighbors. Most libraries, including scikit-learn, break ties by choosing the first class in their internal order - something to keep in mind, since it can subtly affect reproducibility and interpretation.
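A tie and one deterministic way to resolve it can be illustrated in plain Python. This is a sketch of the "first class in sorted order" rule described above, not scikit-learn's actual implementation:

```python
from collections import Counter

# Hypothetical setup: with k=2, the two nearest neighbors belong to
# different classes, so the vote ends 1-1.
training = [(10.0, "cookie"), (20.0, "not cookie")]

def predict_knn_tiebreak(weight, k):
    neighbors = sorted(training, key=lambda s: abs(s[0] - weight))[:k]
    votes = Counter(label for _, label in neighbors)
    top = max(votes.values())
    tied = [label for label, count in votes.items() if count == top]
    # Deterministic tie-break: choose the first tied class in sorted
    # order, mimicking a "first class in internal order" rule.
    return min(tied)

print(predict_knn_tiebreak(15.0, k=2))  # 1-1 tie -> "cookie" (sorts first)
```

Using an odd k for binary classification avoids this situation entirely, since two classes can never split an odd number of votes evenly.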


In the k-Nearest Neighbors algorithm, how is the class of a new instance predicted when k > 1?
