Recognizing Handwritten Digits

KNN
The K-Nearest Neighbors (KNN) algorithm is a supervised machine learning technique used mainly for classification. It classifies a new data point according to the categories of its closest neighbors in the training dataset.

For classification, the KNN classifier assigns a class to a new data point by finding its 'k' nearest neighbors in the training set, where 'k' is a user-defined parameter. The class of the new point is then determined by a majority vote among those 'k' neighbors.

Despite its simplicity and flexibility, KNN is computationally expensive on large datasets and requires careful selection of both the 'k' value and the distance metric. Nonetheless, it remains a widely used and effective tool for classification tasks in machine learning.
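
To make the majority-vote idea concrete, here is a minimal from-scratch sketch. It is not the code used in this chapter's exercise: the knn_predict helper and the tiny 2D dataset are invented purely for illustration. The sketch computes Euclidean distances from a new point to every training point, keeps the 'k' closest ones, and returns the most common label among them.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    # Euclidean distance from the new point to every training point
    distances = np.linalg.norm(X_train - x_new, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(distances)[:k]
    # Majority vote among the labels of those k neighbors
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Tiny illustrative dataset: two clusters of 2D points
X_train = np.array([[1, 1], [1, 2], [2, 1], [8, 8], [8, 9], [9, 8]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([1.5, 1.5]), k=3))  # 0
print(knn_predict(X_train, y_train, np.array([8.5, 8.5]), k=3))  # 1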

Task

  1. Initialize a K-Nearest Neighbors classifier with 4 neighbors.

  2. Train the classifier with the training data and the corresponding labels.

  3. Predict classes for the test set using the trained classifier.
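
A minimal sketch of how these three steps might look with scikit-learn's KNeighborsClassifier is shown below. The exercise provides its own training and test data, which are not shown on this page; here the scikit-learn digits dataset and a train_test_split are used as stand-ins so the sketch runs on its own.

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Stand-in data: the scikit-learn digits dataset split into train and test sets
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 1. Initialize a K-Nearest Neighbors classifier with 4 neighbors
knn = KNeighborsClassifier(n_neighbors=4)

# 2. Train the classifier with the training data and the corresponding labels
knn.fit(X_train, y_train)

# 3. Predict classes for the test set using the trained classifier
y_pred = knn.predict(X_test)

print(accuracy_score(y_test, y_pred))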

