AdaBoost Classifier

AdaBoost is an ensemble learning algorithm that focuses on improving the performance of weak learners. It works by iteratively training a sequence of weak classifiers on weighted versions of the training data. The final prediction is a weighted combination of the predictions made by these weak classifiers. AdaBoost assigns higher weights to the misclassified samples, allowing subsequent models to concentrate on the difficult-to-classify instances.

How Does AdaBoost Work?

  1. Initialize Weights: Assign equal weights to all training samples.
  2. Train Weak Classifier: Train a weak classifier on the training data using the current sample weights. The weak classifier aims to minimize the weighted error rate, where the weights emphasize misclassified samples.
  3. Compute Classifier Weight: Calculate the weight of the trained classifier based on its accuracy. Better classifiers are assigned higher weights.
  4. Update Sample Weights: Update the sample weights, giving higher weights to the misclassified samples from the current classifier.
  5. Repeat: Repeat steps 2-4 for a predefined number of iterations (or until a certain threshold is met).
  6. Final Prediction: Combine the predictions of all weak classifiers by summing their weighted votes. The class receiving the largest weighted vote becomes the final prediction.
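
The steps above can be sketched in a minimal NumPy implementation for the binary case, using single-feature decision stumps as the weak classifiers. This is an illustrative sketch, not scikit-learn's implementation; it assumes labels encoded as -1/+1:

```python
import numpy as np

def adaboost_train(X, y, n_rounds=20):
    """Train AdaBoost with decision stumps; y must contain -1/+1 labels."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # step 1: equal sample weights
    stumps = []
    for _ in range(n_rounds):
        # Step 2: pick the stump (feature, threshold, polarity)
        # with the lowest weighted error on the current weights.
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(X[:, j] <= t, pol, -pol)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, pol)
        err, j, t, pol = best
        err = np.clip(err, 1e-10, 1 - 1e-10)     # avoid division by zero
        # Step 3: classifier weight grows as its weighted error shrinks.
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] <= t, pol, -pol)
        # Step 4: boost the weights of misclassified samples, then normalize.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((alpha, j, t, pol))        # step 5: repeat
    return stumps

def adaboost_predict(stumps, X):
    # Step 6: sign of the weighted sum of stump votes.
    scores = sum(alpha * np.where(X[:, j] <= t, pol, -pol)
                 for alpha, j, t, pol in stumps)
    return np.sign(scores)
```

For example, on six one-dimensional points split at 2.5, a single stump already separates the classes perfectly, and the remaining rounds simply reinforce it.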

Example

We can use the AdaBoostClassifier class from scikit-learn to train an AdaBoost model and make predictions on real data:
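
A sketch of such a script, assuming scikit-learn's built-in breast-cancer dataset as the real data (the dataset and split parameters here are illustrative choices):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Load a real binary-classification dataset and split it
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Create an AdaBoost ensemble of 50 weak Logistic Regression estimators
base_model = LogisticRegression(max_iter=1000)
model = AdaBoostClassifier(base_model, n_estimators=50)

# Train the ensemble on the training data
model.fit(X_train, y_train)

# Predict on the test set and evaluate with a weighted F1 score
y_pred = model.predict(X_test)
print(f1_score(y_test, y_pred, average='weighted'))
```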

Code Description
  • Create the model: Initialize the classifier with the AdaBoostClassifier(base_model, n_estimators=50) constructor, which builds an ensemble of 50 weak Logistic Regression estimators.
  • Train the model: Fit it on the training data using the .fit(X_train, y_train) method.
  • Make predictions: Call the .predict(X_test) method of the trained AdaBoostClassifier on the testing data. The resulting y_pred contains the predicted labels for the test samples.
  • Calculate the F1 score: Evaluate the model's performance on the testing data with f1_score(y_test, y_pred, average='weighted'). The 'weighted' setting averages the per-class F1 scores, weighting each class by its share of the samples.
    You can find the official documentation, with all the necessary information about implementing this model in Python, on the scikit-learn website.

    Is the following statement true: AdaBoost assigns higher weights to the samples that were CORRECTLY classified?

    Section 3. Chapter 1