Ensemble Learning
AdaBoost Classifier
AdaBoost is an ensemble learning algorithm that focuses on improving the performance of weak learners. It works by iteratively training a sequence of weak classifiers on weighted versions of the training data. The final prediction is a weighted combination of the predictions made by these weak classifiers. AdaBoost assigns higher weights to the misclassified samples, allowing subsequent models to concentrate on the difficult-to-classify instances.
How Does AdaBoost Work?
- Initialize Weights: Assign equal weights to all training samples;
- Train Weak Classifier: Train a weak classifier on the training data using the current sample weights. The weak classifier aims to minimize the weighted error rate, where the weights emphasize misclassified samples;
- Compute Classifier Weight: Calculate the weight of the trained classifier based on its accuracy. Better classifiers are assigned higher weights;
- Update Sample Weights: Update the sample weights, giving higher weights to the misclassified samples from the current classifier;
- Repeat: Repeat steps 2-4 for a predefined number of iterations (or until a certain threshold is met);
- Final Prediction: Combine the predictions of all weak classifiers by summing their weighted votes. The class with the highest total weighted vote becomes the final prediction.
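The steps above can be sketched from scratch for a binary problem. This is a minimal illustration of the discrete (SAMME-style) update rule, not the full scikit-learn implementation; decision stumps serve as the weak learners, and the synthetic dataset and variable names are chosen just for this sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification

# Synthetic binary data, relabeled to -1/+1 so the weight update is simple
X, y = make_classification(n_samples=200, random_state=42)
y = np.where(y == 0, -1, 1)

n_rounds = 20
weights = np.full(len(X), 1 / len(X))  # Step 1: equal sample weights
stumps, alphas = [], []

for _ in range(n_rounds):
    # Step 2: train a weak classifier (a decision stump) on the weighted data
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=weights)
    pred = stump.predict(X)

    # Step 3: classifier weight from its weighted error (better = higher alpha)
    err = np.sum(weights[pred != y])
    alpha = 0.5 * np.log((1 - err) / (err + 1e-10))

    # Step 4: raise the weights of misclassified samples, then renormalize
    weights *= np.exp(-alpha * y * pred)
    weights /= weights.sum()

    stumps.append(stump)
    alphas.append(alpha)

# Step 6: final prediction is the sign of the weighted sum of stump votes
agg = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
final_pred = np.sign(agg)
print('Training accuracy:', np.mean(final_pred == y))
```

Note how the exponential update in step 4 leaves correctly classified samples with smaller weights, so each new stump focuses on the points the ensemble still gets wrong.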
Example
We can use the AdaBoostClassifier class from sklearn.ensemble to train an AdaBoost model and make predictions on real data:
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import f1_score

# Load the Iris dataset
data = load_iris()
X = data.data
y = data.target

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Create a Logistic Regression base model
base_model = LogisticRegression()

# Create and train the AdaBoost Classifier with Logistic Regression as base model
classifier = AdaBoostClassifier(base_model, n_estimators=50)
classifier.fit(X_train, y_train)

# Make predictions
y_pred = classifier.predict(X_test)

# Calculate F1 score
f1 = f1_score(y_test, y_pred, average='weighted')
print(f'F1 Score: {f1:.4f}')
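Because boosting adds one weak learner per iteration, scikit-learn's staged_predict method can show how accuracy evolves as estimators are added. A small sketch (the dataset and split mirror the example above; when no base model is passed, AdaBoostClassifier defaults to a depth-1 decision tree):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score

# Same Iris split as the example above
X_train, X_test, y_train, y_test = train_test_split(
    *load_iris(return_X_y=True), test_size=0.2, random_state=42)

# Default base estimator: a decision stump (DecisionTreeClassifier, max_depth=1)
clf = AdaBoostClassifier(n_estimators=50, random_state=42)
clf.fit(X_train, y_train)

# staged_predict yields test predictions after 1, 2, ..., n_estimators rounds
scores = [accuracy_score(y_test, pred) for pred in clf.staged_predict(X_test)]
best = max(scores)
print(f'Best accuracy: {best:.4f} with {scores.index(best) + 1} estimators')
```

This is a handy way to pick n_estimators: if accuracy plateaus early, the extra boosting rounds add cost without improving the model.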