Random Search with RandomizedSearchCV | Manual and Search-Based Tuning Methods
Hyperparameter Tuning Basics

Random Search with RandomizedSearchCV

Random search quickly finds good hyperparameter settings by sampling random combinations instead of checking every possibility. This approach is especially efficient for models with many parameters, where grid search is too slow. In scikit-learn, RandomizedSearchCV automates random sampling and model evaluation for you.

Note

Grid search vs. random search: grid search evaluates every combination of the specified hyperparameter values, so its cost grows multiplicatively with the number of parameters and can become very slow. Random search instead samples only a fixed number of random combinations, making it much faster in practice and often sufficient to find good results, especially when some hyperparameters matter far more than others.
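To make the cost difference concrete, here is a small sketch with a hypothetical search space of three hyperparameters, each with 10 candidate values. Grid search must fit a model for every combination (times the number of CV folds), while random search fits only a fixed budget of sampled candidates:

```python
from itertools import product

# Hypothetical search space: 3 hyperparameters, 10 candidate values each
n_estimators = list(range(10, 110, 10))       # 10 values
max_depth = list(range(2, 12))                # 10 values
min_samples_split = list(range(2, 12))        # 10 values

grid_candidates = len(list(product(n_estimators, max_depth, min_samples_split)))
random_candidates = 20   # a fixed sampling budget, e.g. n_iter=20

cv = 3  # 3-fold cross-validation multiplies every candidate by 3 fits
print("Grid search model fits:", grid_candidates * cv)      # 1000 candidates * 3 folds
print("Random search model fits:", random_candidates * cv)  # 20 candidates * 3 folds
```

With this (assumed) search space, grid search needs 3,000 model fits versus 60 for random search, a 50x difference that only widens as more hyperparameters are added.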

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV
from sklearn.datasets import load_iris
from scipy.stats import randint

# Load sample data
X, y = load_iris(return_X_y=True)

# Define the model
rf = RandomForestClassifier(random_state=42)

# Specify distributions for hyperparameters
param_distributions = {
    'n_estimators': randint(10, 200),
    'max_depth': randint(2, 20)
}

# Set up RandomizedSearchCV
random_search = RandomizedSearchCV(
    estimator=rf,
    param_distributions=param_distributions,
    n_iter=10,  # Number of random combinations to try
    cv=3,
    random_state=42
)

# Fit to data
random_search.fit(X, y)

# Show best parameters
print("Best parameters found:", random_search.best_params_)
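Beyond integer ranges, `param_distributions` also accepts continuous distributions, and after fitting you can inspect the cross-validated score and the refitted best model. A minimal sketch (the `max_features` range here is an illustrative assumption, not part of the lesson's example):

```python
from scipy.stats import uniform, randint
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# Continuous distributions work too:
# uniform(loc=0.1, scale=0.9) samples max_features from [0.1, 1.0)
param_distributions = {
    'n_estimators': randint(10, 200),
    'max_features': uniform(0.1, 0.9),
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_distributions,
    n_iter=5,   # small budget to keep this quick
    cv=3,
    random_state=42,
)
search.fit(X, y)

# Best mean cross-validated accuracy across the sampled candidates
print("Best CV accuracy:", round(search.best_score_, 3))

# By default RandomizedSearchCV refits the best model on the full data
best_model = search.best_estimator_
print("Predicted class of first sample:", best_model.predict(X[:1])[0])
```

Passing distributions rather than fixed lists is the main practical advantage here: each of the `n_iter` candidates draws a fresh value, so the search explores the continuous range instead of a hand-picked grid.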

In what scenario is random search likely to be more efficient than grid search?



Section 2. Chapter 3
