ML Introduction with scikit-learn

Tune Hyperparameters with RandomizedSearchCV

The idea behind RandomizedSearchCV is that it works the same way as GridSearchCV, but instead of trying all the combinations, it tries a randomly sampled subset.
For example, a param_grid like the one sketched below yields 100 combinations.
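
Here is a minimal sketch of such a grid, assuming a KNeighborsClassifier is being tuned (the actual parameters used in the course may differ):

    param_grid = {
        'n_neighbors': list(range(1, 26)),   # 25 values
        'weights': ['uniform', 'distance'],  # 2 values
        'p': [1, 2],                         # 2 values
    }
    # 25 * 2 * 2 = 100 combinations in total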

GridSearchCV would try all of them, which is time-consuming.
With RandomizedSearchCV, you can try only a randomly chosen subset of, say, 20 combinations.
This usually leads to a slightly worse result, but it is much faster.
You can control the number of combinations to be tested with the n_iter argument (set to 10 by default). Apart from that, working with RandomizedSearchCV is the same as with GridSearchCV.
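
As a minimal sketch (reusing the illustrative param_grid and KNeighborsClassifier from above; the estimator in the course may differ), the only new piece compared to GridSearchCV is n_iter:

    from sklearn.model_selection import RandomizedSearchCV
    from sklearn.neighbors import KNeighborsClassifier

    # Try only 20 randomly sampled combinations instead of all 100
    randomized = RandomizedSearchCV(KNeighborsClassifier(), param_grid, n_iter=20)

You can also pass random_state to RandomizedSearchCV if you want the sampling to be reproducible.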

Task

Your task is to build GridSearchCV and RandomizedSearchCV with 20 combinations and compare the results (a rough sketch of the overall shape follows the list below).

  1. Initialize the RandomizedSearchCV object. Pass the parameter grid and set the number of combinations to 20.
  2. Initialize the GridSearchCV object.
  3. Train both GridSearchCV and RandomizedSearchCV objects.
  4. Print the best estimator of grid.
  5. Print the best score of randomized.
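
A minimal sketch of how these steps could fit together, assuming a KNeighborsClassifier, a param_grid dictionary, and training data X, y are already defined (names are illustrative; the course template may differ):

    from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
    from sklearn.neighbors import KNeighborsClassifier

    # 1. Randomized search over 20 randomly sampled combinations
    randomized = RandomizedSearchCV(KNeighborsClassifier(), param_grid, n_iter=20)
    # 2. Grid search over the full grid
    grid = GridSearchCV(KNeighborsClassifier(), param_grid)
    # 3. Train both search objects
    randomized.fit(X, y)
    grid.fit(X, y)
    # 4. Best estimator found by the grid search
    print(grid.best_estimator_)
    # 5. Best cross-validation score of the randomized search
    print(randomized.best_score_)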

Note

You can try running the code several times and look at the difference between the two scores. Sometimes the scores are the same, because the best parameter combination happens to be among those sampled by RandomizedSearchCV.
