Challenge: Tuning Hyperparameters with RandomizedSearchCV

RandomizedSearchCV works like GridSearchCV, but instead of checking every hyperparameter combination, it evaluates only a random subset. For example, if a grid contains 100 combinations, GridSearchCV tests all of them, while RandomizedSearchCV can sample, say, 20 of them, controlled by n_iter. This makes tuning faster while usually finding a score close to the best.
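
To make the size difference concrete, here is a minimal sketch using scikit-learn's ParameterGrid and ParameterSampler helpers with an illustrative 100-combination grid (the parameter values are assumptions, not the challenge's exact grid):

```python
from sklearn.model_selection import ParameterGrid, ParameterSampler

# An illustrative grid: 25 * 2 * 2 = 100 combinations in total
param_grid = {
    "n_neighbors": list(range(1, 26)),   # 25 values
    "weights": ["uniform", "distance"],  # 2 values
    "p": [1, 2],                         # 2 values
}

# GridSearchCV would fit a model for every combination...
print(len(ParameterGrid(param_grid)))                      # 100

# ...while RandomizedSearchCV with n_iter=20 fits only a random sample of them
print(len(list(ParameterSampler(param_grid, n_iter=20))))  # 20
```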

Task


You have a preprocessed penguin dataset. Tune a KNeighborsClassifier using both search methods:

  1. Create param_grid with values for n_neighbors, weights, and p.
  2. Initialize RandomizedSearchCV(..., n_iter=20).
  3. Initialize GridSearchCV with the same grid.
  4. Fit both searches on X, y.
  5. Print the grid search’s .best_estimator_.
  6. Print the randomized search’s .best_score_.

Solution
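
One possible solution sketch, assuming X and y already hold the preprocessed penguin features and labels provided by the course environment, and that the concrete parameter values (which the task leaves open) look something like this:

```python
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.neighbors import KNeighborsClassifier

# X, y are assumed to be the preprocessed penguin features and labels
# provided by the course environment.

# 1. Parameter grid for the three hyperparameters named in the task
#    (the concrete value ranges here are illustrative)
param_grid = {
    "n_neighbors": list(range(1, 26)),
    "weights": ["uniform", "distance"],
    "p": [1, 2],
}

# 2. Randomized search over 20 sampled combinations
random_search = RandomizedSearchCV(KNeighborsClassifier(), param_grid, n_iter=20)

# 3. Grid search over the full grid
grid_search = GridSearchCV(KNeighborsClassifier(), param_grid)

# 4. Fit both searches on the data
random_search.fit(X, y)
grid_search.fit(X, y)

# 5. Best estimator found by the exhaustive grid search
print(grid_search.best_estimator_)

# 6. Best cross-validated score found by the randomized search
print(random_search.best_score_)
```

Leaving random_state unset in RandomizedSearchCV means a different subset of combinations is sampled on every run, which is why re-running the code (see the note below) can change its score.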

Note

Try running the code multiple times. RandomizedSearchCV may match the grid search score when it randomly samples the best hyperparameters.


Section 4. Chapter 8