Automatic Hyperparameter Tuning | Conclusion
Introduction to Neural Networks

Automatic Hyperparameter Tuning

Task

Rather than manually selecting specific values for our model's hyperparameters, Random Search (RandomizedSearchCV) can be a more efficient strategy for discovering the optimal settings. The concept is similar to GridSearchCV, but with one significant difference.

In the realm of neural networks, exhaustively searching through every possible combination of parameters, as GridSearchCV does, can be prohibitively time-consuming.

This is where Random Search shines. Instead of evaluating all parameter combinations, it samples a random subset of them, which often leads to faster and surprisingly effective results.

Here is an example of Random Search usage:
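The original code snippet is not shown here, so below is a minimal sketch of the idea. It uses scikit-learn's `MLPClassifier` as a stand-in for the course's neural-network model, and the dataset and parameter values are illustrative assumptions, not the course's actual setup:

```python
# Illustrative sketch: RandomizedSearchCV samples a random subset of the
# parameter grid instead of trying every combination like GridSearchCV.
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# A small synthetic dataset (placeholder for the course's data)
X, y = make_classification(n_samples=200, random_state=42)

# 6 possible combinations in total...
param_distributions = {
    "hidden_layer_sizes": [(16,), (32,), (64,)],
    "learning_rate_init": [0.01, 0.005],
}

# ...but only 4 of them are sampled and evaluated (n_iter=4)
search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, random_state=42),
    param_distributions=param_distributions,
    n_iter=4,
    cv=3,
    random_state=42,
)
search.fit(X, y)
print(search.best_params_)
```

After fitting, `search.best_params_` holds the best sampled combination and `search.best_score_` its cross-validated accuracy.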

Your task is:

  1. Generate values for two hidden layers, with the number of neurons ranging from 20 to 30 in steps of 2.
  2. Set the learning rate values to choose from. As we saw in the previous chapter, the model performs well with a learning rate around 0.01, so we can narrow the search to the values 0.02, 0.01, and 0.005.
  3. Generate 10 random values for the number of epochs, in the range from 10 to 50.
  4. Apply random search over 4 models (iterations).
  5. Evaluate the model.
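The steps above can be sketched as follows. This is a hedged outline, again with `MLPClassifier` standing in for the course's model: `max_iter` plays the role of "epochs" here, and the dataset is a synthetic placeholder:

```python
# Hypothetical walkthrough of the task steps with scikit-learn.
from itertools import product

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)

# 1. Neuron counts for two hidden layers: 20 to 30, step 2
neurons = list(range(20, 31, 2))          # [20, 22, 24, 26, 28, 30]
hidden_sizes = [(a, b) for a, b in product(neurons, neurons)]

# 2. Learning rates narrowed around 0.01
learning_rates = [0.02, 0.01, 0.005]

# 3. Ten random epoch values between 10 and 50
rng = np.random.default_rng(0)
epochs = rng.integers(10, 51, size=10).tolist()

params = {
    "hidden_layer_sizes": hidden_sizes,
    "learning_rate_init": learning_rates,
    "max_iter": epochs,                   # stands in for "epochs"
}

# 4. Random search over 4 sampled configurations
search = RandomizedSearchCV(
    MLPClassifier(random_state=0),
    param_distributions=params,
    n_iter=4,
    cv=3,
    random_state=0,
)
search.fit(X, y)

# 5. Evaluate the best model found
print("Best params:", search.best_params_)
print("CV accuracy:", round(search.best_score_, 3))
```

With a Keras model the same search would typically go through a scikit-learn wrapper so that `epochs` and the layer sizes become tunable parameters, but the structure of the search is identical.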


Section 3. Chapter 4