Introduction to Neural Networks

Challenge: Automatic Hyperparameter Tuning

Rather than manually selecting specific values for our model's hyperparameters, randomized search (RandomizedSearchCV) offers a more efficient way to find an optimal configuration. Unlike grid search (GridSearchCV), which systematically evaluates all possible combinations of hyperparameters, randomized search selects a random subset of these combinations. This approach significantly reduces computational cost while still yielding strong results.

For neural networks, where the number of possible hyperparameter combinations can be immense, exhaustively testing every option is often impractical. Randomized search circumvents this issue by randomly sampling a defined number of hyperparameter sets, balancing exploration and efficiency.

The key parameters of RandomizedSearchCV are listed below; a short sketch of how they fit together follows the list.
  • estimator: the model to optimize (e.g., MLPClassifier);

  • param_distributions: a dictionary where keys are hyperparameter names and values are lists of values to sample from;

  • n_iter: specifies how many random combinations should be tested. A higher value increases the chances of finding an optimal combination but requires more computation;

  • scoring: defines the evaluation metric (e.g., 'accuracy' for classification).
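A minimal sketch of how these parameters fit together, assuming an MLPClassifier on a small synthetic dataset (the data and the candidate values are illustrative, not taken from the course's code):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Synthetic data just to make the example runnable (illustrative only)
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

# Keys are hyperparameter names, values are the lists to sample from
param_distributions = {
    'hidden_layer_sizes': [(20, 20), (30, 30), (40, 40)],
    'learning_rate_init': [0.01, 0.005, 0.001],
}

search = RandomizedSearchCV(
    estimator=MLPClassifier(max_iter=200, random_state=42),
    param_distributions=param_distributions,
    n_iter=4,            # number of random combinations to evaluate
    scoring='accuracy',  # evaluation metric for classification
    random_state=42,
)
search.fit(X, y)

print(search.best_params_)
print(search.best_score_)
```

With only 4 of the 6 possible combinations evaluated here, the search trades a small chance of missing the best setting for a noticeably shorter run time.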

Task


  1. In param_distributions, generate values for two hidden layers, where each layer has the same number of neurons, ranging from 20 to 30 (inclusive) with a step of 2.
  2. In param_distributions, set the learning rate values to 0.02, 0.01, and 0.005.
  3. In param_distributions, generate 10 random values for the number of training epochs, ensuring they are within the range 10 to 50 (exclusive).
  4. Apply randomized search with 4 iterations (number of hyperparameter combinations to evaluate) and use accuracy as the evaluation metric.

Solution
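A possible sketch of the completed search, not necessarily the course's reference solution. It assumes the number of training epochs is controlled through MLPClassifier's max_iter parameter, and the synthetic X, y stand in for whatever dataset the exercise's starter code provides:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Stand-in data; the actual exercise uses the dataset from its starter code
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

param_distributions = {
    # 1. Two hidden layers of equal size: 20, 22, ..., 30 neurons
    'hidden_layer_sizes': [(n, n) for n in range(20, 31, 2)],
    # 2. Candidate learning rates
    'learning_rate_init': [0.02, 0.01, 0.005],
    # 3. Ten random epoch counts drawn from the range [10, 50)
    'max_iter': np.random.randint(10, 50, size=10),
}

# 4. Randomized search over 4 sampled combinations, scored by accuracy
random_search = RandomizedSearchCV(
    estimator=MLPClassifier(),
    param_distributions=param_distributions,
    n_iter=4,
    scoring='accuracy',
)
random_search.fit(X, y)

print(random_search.best_params_)
```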


Section 3. Chapter 3
