Challenge: Integrate Dropout and BatchNorm | Regularization Techniques
Optimization and Regularization in Neural Networks with Python


Task


You will extend a simple neural network by integrating Dropout and Batch Normalization. Your goal is to correctly insert these layers into the architecture and perform a forward pass.

You are given:

  • Input batch x
  • A partially defined network class
  • A forward method missing some components

Complete the following steps:

  1. Add a Dropout layer after the first fully connected layer.

  2. Add a BatchNorm layer immediately after Dropout.

  3. Complete the forward pass so that the data flows through:

    • Linear → ReLU → Dropout → BatchNorm → Linear

  4. Ensure Dropout is used only during training (PyTorch handles this automatically: nn.Dropout is active only while the module is in training mode, which you toggle with model.train() and model.eval()).

After execution, the script prints the network output.

Solution
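The platform's reference solution is not included on this page, so below is a minimal sketch of one possible completion. The layer order (Linear → ReLU → Dropout → BatchNorm → Linear) comes from the task; the class name, layer sizes, dropout probability, and the shape of the input batch x are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn

class DropoutBatchNormNet(nn.Module):
    """Forward flow: Linear -> ReLU -> Dropout -> BatchNorm -> Linear."""

    def __init__(self, in_features=10, hidden=32, out_features=2, p=0.5):
        # All sizes and the dropout rate p are hypothetical, not from the task.
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.dropout = nn.Dropout(p)        # step 1: Dropout after the first fully connected layer
        self.bn = nn.BatchNorm1d(hidden)    # step 2: BatchNorm immediately after Dropout
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        x = torch.relu(self.fc1(x))  # Linear -> ReLU
        x = self.dropout(x)          # zeroes activations only in training mode
        x = self.bn(x)               # normalizes features over the batch dimension
        return self.fc2(x)           # final Linear

x = torch.randn(8, 10)               # hypothetical input batch: 8 samples, 10 features
model = DropoutBatchNormNet()
model.train()                        # Dropout active; BatchNorm uses batch statistics
print(model(x))                      # the script prints the network output
```

For step 4, no extra code is needed: calling model.eval() before inference disables Dropout and switches BatchNorm to its running statistics, which is what "PyTorch handles this automatically" refers to.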

