Challenge: Integrate Dropout and BatchNorm | Regularization Techniques
Optimization and Regularization in Neural Networks with Python

Challenge: Integrate Dropout and BatchNorm

Task


You will extend a simple neural network by integrating Dropout and Batch Normalization. Your goal is to correctly insert these layers into the architecture and perform a forward pass.

You are given:

  • Input batch x
  • A partially defined network class
  • A forward method missing some components

Complete the following steps:

  1. Add a Dropout layer after the first fully connected layer.

  2. Add a BatchNorm layer immediately after Dropout.

  3. Complete the forward pass so that the data flows through:

    • Linear → ReLU → Dropout → BatchNorm → Linear

  4. Ensure Dropout is used only during training (PyTorch handles this automatically via its train/eval modes; see the snippet after this list).

After execution, the script prints the network output.
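On step 4: PyTorch toggles Dropout (and BatchNorm's choice of statistics) through the module's training mode rather than a flag on each call. A minimal self-contained illustration of this behavior; the tiny nn.Sequential here is just a stand-in, not the challenge's network:

import torch
import torch.nn as nn

# Tiny stand-in network: one linear layer followed by Dropout.
net = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))
x = torch.ones(2, 4)

net.train()                  # training mode: Dropout zeroes activations at random
print(net(x))

net.eval()                   # evaluation mode: Dropout is a no-op
with torch.no_grad():
    print(net(x))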

Solution
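The challenge's starter code is not reproduced on this page, so the following is a minimal sketch of one possible solution. The class name SimpleNet, the layer sizes (10 → 32 → 2), the dropout probability of 0.5, and the sample input batch are assumptions for illustration; only the layer ordering Linear → ReLU → Dropout → BatchNorm → Linear comes from the task itself:

import torch
import torch.nn as nn

class SimpleNet(nn.Module):  # class name assumed; the challenge provides its own
    def __init__(self, in_features=10, hidden=32, out_features=2):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)   # first fully connected layer
        self.dropout = nn.Dropout(p=0.5)            # step 1: Dropout after fc1
        self.bn = nn.BatchNorm1d(hidden)            # step 2: BatchNorm right after Dropout
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        # Step 3: Linear -> ReLU -> Dropout -> BatchNorm -> Linear
        x = torch.relu(self.fc1(x))
        x = self.dropout(x)
        x = self.bn(x)
        return self.fc2(x)

x = torch.randn(8, 10)   # placeholder input batch (the challenge supplies its own x)
model = SimpleNet()
print(model(x))          # the script prints the network output

Because no explicit mode is set, the model runs in its default training mode here, so Dropout is active; calling model.eval() before inference switches both Dropout and BatchNorm to their evaluation behavior, which is what step 4 refers to.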

