Challenge: Integrate Dropout and BatchNorm | Regularization Techniques
Optimization and Regularization in Neural Networks with Python
Section 3. Chapter 5

Task


You will extend a simple neural network by integrating Dropout and Batch Normalization. Your goal is to correctly insert these layers into the architecture and perform a forward pass.

You are given:

  • Input batch x
  • A partially defined network class (a representative sketch follows this list)
  • A forward method missing some components
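
For reference, the partially defined class typically resembles the sketch below. This is an assumption about the scaffold, not the course's exact code; the names SimpleNet, fc1, and fc2 and the layer sizes are illustrative.

```python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self, in_features=16, hidden=32, out_features=4):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)   # first fully connected layer
        self.fc2 = nn.Linear(hidden, out_features)  # output layer
        # missing: Dropout and BatchNorm layers (added in the steps below)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        # missing: Dropout and BatchNorm applications
        return self.fc2(x)
```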

Complete the following steps:

  1. Add a Dropout layer after the first fully connected layer.

  2. Add a BatchNorm layer immediately after Dropout.

  3. Complete the forward pass so that the data flows through:

    • Linear → ReLU → Dropout → BatchNorm → Linear

  4. Ensure Dropout is used only during training (PyTorch handles this automatically; see the note after this list).
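
On step 4: every PyTorch module carries a training flag toggled by model.train() and model.eval(), and nn.Dropout only zeroes activations while that flag is set. A quick standalone check (the training-mode output is random; the value in the comment is just one possible draw):

```python
import torch
import torch.nn as nn

dropout = nn.Dropout(p=0.5)
x = torch.ones(4)

dropout.train()     # training mode: each element is zeroed with probability p
print(dropout(x))   # e.g. tensor([2., 0., 2., 0.]) -- survivors scaled by 1/(1 - p)

dropout.eval()      # evaluation mode: Dropout is the identity
print(dropout(x))   # tensor([1., 1., 1., 1.])
```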

After execution, the script prints the network output.

Solution
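
A minimal sketch of one possible solution, assuming the illustrative SimpleNet scaffold above; the layer sizes, batch shape, and dropout probability p=0.5 are assumptions, not values prescribed by the course:

```python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self, in_features=16, hidden=32, out_features=4):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.dropout = nn.Dropout(p=0.5)    # step 1: Dropout after the first FC layer
        self.bn = nn.BatchNorm1d(hidden)    # step 2: BatchNorm immediately after Dropout
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        # step 3: Linear -> ReLU -> Dropout -> BatchNorm -> Linear
        x = torch.relu(self.fc1(x))
        x = self.dropout(x)
        x = self.bn(x)
        return self.fc2(x)

x = torch.randn(8, 16)   # input batch: 8 samples, 16 features
model = SimpleNet()
model.train()            # step 4: keeps Dropout active; model.eval() would disable it
print(model(x))          # the script prints the network output
```

Other orderings (for example, BatchNorm before the activation) are common in practice; this sketch follows the exact Linear → ReLU → Dropout → BatchNorm → Linear flow the task specifies.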


