Challenge: Integrate Dropout and BatchNorm
You will extend a simple neural network by integrating Dropout and Batch Normalization. Your goal is to insert these layers in the correct positions in the architecture and perform a forward pass.
You are given:
- Input batch x
- A partially defined network class
- A forward method missing some components
Complete the following steps:
1. Add a Dropout layer after the first fully connected layer.
2. Add a BatchNorm layer immediately after Dropout.
3. Complete the forward pass so that the data flows through: Linear → ReLU → Dropout → BatchNorm → Linear.
4. Ensure Dropout is used only during training (PyTorch handles this automatically when you switch the model between model.train() and model.eval()).
After execution, the script prints the network output.
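For reference, here is a minimal sketch of one possible solution. The layer sizes (16 input features, 64 hidden units, 10 outputs), the dropout probability of 0.5, and the random input batch are illustrative assumptions, not part of the exercise's starter code:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, in_features=16, hidden=64, out_features=10):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.dropout = nn.Dropout(p=0.5)    # Dropout after the first fully connected layer
        self.bn = nn.BatchNorm1d(hidden)    # BatchNorm immediately after Dropout
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        # Data flow: Linear → ReLU → Dropout → BatchNorm → Linear
        x = torch.relu(self.fc1(x))
        x = self.dropout(x)
        x = self.bn(x)
        return self.fc2(x)

model = Net()
model.train()               # dropout active; BatchNorm uses batch statistics
x = torch.randn(8, 16)      # hypothetical input batch: 8 samples, 16 features
print(model(x))             # prints the network output
```

Note that nn.Dropout and nn.BatchNorm1d both check the module's training flag internally, so calling model.eval() before inference disables dropout and makes BatchNorm use its running statistics; no extra branching is needed in forward.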