Challenge: Integrate Dropout and BatchNorm

Task


You will extend a simple neural network by integrating Dropout and Batch Normalization. Your goal is to correctly insert these layers into the architecture and perform a forward pass.

You are given:

  • Input batch x
  • A partially defined network class
  • A forward method missing some components
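
The actual starter code lives in the course editor and is not reproduced on this page. As a point of reference, a plausible skeleton might look like the following; the layer sizes (10 inputs, 16 hidden units, 2 outputs) and the batch shape are assumptions, not part of the task:

```python
import torch
import torch.nn as nn

# Hypothetical starter skeleton; the real one is provided in the exercise editor.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 16)   # first fully connected layer (sizes assumed)
        # TODO: add a Dropout layer here
        # TODO: add a BatchNorm layer here
        self.fc2 = nn.Linear(16, 2)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        # TODO: apply Dropout, then BatchNorm
        return self.fc2(x)

x = torch.randn(8, 10)  # input batch x (shape assumed)
```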

Complete the following steps:

  1. Add a Dropout layer after the first fully connected layer.

  2. Add a BatchNorm layer immediately after Dropout.

  3. Complete the forward pass so that the data flows through:

    • Linear → ReLU → Dropout → BatchNorm → Linear

  4. Ensure Dropout is active only during training; PyTorch handles this automatically through model.train() and model.eval() (see the snippet below).

After execution, the script prints the network output.
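
Step 4 relies on PyTorch's module modes: an nn.Dropout layer zeroes activations only after model.train(), and becomes a pass-through after model.eval(). A minimal illustration (tensor values chosen purely for demonstration):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(4, 3)

drop.train()    # training mode: each element is zeroed with probability p
print(drop(x))  # survivors are scaled by 1/(1-p), i.e. 2.0 here

drop.eval()     # evaluation mode: Dropout passes the input through unchanged
print(drop(x))  # identical to x
```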

Solution
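
The official solution is revealed in the course editor. The sketch below fills in the hypothetical skeleton from above under the same assumed sizes (10 inputs, 16 hidden units, 2 outputs) and an assumed dropout probability of 0.5:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 16)        # first fully connected layer (sizes assumed)
        self.dropout = nn.Dropout(p=0.5)    # step 1: Dropout after the first FC layer
        self.bn = nn.BatchNorm1d(16)        # step 2: BatchNorm immediately after Dropout
        self.fc2 = nn.Linear(16, 2)

    def forward(self, x):
        # step 3: Linear -> ReLU -> Dropout -> BatchNorm -> Linear
        x = torch.relu(self.fc1(x))
        x = self.dropout(x)
        x = self.bn(x)
        return self.fc2(x)

model = Net()
model.train()            # step 4: Dropout active only in training mode
x = torch.randn(8, 10)   # input batch x (shape assumed)
print(model(x))          # the script prints the network output
```

Note that Dropout before BatchNorm is the order this challenge prescribes; many architectures instead place BatchNorm before the activation, so treat the ordering here as an exercise constraint rather than a general recommendation.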
