Challenge: Implement Custom Optimizer Step | Optimization Algorithms in Practice
Optimization and Regularization in Neural Networks with Python

Challenge: Implement Custom Optimizer Step

Task


You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.

You are given a learnable weight w and a small dataset. The code already computes predictions and loss. Your goal is to manually perform one gradient descent step without using torch.optim.

Complete the missing parts:

  1. Compute the gradient of the loss with respect to w.
  2. Update w using SGD: $w \leftarrow w - lr \cdot \nabla_w \text{loss}$
  3. Reset the gradient stored in w.grad to avoid accumulation.

After the update, the code prints the updated weight and the loss value.

Solution
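Below is a minimal sketch of one way to complete the step, assuming an illustrative linear model with an MSE loss; the data values, the weight initialization, and the learning rate lr are assumptions for demonstration, not the exercise's exact starter code.

import torch

# Illustrative setup (values are assumptions, not the course's starter code)
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([2.0, 4.0, 6.0, 8.0])
w = torch.tensor(0.5, requires_grad=True)
lr = 0.01

# Forward pass: predictions and mean squared error loss
y_pred = w * x
loss = ((y_pred - y) ** 2).mean()

# 1. Compute the gradient of the loss with respect to w
loss.backward()

# 2. Manual SGD update: w <- w - lr * w.grad (no torch.optim)
with torch.no_grad():
    w -= lr * w.grad

# 3. Reset the stored gradient so it does not accumulate on the next step
w.grad.zero_()

print(f"Updated weight: {w.item():.4f}")
print(f"Loss: {loss.item():.4f}")

The torch.no_grad() context keeps the in-place update itself out of the autograd graph, and zeroing w.grad is needed because backward() accumulates gradients into .grad rather than overwriting it.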


Section 2. Chapter 4
