Challenge: Implement Custom Optimizer Step
You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.
You are given a learnable weight w and a small dataset. The code already computes predictions and loss.
Your goal is to manually perform one gradient descent step without using torch.optim.
Complete the missing parts:
- Compute the gradient of loss with respect to w.
- Update w using SGD: w ← w − lr · ∂loss/∂w.
- Reset the gradient stored in w.grad to avoid accumulation.
After the update, the code prints the updated weight and the loss value.
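Below is a minimal sketch of the full step, assuming a toy linear model y = w * x with a mean squared error loss; the variable names (x, y, w, lr) and the dataset are illustrative, and the actual starter code may differ.

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])          # targets for y = 2 * x (illustrative data)
w = torch.tensor(0.0, requires_grad=True)  # learnable weight
lr = 0.1                                   # learning rate

# Forward pass: predictions and mean squared error loss.
pred = w * x
loss = ((pred - y) ** 2).mean()

# 1. Compute d(loss)/dw; autograd stores it in w.grad.
loss.backward()

# 2. SGD update: w <- w - lr * grad. Wrapped in no_grad so the
#    update itself is not tracked by autograd.
with torch.no_grad():
    w -= lr * w.grad

# 3. Reset the stored gradient so it does not accumulate on the next step.
w.grad.zero_()

print(f"updated w: {w.item():.4f}, loss before update: {loss.item():.4f}")
```

The no_grad context matters: without it, the in-place update on a tensor that requires gradients would be recorded in the computation graph, which is exactly what torch.optim avoids internally.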