Challenge: Implement Custom Optimizer Step
You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.
You are given a learnable weight w and a small dataset. The code already computes predictions and loss.
Your goal is to manually perform one gradient descent step without using torch.optim.
Complete the missing parts:
- Compute the gradient of loss with respect to w.
- Update w using SGD: w ← w − lr · ∇_w loss.
- Reset the gradient stored in w.grad to avoid accumulation.
After the update, the code prints the updated weight and the loss value.
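A minimal sketch of what a completed solution might look like, assuming a simple linear model y_pred = w * x with a mean-squared-error loss; the dataset, the initial weight, and the learning rate lr are illustrative placeholders, not the exact starter code:

```python
import torch

# Illustrative stand-ins for the provided dataset and weight
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])
w = torch.tensor(0.5, requires_grad=True)  # learnable weight
lr = 0.1                                   # learning rate (assumed value)

# Forward pass: predictions and loss (given in the starter code)
y_pred = w * x
loss = ((y_pred - y) ** 2).mean()

# 1. Compute the gradient of loss with respect to w
loss.backward()

# 2. Update w using SGD: w <- w - lr * grad. The update is wrapped in
#    no_grad so that the in-place change is not tracked by autograd.
with torch.no_grad():
    w -= lr * w.grad

# 3. Reset the stored gradient so it does not accumulate on later steps
w.grad.zero_()

print(f"Updated weight: {w.item():.4f}")
print(f"Loss: {loss.item():.4f}")
```

Calling loss.backward() a second time without zeroing w.grad would add the new gradient onto the old one, which is why step 3 is required.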