Challenge: Implement Custom Optimizer Step
You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.
You are given a learnable weight w and a small dataset. The code already computes predictions and loss.
Your goal is to manually perform one gradient descent step without using torch.optim.
Complete the missing parts:
- Compute gradients of loss with respect to w.
- Update w using SGD: w ← w − lr · ∇_w loss.
- Reset the gradient stored in w.grad to avoid accumulation.
After the update, the code prints the updated weight and the loss value.
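Since the exercise's starter code is not reproduced here, the sketch below shows one way the full step could look. The dataset (x, y), the initial value of w, and the learning rate lr are illustrative assumptions, not the exercise's actual values.

```python
import torch

# Assumed setup (illustrative, not the exercise's actual values):
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([2.0, 4.0, 6.0, 8.0])     # targets follow y = 2x
w = torch.tensor(1.0, requires_grad=True)  # learnable weight
lr = 0.01                                  # learning rate

# Forward pass: predictions and mean squared error loss.
pred = w * x
loss = ((pred - y) ** 2).mean()

# Step 1: compute gradients of loss with respect to w.
loss.backward()

# Step 2: SGD update w <- w - lr * grad, done inside no_grad so the
# in-place update itself is not tracked by autograd.
with torch.no_grad():
    w -= lr * w.grad

# Step 3: reset the stored gradient so it does not accumulate
# across iterations.
w.grad.zero_()

print(f"Updated weight: {w.item():.4f}, loss: {loss.item():.4f}")
```

Resetting the gradient with w.grad.zero_() (or setting w.grad = None) matters because PyTorch accumulates gradients by default; this is the same bookkeeping that optimizer.zero_grad() would do for you when using torch.optim.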