Challenge: Implement Custom Optimizer Step
You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.
You are given a learnable weight w and a small dataset. The code already computes predictions and loss.
Your goal is to manually perform one gradient descent step without using torch.optim.
Complete the missing parts:
- Compute the gradient of loss with respect to w.
- Update w using SGD: w ← w − lr · ∇_w loss.
- Reset the gradient stored in w.grad to avoid accumulation.
After the update, the code prints the updated weight and the loss value.
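A minimal sketch of the full step is shown below. The dataset values, the initial weight, and the learning rate lr are illustrative assumptions, not the exercise's actual starter code:

```python
import torch

# Assumed toy dataset and learnable weight (placeholders for the starter code).
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([2.0, 4.0, 6.0, 8.0])
w = torch.tensor(0.5, requires_grad=True)
lr = 0.01

# Forward pass: predictions and mean squared error loss.
y_pred = w * x
loss = ((y_pred - y) ** 2).mean()

# 1. Compute the gradient of loss with respect to w.
loss.backward()

# 2. Update w using SGD: w <- w - lr * grad.
#    torch.no_grad() keeps the in-place update out of the autograd graph.
with torch.no_grad():
    w -= lr * w.grad

# 3. Reset the stored gradient so it does not accumulate on the next step.
w.grad.zero_()

print(f"Updated weight: {w.item():.4f}")
print(f"Loss: {loss.item():.4f}")
```

Wrapping the update in torch.no_grad() matters because an in-place change to a leaf tensor that requires gradients would otherwise raise an error. Zeroing w.grad afterward is needed because backward() accumulates gradients across calls; assigning w.grad = None works as well.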