Challenge: Implement Custom Optimizer Step
You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.
You are given a learnable weight w and a small dataset. The code already computes predictions and loss.
Your goal is to manually perform one gradient descent step without using torch.optim.
Complete the missing parts:
- Compute gradients of loss with respect to w.
- Update w using SGD: w ← w − lr ⋅ ∇w loss
- Reset the gradient stored in w.grad to avoid accumulation.
After the update, the code prints the updated weight and the loss value.
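Below is a minimal sketch of what the completed step might look like, assuming a simple linear model (y_pred = w * x) with a mean squared error loss; the exercise's actual starter code, variable names, and data may differ.

```python
import torch

# Hypothetical small dataset (the exercise provides its own).
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([2.0, 4.0, 6.0, 8.0])

# Learnable weight with gradient tracking enabled.
w = torch.tensor(0.5, requires_grad=True)
lr = 0.01

# Forward pass: predictions and loss (already provided in the exercise).
y_pred = w * x
loss = ((y_pred - y) ** 2).mean()

# 1. Compute gradients of loss with respect to w.
loss.backward()

# 2. Manual SGD update: w <- w - lr * grad.
#    Wrap in no_grad so the update itself is not recorded by autograd.
with torch.no_grad():
    w -= lr * w.grad

# 3. Reset the stored gradient to avoid accumulation on the next step.
w.grad.zero_()

print(f"Updated weight: {w.item():.4f}")
print(f"Loss before update: {loss.item():.4f}")
```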