Challenge: Implement Custom Optimizer Step | Optimization Algorithms in Practice
Optimization and Regularization in Neural Networks with Python

Challenge: Implement Custom Optimizer Step

Task


You will implement a custom optimizer step (manual SGD update) using PyTorch autograd.

You are given a learnable weight w and a small dataset. The code already computes predictions and the loss. Your goal is to perform one gradient descent step manually, without using torch.optim.

Complete the missing parts:

  1. Compute gradients of loss with respect to w.
  2. Update w using SGD: $w \leftarrow w - lr \cdot \nabla_w \text{loss}$
  3. Reset the gradient stored in w.grad to avoid accumulation.

After the update, the code prints the updated weight and the loss value.
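The three steps above can be sketched as follows. This is a minimal illustration, not the challenge's actual starter code: the dataset, the linear model `pred = w * x`, and the learning rate are assumptions made for the example.

```python
import torch

# Hypothetical setup: a tiny 1-D dataset and a single learnable weight.
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([2.0, 4.0, 6.0, 8.0])

w = torch.tensor(0.5, requires_grad=True)  # learnable weight
lr = 0.01                                  # learning rate

# Forward pass: predictions and mean squared error loss.
pred = w * x
loss = ((pred - y) ** 2).mean()

# 1. Compute gradients of loss with respect to w.
loss.backward()

# 2. Update w using SGD: w <- w - lr * grad.
#    Wrapped in torch.no_grad() so autograd does not track the update.
with torch.no_grad():
    w -= lr * w.grad

# 3. Reset the stored gradient to avoid accumulation on the next step.
w.grad.zero_()

print(f"Updated weight: {w.item():.4f}, loss: {loss.item():.4f}")
```

Note that the in-place update must happen inside `torch.no_grad()`; otherwise PyTorch raises an error for modifying a leaf tensor that requires gradients.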

Solution


Section 2. Chapter 4
