Multi-Step Backpropagation

Like TensorFlow, PyTorch allows you to build more complex computational graphs involving multiple intermediate tensors.

import torch

# Create a 2D tensor with gradient tracking
x = torch.tensor([[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]], requires_grad=True)

# Define intermediate layers
y = 6 * x + 3
z = 10 * y ** 2

# Compute the mean of the final output
output_mean = z.mean()
print(f"Output: {output_mean}")

# Perform backpropagation
output_mean.backward()

# Print the gradient of x
print("Gradient of x:\n", x.grad)

The gradient of output_mean with respect to x is computed using the chain rule. The result shows how much a small change in each element of x affects output_mean.
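
Because each element of z depends only on the corresponding element of x, you can verify this gradient by hand. Substituting y = 6x + 3 gives output_mean = mean(10(6x + 3)²), and since the mean is taken over the 6 elements, the chain rule yields ∂output_mean/∂x = (1/6) · 20(6x + 3) · 6 = 20(6x + 3) per element. A minimal sketch comparing this manual result against autograd:

import torch

x = torch.tensor([[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]], requires_grad=True)
output_mean = (10 * (6 * x + 3) ** 2).mean()
output_mean.backward()

# Chain rule by hand: d/dx [mean(10(6x + 3)^2)] = 20 * (6x + 3) per element
manual_grad = 20 * (6 * x.detach() + 3)
print(torch.allclose(x.grad, manual_grad))  # True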

Disabling Gradient Tracking

In some cases, you may want to disable gradient tracking to save memory and computation. Since requires_grad=False is the default behavior, you can simply create the tensor without specifying this parameter, as in the minimal example below:
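
import torch

# requires_grad defaults to False, so no gradient tracking occurs
a = torch.tensor([1.0, 2.0, 3.0])
print(a.requires_grad)  # False

For tensors that already have requires_grad=True, the torch.no_grad() context manager can also be used to temporarily disable tracking.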

Task


You are tasked with building a simple neural network in PyTorch. Your goal is to compute the gradient of the loss with respect to the weight matrix.

  1. Define a random weight matrix (tensor) W of shape 1x3 initialized with values from a uniform distribution over [0, 1], with gradient tracking enabled.
  2. Create an input matrix (tensor) X based on this list: [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]].
  3. Perform matrix multiplication between W and X to calculate Y.
  4. Compute the mean squared error (MSE): loss = mean((Y − Y_target)²).
  5. Calculate the gradient of the loss with respect to W using backpropagation.
  6. Print the computed gradient of W.

Solution
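
One possible solution is sketched below. The task does not specify the target tensor, so the Y_target values here are hypothetical placeholders:

import torch

# 1. Weight matrix W (1x3), uniform over [0, 1), with gradient tracking
W = torch.rand(1, 3, requires_grad=True)

# 2. Input matrix X (3x2)
X = torch.tensor([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# 3. Matrix multiplication: Y has shape 1x2
Y = W @ X

# 4. Mean squared error against the hypothetical target
Y_target = torch.tensor([[10.0, 12.0]])
loss = ((Y - Y_target) ** 2).mean()

# 5. Backpropagation
loss.backward()

# 6. Gradient of the loss with respect to W
print("Gradient of W:\n", W.grad)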


Section 2. Chapter 2