PyTorch Essentials
Creating Random Tensors
Random tensors are useful for initializing data or weights in machine learning models, with weight initialization being the most common use case.
Random Uniform Tensors
The torch.rand() function is used to create a tensor with random values drawn from a uniform distribution between 0 (inclusive) and 1 (exclusive). As with the zeros() and ones() functions, the arguments specify the shape of the tensor.
import torch

# Create a 6x8 tensor with random values between 0 and 1
random_tensor = torch.rand(6, 8)
print(random_tensor)
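If you need uniform values over a different range, one common pattern is to rescale the output of torch.rand(). The snippet below is a minimal sketch of that idea; the bounds -1 and 1 are arbitrary choices for illustration.

import torch

# torch.rand() gives values in [0, 1); rescale them to the range [-1, 1)
low, high = -1.0, 1.0
scaled_tensor = low + (high - low) * torch.rand(6, 8)
print(scaled_tensor.min(), scaled_tensor.max())  # both fall within [-1, 1)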
Random Normal Tensors
The torch.randn() function is used to create a tensor with random values drawn from a standard normal distribution (mean = 0, standard deviation = 1).
import torch

# Create a 2x2 tensor with random values from a normal distribution
normal_tensor = torch.randn(2, 2)
print(normal_tensor)
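To convince yourself that the values follow a standard normal distribution, you can draw a large sample and inspect its statistics; the sample size below is an arbitrary choice.

import torch

# With many samples, the mean is close to 0 and the standard deviation close to 1
large_normal = torch.randn(100_000)
print(large_normal.mean())  # approximately 0
print(large_normal.std())   # approximately 1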
Random Integer Tensors
The torch.randint() function is used to create a tensor with random integer values drawn from a discrete uniform distribution. The first two parameters of this function (low, which is 0 by default, and high) specify the range of values (from low inclusive to high exclusive). The next parameter specifies the shape of the tensor as a tuple.
import torch

# Create a 4x3 tensor with random integers between 0 and 10
integer_tensor = torch.randint(0, 10, (4, 3))
print(integer_tensor)
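Since low defaults to 0, the same range can also be produced by passing only the upper bound and the shape:

import torch

# low defaults to 0, so this draws integers from 0 (inclusive) to 10 (exclusive)
integer_tensor = torch.randint(10, (4, 3))
print(integer_tensor)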
Setting Random Seed
To ensure reproducibility, you can set a manual seed. This fixes the random numbers generated so they are the same each time you run the code.
import torch

# Set the random seed
torch.manual_seed(42)

# Create a 2x3 tensor with random values
seeded_tensor = torch.rand(2, 3)
print(seeded_tensor)
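Resetting the same seed before generating again produces an identical tensor, which you can check with torch.equal():

import torch

# The same seed produces the same sequence of random numbers
torch.manual_seed(42)
first = torch.rand(2, 3)

torch.manual_seed(42)
second = torch.rand(2, 3)

print(torch.equal(first, second))  # True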
Practical Use Cases for Random Tensors
- Weight initialization: random tensors are often used to initialize weights in neural networks;
- Simulating data: generate random datasets for testing and experimentation;
- Random sampling: use random tensors for tasks like dropout and noise addition in models.
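As a quick illustration of these use cases, the sketch below initializes a small weight matrix and adds Gaussian noise to simulated data; the shapes and scaling factors are arbitrary choices for this example.

import torch

# Weight initialization: small random values for a 784 -> 128 linear layer
weights = torch.randn(784, 128) * 0.01

# Simulating data: a fake batch of 32 samples with 784 features each
fake_batch = torch.rand(32, 784)

# Noise addition: perturb the simulated data with Gaussian noise
noisy_batch = fake_batch + 0.1 * torch.randn(32, 784)

print(weights.shape, fake_batch.shape, noisy_batch.shape)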