Forward Propagation | Neural Network from Scratch

Forward Propagation

Forward Propagation Process

Forward propagation is the process by which a neural network computes its output given an input. This is achieved by successively passing the input through all the layers of the network.

Each layer transforms its input data based on its weights, biases, and the activation function, producing an output. This output becomes the input to the next layer, and the process repeats until the final layer produces the network's output.
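
To make this concrete, here is a minimal sketch of one dense layer's transformation in NumPy. The Layer class, its random initialization, and the sigmoid activation are illustrative assumptions, not necessarily the exact implementation used in this course:

import numpy as np

def sigmoid(z):
    # Squash each raw weighted sum into the (0, 1) range.
    return 1 / (1 + np.exp(-z))

class Layer:
    def __init__(self, n_inputs, n_neurons):
        # Small random weights and zero biases as an untrained starting point.
        self.weights = 0.1 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # Weighted sum of inputs plus bias, passed through the activation.
        return sigmoid(inputs @ self.weights + self.biases)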

Here's a step-by-step breakdown for our perceptron; a sketch of the complete pass follows the list:

  1. Hidden Layer 1: The raw inputs are passed into the first hidden layer (layer1), producing layer1_outputs;

  2. Hidden Layer 2: The outputs from the first hidden layer become inputs for the second hidden layer (layer2), resulting in layer2_outputs;

  3. Output Layer: Similarly, the outputs of the second hidden layer serve as inputs for the final output layer (layer3). The output of this layer is the output of the entire neural network.
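
Putting the three steps together, the whole pass might look like the following sketch, which assumes the hypothetical Layer class above and reuses the layer names from the breakdown:

def forward(inputs, layer1, layer2, layer3):
    # Step 1: raw inputs through the first hidden layer.
    layer1_outputs = layer1.forward(inputs)
    # Step 2: first hidden layer's outputs into the second hidden layer.
    layer2_outputs = layer2.forward(layer1_outputs)
    # Step 3: second hidden layer's outputs into the output layer.
    return layer3.forward(layer2_outputs)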

Forward Propagation Testing

To test our forward propagation, we will feed the network the inputs of the XOR operation and see what outputs it produces. XOR takes two binary inputs and returns False (0) if the inputs are the same, or True (1) if they are different.

Here is the truth table with two inputs and the corresponding output for XOR:

Input 1 | Input 2 | Output
0       | 0       | 0
0       | 1       | 1
1       | 0       | 1
1       | 1       | 0

We haven't trained our perceptron yet, so we don't expect it to give us correct answers. We just want to see that the input successfully passes through all the layers of the perceptron.
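
As a rough illustration, such a smoke test could feed all four XOR input pairs through the network and print the raw outputs. The layer sizes below are arbitrary, and the code relies on the hypothetical Layer class and forward function sketched earlier:

import numpy as np

# Hypothetical untrained network: 2 inputs, two hidden layers, 1 output.
layer1 = Layer(2, 4)
layer2 = Layer(4, 4)
layer3 = Layer(4, 1)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    output = forward(np.array([[x1, x2]]), layer1, layer2, layer3)
    # Any value in (0, 1) is fine here; we only check the data flows through.
    print(f'{x1} xor {x2} ->', round(float(output[0, 0]), 4))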

Note

You can try the XOR operation yourself using the ^ operator in Python.

print('0 xor 0 =', 0 ^ 0)  # 0
print('0 xor 1 =', 0 ^ 1)  # 1
print('1 xor 0 =', 1 ^ 0)  # 1
print('1 xor 1 =', 1 ^ 1)  # 0
Task

Implement the forward propagation function for our perceptron:

  1. Pass the inputs through the first hidden layer.
  2. Pass the outputs of the first hidden layer through the second hidden layer.
  3. Pass the outputs of the second hidden layer through the output layer.
