Forward Propagation | Neural Network from Scratch
Introduction to Neural Networks

Forward Propagation

Forward Propagation Process

Forward propagation is the process by which a neural network computes its output given an input. This is achieved by successively passing the input through all the layers of the network.

Each layer transforms its input data based on its weights, biases, and the activation function, producing an output. This output becomes the input to the next layer, and the process repeats until the final layer produces the network's output.

Here's a step-by-step breakdown for our perceptron:

  1. Hidden Layer 1: The raw inputs are passed into the first hidden layer (layer1), producing layer1_outputs;
  2. Hidden Layer 2: The outputs from the first hidden layer become inputs for the second hidden layer (layer2), resulting in layer2_outputs;
  3. Output Layer: Similarly, the outputs of the second hidden layer serve as inputs for the final output layer (layer3). The output of this layer is the output of the entire neural network.
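The three steps above can be sketched in code. This is a minimal illustration, not the course's actual implementation: the `Layer` class, the layer sizes (2-3-3-1), and the sigmoid activation are all assumptions for the demo.

```python
import numpy as np

# Hypothetical Layer class (names and sizes are assumptions, not the course code):
# each layer holds random weights and biases and applies a sigmoid activation.
class Layer:
    def __init__(self, n_inputs, n_neurons):
        self.weights = np.random.randn(n_neurons, n_inputs)
        self.biases = np.random.randn(n_neurons, 1)

    def forward(self, inputs):
        # Weighted sum plus bias, then sigmoid activation
        z = self.weights @ inputs + self.biases
        return 1 / (1 + np.exp(-z))

# Two hidden layers and one output layer, matching the breakdown above
layer1 = Layer(2, 3)
layer2 = Layer(3, 3)
layer3 = Layer(3, 1)

def forward(inputs):
    layer1_outputs = layer1.forward(inputs)          # Hidden Layer 1
    layer2_outputs = layer2.forward(layer1_outputs)  # Hidden Layer 2
    return layer3.forward(layer2_outputs)            # Output Layer

output = forward(np.array([[0], [1]]))
print(output.shape)  # (1, 1): a single output value for one input sample
```

Note how each layer's output is fed directly into the next layer's `forward` call; the chain of calls is the entire forward pass.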

Forward Propagation Testing

To test our forward propagation, we will use the XOR operation and see what outputs we get for it. The XOR operation takes two binary inputs and returns False (equal to 0) if the inputs are the same, or True (equal to 1) if they are different.

Here is the truth table with two inputs and the corresponding output for XOR:

Input 1  Input 2  Output
0        0        0
0        1        1
1        0        1
1        1        0

We haven't trained our perceptron yet, so we don't expect it to give us correct answers. We just want to see that the input successfully passes through all the layers of the perceptron.
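A quick sanity check along these lines could loop over all four XOR input pairs and print the network's output for each. The weight shapes (a 2-3-3-1 network), the sigmoid activation, and the fixed random seed here are assumptions made so the sketch is self-contained and reproducible.

```python
import numpy as np

np.random.seed(0)  # reproducible random weights (an assumption for the demo)

# Random, untrained weights and biases for an assumed 2-3-3-1 network
W1, b1 = np.random.randn(3, 2), np.random.randn(3, 1)
W2, b2 = np.random.randn(3, 3), np.random.randn(3, 1)
W3, b3 = np.random.randn(1, 3), np.random.randn(1, 1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(x):
    a1 = sigmoid(W1 @ x + b1)     # first hidden layer
    a2 = sigmoid(W2 @ a1 + b2)    # second hidden layer
    return sigmoid(W3 @ a2 + b3)  # output layer

# Pass every row of the XOR truth table through the network
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    out = forward(np.array([[x1], [x2]]))
    print(f"({x1}, {x2}) -> {out[0, 0]:.4f}")
```

Since the weights are random, the printed values will not match the XOR truth table; the point is only that each input flows through all three layers and yields a single number between 0 and 1.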

Note

You can try the XOR operation yourself using the ^ operator in Python.
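For instance, applying ^ to each row of the truth table reproduces the outputs shown above:

```python
# XOR via Python's bitwise ^ operator on the four truth-table inputs
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} ^ {b} = {a ^ b}")
# prints: 0 ^ 0 = 0, 0 ^ 1 = 1, 1 ^ 0 = 1, 1 ^ 1 = 0
```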

Task

Implement the forward propagation function for our perceptron:

  1. Pass the inputs through the first hidden layer.
  2. Pass the outputs of the first hidden layer through the second hidden layer.
  3. Pass the outputs of the second hidden layer through the output layer.

Section 2. Chapter 3