Challenge: Creating a Neural Network Layer

Single Neural Network Layer

In a basic feed-forward neural network, the output of a neuron in a layer is calculated using the weighted sum of its inputs, passed through an activation function. This can be represented as:

output = activation(inputs * weights + bias)

Where:

  • weights: A matrix of the weights associated with the connections to each neuron;
  • inputs: A matrix (here a row vector) of the input values, with one row per sample;
  • bias: A bias value added to each neuron's weighted sum;
  • activation: An activation function, such as the sigmoid function.

To achieve the best performance, all calculations are performed on matrices, and we will approach this task in the same way.
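
For instance, a minimal TensorFlow sketch of this formula might look as follows (the variable names and numeric values here are purely illustrative, not the exercise data):

import tensorflow as tf

# Illustrative data: one sample with 3 inputs and a layer of 2 neurons
inputs = tf.constant([[0.5, -1.2, 2.0]])      # shape (1, 3): one row per sample
weights = tf.constant([[0.1, 0.4],
                       [-0.3, 0.2],
                       [0.8, -0.5]])           # shape (3, 2): one column per neuron
bias = tf.constant([0.05, -0.1])               # one bias value per neuron

# output = activation(inputs * weights + bias)
output = tf.sigmoid(tf.matmul(inputs, weights) + bias)
print(output)                                  # shape (1, 2): one activation per neuron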

Task

Given the weights, inputs, and bias for a single layer of neurons, compute its output using matrix multiplication and the sigmoid activation function. Consider a layer with 3 inputs and 2 neurons, processing a single batch that contains just one sample.

  1. Determining Shapes:

    • The first dimension of the input matrix I should represent the number of samples in the batch. Given one sample with 3 inputs, its shape will be 1x3.
    • The columns of the weight matrix W should hold the input weights of each neuron, so for 2 neurons with 3 inputs the expected shape is 3x2. The given matrix doesn't have this shape, so you need to transpose it.
  2. Matrix Multiplication:

    • With the matrices in the correct shape, perform the matrix multiplication.
    • Recall that in matrix multiplication, the output is derived from the dot product of each row of the first matrix with each column of the second matrix. Ensure you multiply in the right order.
  3. Bias Addition:

    • Perform an element-wise addition of the bias to the result of the matrix multiplication.
  4. Applying Activation:

    • Apply the sigmoid activation function to the result of the bias addition to obtain the layer's output.
    • TensorFlow provides the sigmoid function as tf.sigmoid(). A sketch combining all four steps is shown after this list.
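
Putting the four steps together, a minimal sketch might look like this (the numeric values are placeholders; in the exercise, use the W, I, and bias tensors provided in the code editor):

import tensorflow as tf

# Placeholder values; the real W, I, and bias come from the exercise
W = tf.constant([[0.2, -0.5, 0.1],
                 [0.7, 0.3, -0.4]])    # shape (2, 3): one row per neuron
I = tf.constant([[1.0, 2.0, 3.0]])     # shape (1, 3): one sample with 3 inputs
bias = tf.constant([0.1, -0.2])        # one bias value per neuron

# 1. Transpose W so that its columns correspond to neurons: (2, 3) -> (3, 2)
W_t = tf.transpose(W)

# 2. Matrix multiplication: (1, 3) x (3, 2) -> (1, 2)
z = tf.matmul(I, W_t)

# 3. Element-wise bias addition
z = z + bias

# 4. Sigmoid activation gives the layer's output
output = tf.sigmoid(z)
print(output)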

Note

At the end of the course, we'll delve into implementing a complete feed-forward network using TensorFlow. This exercise lays the foundation for that.
