Introduction to Neural Networks

Challenge: Creating a Perceptron

Since our goal is to implement a multilayer perceptron, creating a Perceptron class will simplify model initialization. Its only attribute, layers, is essentially a list of the Layer objects that define the structure of the network:

class Perceptron:
    def __init__(self, layers):
        # Store the Layer objects that make up the network, in order
        self.layers = layers
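
Storing the layers in a list pays off when data is run through the network: a full forward pass is just a loop that feeds each layer's output into the next. Here is a minimal sketch of such a method on Perceptron (not required by this challenge, which only covers the layers themselves; it assumes each Layer exposes the forward() method described in the task below):

def forward(self, inputs):
    # Feed the input through each layer in order; each layer's
    # output becomes the next layer's input
    for layer in self.layers:
        inputs = layer.forward(inputs)
    return inputs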

The variables used to initialize the layers are the following:

  • input_size: the number of input features;
  • hidden_size: the number of neurons in each hidden layer (both hidden layers will have the same number of neurons in this case);
  • output_size: the number of neurons in the output layer.

The structure of the resulting perceptron should be as follows: the input features feed into two hidden layers of hidden_size neurons each, followed by an output layer of output_size neurons.
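
To make this concrete, the network could be assembled like so (a sketch, assuming the Layer constructor takes the input size, the number of neurons, and an activation function, as outlined in the task below):

model = Perceptron([
    Layer(input_size, hidden_size, relu),     # hidden layer 1
    Layer(hidden_size, hidden_size, relu),    # hidden layer 2
    Layer(hidden_size, output_size, sigmoid)  # output layer
])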

Task


Your goal is to set up the basic structure of the perceptron by implementing its layers:

  1. Complete the layer initialization (__init__() method):

    • Initialize the weights matrix (the shape is (n_neurons, n_inputs));
    • Initialize the biases vector (the shape is (n_neurons, 1)).

    Fill them with random values from a uniform distribution in the range [-1, 1). Use the np.random.uniform() function to do this.

  2. Complete the layer's forward propagation (forward() method):

    • Compute the raw output values of the neurons. Use the np.dot() function for the dot product;
    • Apply the activation function to the raw outputs and return the result.

  3. Define three layers:

    • Two hidden layers: each should have hidden_size neurons and use the relu activation function;
    • One output layer: it should use the sigmoid activation function. A reference sketch covering all three steps follows this list.
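
For reference, here is a minimal sketch covering all three steps. The Layer constructor signature and the relu and sigmoid helpers are assumptions, so match the names used in the challenge's starter code:

import numpy as np

def relu(x):
    # Assumed helper: element-wise max(0, x)
    return np.maximum(0, x)

def sigmoid(x):
    # Assumed helper: squashes raw outputs into the range (0, 1)
    return 1 / (1 + np.exp(-x))

class Layer:
    def __init__(self, n_inputs, n_neurons, activation):
        # Step 1: weights and biases drawn uniformly from [-1, 1)
        self.weights = np.random.uniform(-1, 1, (n_neurons, n_inputs))
        self.biases = np.random.uniform(-1, 1, (n_neurons, 1))
        self.activation = activation

    def forward(self, inputs):
        # Step 2: raw outputs are the weighted sums of inputs plus biases
        raw_output = np.dot(self.weights, inputs) + self.biases
        # Apply the activation function and return the result
        return self.activation(raw_output)

# Step 3: two relu hidden layers and a sigmoid output layer
# (example sizes chosen arbitrarily for illustration)
input_size, hidden_size, output_size = 4, 8, 1
hidden_layer_1 = Layer(input_size, hidden_size, relu)
hidden_layer_2 = Layer(hidden_size, hidden_size, relu)
output_layer = Layer(hidden_size, output_size, sigmoid)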
