Single Neuron Implementation | Neural Network from Scratch
Introduction to Neural Networks

Single Neuron Implementation

The fundamental computational unit of a neural network is the neuron. A neuron can be visualized as a small processing unit that takes multiple inputs, processes them, and produces a single output.

Here's what happens step by step:

  1. Each input is multiplied by a corresponding weight. The weights are learnable parameters that determine the importance of each input.
  2. All the weighted inputs are summed together.
  3. In our implementation, an additional parameter called the bias is added to this sum. The bias lets the neuron shift its output up or down, adding flexibility to its modeling capability.
  4. The result is passed through an activation function. We are using the sigmoid function, which squashes values into the range (0, 1).
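The four steps above can be sketched in a few lines of NumPy. The input, weight, and bias values here are made up purely for illustration:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real value into the range (0, 1)
    return 1 / (1 + np.exp(-x))

# Hypothetical values, just for illustration
inputs = np.array([0.5, -1.2, 3.0])
weights = np.array([0.4, 0.7, -0.2])   # learnable parameters
bias = 0.1                             # also learnable

# Steps 1-2: multiply each input by its weight and sum the results
weighted_sum = np.dot(inputs, weights)
# Step 3: add the bias to the weighted sum
total = weighted_sum + bias
# Step 4: pass the total through the sigmoid activation
output = sigmoid(total)
print(output)  # a value strictly between 0 and 1
```

Note that without the bias, the neuron's output would be forced to equal sigmoid(0) = 0.5 whenever all inputs are zero; the bias removes that constraint.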

Note

The bias of the neuron is also a trainable parameter.

Task

Implement the basic structure of a neuron. Complete the missing parts of the neuron class:

  1. Enter the number of inputs of the neuron.
  2. Use the uniform function to generate a random bias for every neuron.
  3. Enter the activation function of the neuron.

Once you've completed this task, click the button below the code to check your solution.
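As a reference point, a completed neuron class along these lines might look as follows. This is only a sketch: the exact class layout, attribute names, and the uniform range expected by the exercise may differ from what is shown here.

```python
import numpy as np

class Neuron:
    def __init__(self, n_inputs):
        # One weight per input, drawn from a uniform distribution
        # (the range [-1, 1] is an assumption for this sketch)
        self.weights = np.random.uniform(-1, 1, n_inputs)
        # A random bias for every neuron, also drawn with uniform
        self.bias = np.random.uniform(-1, 1)

    def activate(self, inputs):
        # Weighted sum of the inputs plus the bias...
        total = np.dot(self.weights, inputs) + self.bias
        # ...passed through the sigmoid activation function
        return 1 / (1 + np.exp(-total))

# Example: a neuron with 3 inputs
neuron = Neuron(3)
print(neuron.activate(np.array([0.5, -1.2, 3.0])))
```

Because sigmoid maps any real number into (0, 1), the output can be read as a soft "firing strength" regardless of the scale of the inputs.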


Section 2. Chapter 1