## Building The Linear Regression Using NumPy

You already know what simple linear regression is and how to find the line that fits the data best. Let's go through all the steps of building a linear regression for a real dataset.

## Loading data

We have a file, `simple_height_data.csv`, with the data from our examples. Let's load the file and take a look at it.
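The loading step could look like the following sketch. Since `simple_height_data.csv` isn't bundled here, the code first writes a tiny stand-in file with the same two columns (the sample values are hypothetical):

```python
import pandas as pd

# Write a tiny stand-in for simple_height_data.csv (hypothetical sample values)
sample = "Height,Father\n66.2,70.0\n63.5,65.5\n69.1,71.0\n"
with open("simple_height_data.csv", "w") as f:
    f.write(sample)

# Load the file and take a look at it
data = pd.read_csv("simple_height_data.csv")
print(data.head())
print(data.columns.tolist())
```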

So the dataset has two columns: 'Height', which is our target, and 'Father', the father's height, which is our feature.

Let's assign our target values to the `y` variable and the feature values to `X`, and build a scatterplot.
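A minimal sketch of this step, using a few hypothetical rows in place of the real file (the non-interactive backend is only there so the script runs anywhere):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical sample rows standing in for simple_height_data.csv
data = pd.DataFrame({"Height": [66.2, 63.5, 69.1, 64.8],
                     "Father": [70.0, 65.5, 71.0, 66.2]})

y = data["Height"]  # target
X = data["Father"]  # feature

plt.scatter(X, y)
plt.xlabel("Father's height")
plt.ylabel("Height")
plt.savefig("scatter.png")
```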

## Finding parameters

Now, NumPy has a nice function for finding the parameters of a linear regression: `np.polyfit()`.

Linear regression is a polynomial regression of degree 1 (we will talk about polynomial regression in later sections). That's why we need to pass `deg=1` to get the parameters for the linear regression.

Here is an example:
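A sketch of the fitting step, run on hypothetical noise-free data with a known slope and intercept so the recovered parameters are easy to check:

```python
import numpy as np

# Hypothetical data: y depends linearly on X (slope 0.5, intercept 40)
X = np.array([65.0, 67.0, 70.0, 72.0, 74.0])
y = 0.5 * X + 40.0

# deg=1 fits a first-degree polynomial, i.e., a straight line;
# polyfit returns coefficients from the highest degree down: [slope, intercept]
beta_1, beta_0 = np.polyfit(X, y, deg=1)
print(beta_1, beta_0)  # slope ~ 0.5, intercept ~ 40.0
```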

Note

If you are unfamiliar with the syntax `beta_1, beta_0 = np.polyfit(X, y, 1)`, it is called unpacking. If you have an iterable (e.g., a list, NumPy array, or pandas Series) with two items, writing `a, b = items` is the same as `a = items[0]` followed by `b = items[1]`. And since `polyfit()` returns a NumPy array with two values, we are allowed to do that.

## Making the predictions

Now that we have the parameters, we can plot the line and use the linear regression equation to predict new values.
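The prediction step can be sketched like this, reusing the same hypothetical data (slope 0.5, intercept 40) so the predictions are easy to verify by hand:

```python
import numpy as np

# Fit on hypothetical data: y = 0.5 * X + 40
X = np.array([65.0, 67.0, 70.0, 72.0, 74.0])
y = 0.5 * X + 40.0
beta_1, beta_0 = np.polyfit(X, y, deg=1)

# Predict with the linear regression equation: y_hat = beta_0 + beta_1 * x
X_new = np.array([66.0, 71.0])
y_pred = beta_0 + beta_1 * X_new
print(y_pred)  # [73.  75.5]
```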

So it is pretty easy to get the parameters of the linear regression. But some libraries can also give you some extra information. Let's look at one such library.
