Challenge: Fitting a Line with Gradient Descent | Mathematical Analysis
Mathematics for Data Science

Challenge: Fitting a Line with Gradient Descent

Task


A student wants to use gradient descent to fit a straight line to a dataset showing years of experience versus salary (in thousands). The goal is to find the best-fitting line by adjusting the slope (m) and intercept (b) using iterative updates.

You need to minimize the loss function:

J(m, b) = \frac{1}{n}\sum^n_{i=1}(y_i - (mx_i + b))^2

The gradient descent update rules are:

m \leftarrow m - \alpha \frac{\partial J}{\partial m}

b \leftarrow b - \alpha \frac{\partial J}{\partial b}

Where:

  • \alpha is the learning rate (step size);
  • \frac{\partial J}{\partial m} is the partial derivative of the loss function with respect to m;
  • \frac{\partial J}{\partial b} is the partial derivative of the loss function with respect to b.
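For this mean squared error loss, applying the chain rule gives the two partial derivatives used in the update rules (a standard derivation, stated here for reference):

\frac{\partial J}{\partial m} = -\frac{2}{n}\sum^n_{i=1} x_i(y_i - (mx_i + b))

\frac{\partial J}{\partial b} = -\frac{2}{n}\sum^n_{i=1}(y_i - (mx_i + b))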

Your task:

  1. Complete the Python code below to implement the gradient descent steps.
  2. Fill in missing expressions using basic Python operations.
  3. Track how m and b change as the algorithm runs.
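The challenge's starter code is not shown on this page, so the loop below is only a minimal sketch of the full procedure. The dataset values, learning rate, and iteration count are illustrative assumptions, not the challenge's actual data:

```python
# Hypothetical example data: years of experience vs. salary (in thousands).
x = [1, 2, 3, 4, 5]
y = [40, 50, 58, 66, 75]

n = len(x)
m, b = 0.0, 0.0   # initial slope and intercept
alpha = 0.01      # learning rate (step size), chosen for illustration

for _ in range(1000):
    # Residuals for the current line: y_i - (m*x_i + b)
    errors = [yi - (m * xi + b) for xi, yi in zip(x, y)]

    # Partial derivatives of J = (1/n) * sum(errors^2)
    dJ_dm = (-2 / n) * sum(e * xi for e, xi in zip(errors, x))
    dJ_db = (-2 / n) * sum(errors)

    # Gradient descent updates
    m -= alpha * dJ_dm
    b -= alpha * dJ_db

print(f"m = {m:.3f}, b = {b:.3f}")
```

With each iteration, m and b step in the direction that reduces the loss; printing them inside the loop is one way to track how they change as the algorithm runs.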

Solution


Section 3. Chapter 11
