Challenge: Fitting a Line with Gradient Descent
A student wants to use gradient descent to fit a straight line to a dataset showing years of experience versus salary (in thousands). The goal is to find the best-fitting line by adjusting the slope (m) and intercept (b) using iterative updates.
You need to minimize the loss function:
$$J(m, b) = \frac{1}{n} \sum_{i=1}^{n} \left(y_i - (m x_i + b)\right)^2$$

The gradient descent update rules are:

$$m \leftarrow m - \alpha \frac{\partial J}{\partial m}, \qquad b \leftarrow b - \alpha \frac{\partial J}{\partial b}$$

Where:
- α is the learning rate (step size);
- ∂J/∂m is the partial derivative of the loss function with respect to m;
- ∂J/∂b is the partial derivative of the loss function with respect to b.
Your task:
- Complete the Python code below to implement the gradient descent steps.
- Fill in missing expressions using basic Python operations.
- Track how m and b change as the algorithm runs.
Solution
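The original code cells are not reproduced here, so below is a minimal sketch of how the gradient descent steps might look. The dataset values, variable names, and hyperparameters (learning rate, number of iterations) are assumptions for illustration:

```python
import numpy as np

# Hypothetical dataset: years of experience vs. salary (in thousands)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([40.0, 50.0, 65.0, 70.0, 85.0])

m, b = 0.0, 0.0   # initial slope and intercept
alpha = 0.01      # learning rate (step size)
n = len(x)

for epoch in range(1000):
    y_pred = m * x + b          # current line's predictions
    error = y - y_pred          # residuals y_i - (m*x_i + b)
    # Partial derivatives of J = (1/n) * sum((y - (m*x + b))**2)
    dJ_dm = (-2 / n) * np.sum(x * error)
    dJ_db = (-2 / n) * np.sum(error)
    # Gradient descent updates
    m = m - alpha * dJ_dm
    b = b - alpha * dJ_db

print(f"m = {m:.3f}, b = {b:.3f}")
```

Printing m and b every few hundred iterations is an easy way to watch both parameters settle toward the least-squares fit as the loss shrinks.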