Challenge: Compare Convergence Speed
Task
You will simulate gradient descent on a simple linear regression problem to compare how feature scaling affects convergence speed.
Steps:
- Generate synthetic data X (one feature) and y using the relation y = 3 * X + noise.
- Implement a simple gradient descent function that minimizes the MSE loss:

  import numpy as np

  def gradient_descent(X, y, lr, steps):
      w = 0.0
      history = []
      for _ in range(steps):
          # Gradient of the MSE loss mean((y - w * X)**2) with respect to w
          grad = -2 * np.mean(X * (y - w * X))
          w -= lr * grad
          history.append(w)
      return np.array(history)

- Run gradient descent twice:
  - on the original X,
  - and on the standardized X_scaled = (X - mean) / std.
- Plot or print the loss decrease for both runs to see that scaling accelerates convergence.
- Compute and print the final weights and losses for both cases.
Solution
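Below is a minimal sketch of one way to assemble the steps, assuming NumPy only and printed loss checkpoints instead of a plot. The random seed, feature scale, noise level, learning rate, and step count are illustrative assumptions rather than values fixed by the task; with a fixed learning rate, how much standardization speeds things up depends on how far the original feature scale is from 1, so a small-scale feature is used here to make the difference visible.

  import numpy as np

  def gradient_descent(X, y, lr, steps):
      # Plain gradient descent on MSE for the single-weight model y_hat = w * X
      w = 0.0
      history = []
      for _ in range(steps):
          grad = -2 * np.mean(X * (y - w * X))  # d/dw of mean((y - w * X)**2)
          w -= lr * grad
          history.append(w)
      return np.array(history)

  def mse(X, y, w):
      return np.mean((y - w * X) ** 2)

  # Synthetic data: one small-scale feature, y = 3 * X + noise (illustrative choices)
  rng = np.random.default_rng(42)
  X = rng.normal(0.0, 0.3, size=200)
  y = 3 * X + rng.normal(0.0, 0.1, size=200)

  # Standardized copy of the feature
  X_scaled = (X - X.mean()) / X.std()

  lr, steps = 0.3, 100
  hist_raw = gradient_descent(X, y, lr, steps)
  hist_scaled = gradient_descent(X_scaled, y, lr, steps)

  # Loss curves for both runs; print a few checkpoints instead of plotting
  loss_raw = np.array([mse(X, y, w) for w in hist_raw])
  loss_scaled = np.array([mse(X_scaled, y, w) for w in hist_scaled])
  for step in (1, 10, 50, steps):
      print(f"step {step:>3}: raw loss = {loss_raw[step - 1]:.4f}, "
            f"scaled loss = {loss_scaled[step - 1]:.4f}")

  print("final weight (raw):   ", hist_raw[-1], " final loss:", loss_raw[-1])
  print("final weight (scaled):", hist_scaled[-1], " final loss:", loss_scaled[-1])

With these particular choices the run on X_scaled settles near its final weight within a few dozen steps, while the run on the original X, using the same learning rate, closes the gap much more slowly; that step-by-step comparison of the loss curves is the convergence-speed effect the task asks you to observe.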