Regression Space

Linear Regression

Linear regression fits a straight line y = mx + b to data by minimizing the sum of squared residuals. It is the foundation of many ML algorithms.

How to Use

  • Click canvas to add data points
  • Choose a dataset preset to explore patterns
  • Press Fit to compute the best-fit line
  • Switch to Gradient Descent to animate training
  • Toggle residuals to see error distances

Normal Equation

The closed-form solution computes optimal parameters directly:

w = (XᵀX)⁻¹Xᵀy

It solves for the weights in a single step, with no iteration, provided XᵀX is invertible.
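A minimal NumPy sketch of the closed-form solution (the toy data and variable names are illustrative):

```python
import numpy as np

# Toy data: y = 2x + 1 exactly, so the fit should recover m = 2, b = 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * x + 1

# Design matrix with a bias column: each row is [x_i, 1]
X = np.column_stack([x, np.ones_like(x)])

# Normal equation: solve (XᵀX) w = Xᵀy instead of forming the inverse
w = np.linalg.solve(X.T @ X, X.T @ y)
m, b = w
```

Solving the linear system with `np.linalg.solve` is numerically preferable to computing `(XᵀX)⁻¹` explicitly.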

Gradient Descent

Iteratively updates parameters by following the gradient:

w ← w − η · ∂L/∂w

where η is the learning rate and L is the MSE loss.

  1. Initialize weights randomly
  2. Compute predictions ŷ = Xw
  3. Compute loss L = (1/n)Σ(y − ŷ)²
  4. Compute gradient ∂L/∂w
  5. Update weights w ← w − η·∂L/∂w
  6. Repeat until convergence
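The loop above can be sketched in NumPy; the data, learning rate, and fixed iteration count are illustrative assumptions (the weights start at zero here rather than randomly, for reproducibility):

```python
import numpy as np

# Toy data: y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * x + 1

m, b = 0.0, 0.0   # step 1: initialize (zeros instead of random, for determinism)
eta = 0.05        # learning rate η (assumed value)
n = len(x)

for _ in range(2000):                      # step 6: repeat for a fixed budget
    y_hat = m * x + b                      # step 2: predictions ŷ = mx + b
    err = y - y_hat                        # step 3 uses these residuals
    grad_m = -(2 / n) * np.sum(x * err)    # step 4: ∂L/∂m
    grad_b = -(2 / n) * np.sum(err)        #         ∂L/∂b
    m -= eta * grad_m                      # step 5: w ← w − η·∂L/∂w
    b -= eta * grad_b
```

A production loop would stop on a convergence criterion (e.g. gradient norm below a threshold) rather than a fixed iteration count.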

Mean Squared Error

MSE = (1/n) Σᵢ (yᵢ − ŷᵢ)²

Average squared difference between actual and predicted values.
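In NumPy this is a one-liner (values below are made-up examples):

```python
import numpy as np

y     = np.array([1.0, 3.0, 5.0])   # actual values
y_hat = np.array([1.5, 2.5, 5.0])   # predicted values

# MSE = (1/n) Σ (yᵢ − ŷᵢ)² = (0.25 + 0.25 + 0) / 3
mse = np.mean((y - y_hat) ** 2)
```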

R² Score

R² = 1 − SS_res / SS_tot

Proportion of variance explained by the model; R² = 1 indicates a perfect fit.
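A direct translation of the formula (sample values are illustrative):

```python
import numpy as np

y     = np.array([1.0, 3.0, 5.0, 7.0])   # actual values
y_hat = np.array([1.2, 2.9, 5.1, 6.8])   # predicted values

ss_res = np.sum((y - y_hat) ** 2)         # residual sum of squares
ss_tot = np.sum((y - np.mean(y)) ** 2)    # total sum of squares (vs. mean)
r2 = 1 - ss_res / ss_tot
```

A model that always predicts the mean of y scores R² = 0; worse-than-mean models can score negative.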

Gradient

∂MSE/∂m = −(2/n) Σᵢ xᵢ(yᵢ − ŷᵢ)

Slope gradient — direction to update m.

∂MSE/∂b = −(2/n) Σᵢ (yᵢ − ŷᵢ)

Intercept gradient — direction to update b.
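As a sanity check, the two analytic gradients above can be compared against central finite differences of the loss; the data, the point (m, b), and the step size h are assumed for illustration:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.0])
m, b = 0.5, -0.5   # arbitrary evaluation point
n = len(x)

def mse(m, b):
    return np.mean((y - (m * x + b)) ** 2)

# Analytic gradients from the formulas above
err = y - (m * x + b)
grad_m = -(2 / n) * np.sum(x * err)
grad_b = -(2 / n) * np.sum(err)

# Numerical gradients via central differences (h is an assumed step size)
h = 1e-6
num_m = (mse(m + h, b) - mse(m - h, b)) / (2 * h)
num_b = (mse(m, b + h) - mse(m, b - h)) / (2 * h)
```

The analytic and numerical values should agree to several decimal places; a mismatch usually means a sign or factor error in the derived gradient.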

Fit Metrics

The metrics panel updates live as you fit:

  • Points: number of data points on the canvas
  • Slope (m) and Intercept (b): current line parameters
  • MSE: mean squared error of the current line
  • R²: proportion of variance explained
  • Iteration: gradient descent step count
  • Status: current solver state (starts as "Add points to begin")