Linear Regression
Linear regression fits a straight line y = mx + b to data by minimizing
the sum of squared residuals. It is a foundation of many ML algorithms.
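The least-squares fit described above has a well-known closed form for a single feature: m = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² and b = ȳ − m·x̄. A minimal sketch in Python (not part of the demo; the function name `fit_line` is illustrative):

```python
def fit_line(xs, ys):
    # Least-squares fit of y = m*x + b:
    # m = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2),  b = y_mean - m*x_mean
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    sxx = sum((x - x_mean) ** 2 for x in xs)
    m = sxy / sxx
    b = y_mean - m * x_mean
    return m, b

# Points lying exactly on y = 2x + 5 recover m = 2, b = 5.
m, b = fit_line([0, 1, 2, 3], [5.0, 7.0, 9.0, 11.0])
```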
How to Use
- Click canvas to add data points
- Choose a dataset preset to explore patterns
- Press Fit to compute the best-fit line
- Switch to Gradient Descent to animate training
Normal Equation
The closed-form solution computes optimal parameters directly:
w = (XᵀX)⁻¹Xᵀy
It solves for w in a single step; no iteration is needed.
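For a line, the design matrix X has columns [1, x], so (XᵀX)w = Xᵀy is a 2×2 system that can be solved directly. A sketch of the normal equation in pure Python (`normal_equation_fit` is an illustrative name; it returns w = [b, m] unpacked as (m, b)):

```python
def normal_equation_fit(xs, ys):
    # X^T X = [[n, Sx], [Sx, Sxx]] and X^T y = [Sy, Sxy] for X with columns [1, x].
    n = len(xs)
    Sx = sum(xs)
    Sxx = sum(x * x for x in xs)
    Sy = sum(ys)
    Sxy = sum(x * y for x, y in zip(xs, ys))
    # Solve the 2x2 system (X^T X) w = X^T y by Cramer's rule.
    det = n * Sxx - Sx * Sx
    b = (Sy * Sxx - Sx * Sxy) / det
    m = (n * Sxy - Sx * Sy) / det
    return m, b
```

This is the same optimum the centered-sums formula gives; the matrix form is what generalizes to multiple features.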
Gradient Descent
Iteratively updates parameters by following the gradient:
w ← w − η · ∂L/∂w
where η is the learning rate and L is the MSE loss.
- Initialize weights randomly
- Compute predictions ŷ = Xw
- Compute loss L = (1/n)Σ(y − ŷ)²
- Compute gradient ∂L/∂w
- Update weights w ← w − η·∂L/∂w
- Repeat until convergence
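The steps above can be sketched as a plain Python training loop (a minimal illustration, not the demo's implementation; weights start at zero here for reproducibility rather than randomly, and the learning rate and step count are arbitrary choices):

```python
def gradient_descent_fit(xs, ys, lr=0.05, steps=2000):
    # Minimize L = (1/n) * sum((y - (m*x + b))^2) by repeated gradient updates.
    m, b = 0.0, 0.0  # zeros instead of random init, for reproducibility
    n = len(xs)
    for _ in range(steps):
        preds = [m * x + b for x in xs]           # predictions yhat = Xw
        # dL/dm = (-2/n) * sum(x * (y - yhat)),  dL/db = (-2/n) * sum(y - yhat)
        grad_m = (-2 / n) * sum(x * (y - p) for x, y, p in zip(xs, ys, preds))
        grad_b = (-2 / n) * sum(y - p for y, p in zip(ys, preds))
        m -= lr * grad_m                          # w <- w - eta * dL/dw
        b -= lr * grad_b
    return m, b
```

With a small learning rate this converges to the same line as the closed-form solution, only gradually, which is what the Gradient Descent animation visualizes.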
Live Calculation
Fit a line to see the calculation breakdown.