Classification Space
Controls
Step: 0 / 0
Perceptron
The perceptron is a linear classifier that learns a separating line by adjusting its weights after each mistake. On linearly separable data it converges in a finite number of updates; on non-separable data the boundary keeps oscillating and never settles.
How to Use
- Press Play to train step-by-step
- Adjust learning rate to change update size
- Switch datasets to see convergence vs. failure
- Toggle margin/vector for geometric intuition
Training Loop
- Compute activation: a = w·x + b
- Predict: ŷ = sign(a)
- If wrong, update: w ← w + η(y - ŷ)x
- Redraw the boundary
On non-separable data, mistakes never reach zero.
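The loop above can be written out as a short, self-contained sketch. This is not the demo's actual implementation: labels are assumed to be ±1, the bias is given the same mistake-driven correction as the weights, and the dataset, learning rate, and function names are illustrative only.

```python
def sign(a):
    return 1.0 if a >= 0 else -1.0

def train_perceptron(points, labels, lr=0.1, max_epochs=100):
    w = [0.0, 0.0]   # weight vector
    b = 0.0          # bias
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in zip(points, labels):
            a = w[0] * x[0] + w[1] * x[1] + b    # activation: a = w·x + b
            y_hat = sign(a)                      # prediction: ŷ = sign(a)
            if y_hat != y:                       # update only on a mistake
                w[0] += lr * (y - y_hat) * x[0]  # w ← w + η(y - ŷ)x
                w[1] += lr * (y - y_hat) * x[1]
                b += lr * (y - y_hat)            # bias gets the same correction
                mistakes += 1
        if mistakes == 0:                        # no mistakes this epoch: converged
            break
    return w, b

# Linearly separable toy data converges; on non-separable data,
# mistakes never reach zero and the loop runs until max_epochs.
pts = [(1.0, 2.0), (2.0, 1.5), (-1.0, -1.0), (-2.0, -0.5)]
lbls = [1.0, 1.0, -1.0, -1.0]
print(train_perceptron(pts, lbls))
```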
Current Step
- Activation: compute w·x + b
- Prediction: classify the sample
- Update: adjust weights if needed
- Redraw: update boundary and metrics
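One way to picture a single step is a function that runs these four stages on one sample and hands the result back for redrawing. The `PerceptronState` and `step` names below are hypothetical, not this page's real API.

```python
from dataclasses import dataclass

@dataclass
class PerceptronState:
    w: list     # current weights [w1, w2]
    b: float    # current bias
    lr: float   # learning rate η

def step(state, x, y):
    a = state.w[0] * x[0] + state.w[1] * x[1] + state.b  # 1. activation
    y_hat = 1.0 if a >= 0 else -1.0                      # 2. prediction
    updated = (y_hat != y)
    if updated:                                          # 3. update if wrong
        state.w[0] += state.lr * (y - y_hat) * x[0]
        state.w[1] += state.lr * (y - y_hat) * x[1]
        state.b += state.lr * (y - y_hat)
    return a, y_hat, updated                             # 4. caller redraws

state = PerceptronState(w=[0.0, 0.0], b=0.0, lr=0.1)
print(step(state, (1.0, 2.0), 1.0))
```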
Decision Boundary
The boundary is the line where w·x + b = 0. The weight vector w is normal to this line, and a sample's distance from it is |w·x + b| / ||w||, so for a fixed activation the margin scales with 1 / ||w||.
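As a rough sketch of that geometry, the boundary can be recovered for plotting by solving w·x + b = 0 for x₂ at two values of x₁, and the distance formula above gives each sample's margin. The values of w and b below are made up for illustration.

```python
import math

def boundary_points(w, b, x1_range=(-3.0, 3.0)):
    # Solve w1*x1 + w2*x2 + b = 0 for x2 at both ends of the x1 range
    # (assumes w2 != 0; a vertical boundary needs the other form).
    lo, hi = x1_range
    return [(x1, -(w[0] * x1 + b) / w[1]) for x1 in (lo, hi)]

def signed_distance(w, b, x):
    # (w·x + b) / ||w|| -- signed distance of one sample from the boundary
    return (w[0] * x[0] + w[1] * x[1] + b) / math.hypot(w[0], w[1])

w, b = [0.4, 0.4], -0.2
print(boundary_points(w, b))
print(signed_distance(w, b, (1.0, 2.0)))
```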
Metrics
| Metric | Value |
| --- | --- |
| Epoch | 0 |
| Step | 0 |
| Mistakes | 0 |
| Accuracy | 0% |
| Weights | [-, -] |
| Bias | 0.00 |
| Status | Ready |
Sample: -
Activation: -
Prediction: -
Target: -