Classification Space

Perceptron

The perceptron is a linear classifier that learns a separating line by adjusting its weights after each mistake. On linearly separable data it is guaranteed to converge in a finite number of updates; on non-separable data the weights keep oscillating and never settle.
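
To make the separable vs. non-separable cases concrete, here is a minimal sketch that builds two toy 2-D datasets: one cleanly split by a line, and one with a fraction of labels flipped so that no line can separate it. The data layout, noise rate, and names are illustrative assumptions, not the demo's actual datasets.

    # A minimal sketch (assumed toy data, not the demo's datasets).
    import numpy as np

    rng = np.random.default_rng(0)

    def make_dataset(n=100, flip_fraction=0.0):
        # Points in [-1, 1]^2, labeled +1/-1 by the true line x1 + x2 - 0.2 = 0.
        X = rng.uniform(-1.0, 1.0, size=(n, 2))
        y = np.where(X[:, 0] + X[:, 1] - 0.2 > 0, 1, -1)
        # Flipping a fraction of labels makes the set non-separable.
        flip = rng.random(n) < flip_fraction
        y[flip] *= -1
        return X, y

    X_sep, y_sep = make_dataset()                        # separable: training converges
    X_noisy, y_noisy = make_dataset(flip_fraction=0.1)   # non-separable: mistakes persist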

How to Use

  • Press Play to train step-by-step
  • Adjust learning rate to change update size
  • Switch datasets to see convergence vs. failure
  • Toggle the margin and weight-vector overlays for geometric intuition

Training Loop

  1. Compute activation a = w·x + b
  2. Predict ŷ = sign(a)
  3. If wrong, update w ← w + η(y - ŷ)x and b ← b + η(y - ŷ)
  4. Redraw the boundary

On non-separable data, mistakes never reach zero.
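
A minimal sketch of this loop in Python, assuming ±1 labels and a bias update alongside the weights; the names (train_epoch, eta) are illustrative, not the demo's code.

    # One epoch of the perceptron update rule described above (±1 labels).
    import numpy as np

    def train_epoch(X, y, w, b, eta=0.2):
        mistakes = 0
        for x_i, y_i in zip(X, y):
            a = np.dot(w, x_i) + b               # 1. activation
            y_hat = 1 if a >= 0 else -1          # 2. prediction
            if y_hat != y_i:                     # 3. update only on a mistake
                w = w + eta * (y_i - y_hat) * x_i
                b = b + eta * (y_i - y_hat)
                mistakes += 1
        return w, b, mistakes                    # 4. caller redraws boundary and metrics

    # Tiny separable example: points above/below the line x1 + x2 = 0.
    X = np.array([[1.0, 1.0], [0.5, 1.5], [-1.0, -0.5], [-1.5, -1.0]])
    y = np.array([1, 1, -1, -1])

    w, b = np.zeros(2), 0.0
    for epoch in range(100):                     # stop when an epoch has no mistakes
        w, b, mistakes = train_epoch(X, y, w, b)
        if mistakes == 0:
            break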

Current Step

  1. Activation: compute w·x + b
  2. Prediction: classify the sample
  3. Update: adjust weights if needed
  4. Redraw: update boundary and metrics
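
The same four phases can also be packaged as a single-sample step that returns the intermediate values the readout displays. This is only a sketch; the function name and returned fields are assumptions, not the demo's internals.

    # One sample through the four phases, reporting what happened (±1 labels).
    import numpy as np

    def perceptron_step(x, target, w, b, eta=0.2):
        activation = float(np.dot(w, x) + b)        # 1. activation
        prediction = 1 if activation >= 0 else -1   # 2. prediction
        updated = prediction != target
        if updated:                                 # 3. update on a mistake
            w = w + eta * (target - prediction) * x
            b = b + eta * (target - prediction)
        # 4. caller redraws the boundary and refreshes the metrics readout
        return w, b, {"activation": activation, "prediction": prediction,
                      "target": target, "updated": updated}

    w, b, info = perceptron_step(np.array([1.0, -0.5]), target=1, w=np.zeros(2), b=0.0)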

Decision Boundary

The boundary is the line where w·x + b = 0. The weight vector w is normal to the line, and the distance from a point x to the boundary is |w·x + b| / ||w||, so for a fixed activation magnitude the margin scales as 1 / ||w||.
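
In code, the normal direction and a point's distance to the boundary fall out directly from w and b. A short sketch, using made-up example weights rather than values from the demo:

    # Geometry of the boundary w·x + b = 0 in 2-D.
    import numpy as np

    def unit_normal(w):
        return w / np.linalg.norm(w)                 # perpendicular to the boundary

    def signed_distance(x, w, b):
        # Positive on the +1 side, negative on the -1 side; |value| is x's margin.
        return (np.dot(w, x) + b) / np.linalg.norm(w)

    w, b = np.array([2.0, -1.0]), 0.5                # example weights, not from the demo
    print(unit_normal(w))                            # normal to 2*x1 - x2 + 0.5 = 0
    print(signed_distance(np.array([1.0, 1.0]), w, b))  # ≈ 0.67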

Metrics

Live readout: epoch, step, mistake count, accuracy, weights, bias, and training status, plus the current sample's activation, prediction, and target.