
Perceptron

The perceptron is a linear classifier that learns a separating line by adjusting its weights after each mistake. On linearly separable data it converges after a finite number of mistakes (the perceptron convergence theorem); on non-separable data it never settles, and the weights keep oscillating indefinitely.

How to Use

  • Press Play to train step-by-step
  • Adjust learning rate to change update size
  • Switch datasets to see convergence vs. failure
  • Toggle margin/vector for geometric intuition

Training Loop

  1. Compute activation a = w·x + b
  2. Predict ŷ = sign(a)
  3. If wrong, update w ← w + η(y - ŷ)x and b ← b + η(y - ŷ)
  4. Redraw the boundary

On non-separable data, mistakes never reach zero.
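The training loop above can be sketched in a few lines of Python. This is a minimal illustrative implementation, not the demo's actual source; the toy dataset and hyperparameters are invented for the example, and labels are assumed to be ±1.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, max_epochs=100):
    """Train a perceptron on labels in {+1, -1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for epoch in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            a = np.dot(w, xi) + b            # 1. activation
            pred = 1 if a >= 0 else -1       # 2. prediction
            if pred != yi:                   # 3. update only on a mistake
                w += lr * (yi - pred) * xi
                b += lr * (yi - pred)
                mistakes += 1
        if mistakes == 0:                    # a full clean pass: converged
            break
    return w, b

# Toy separable data: class +1 lies above the line x1 = x0.
X = np.array([[2.0, 3.0], [1.0, 4.0], [3.0, 1.0], [4.0, 2.0]])
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
```

Because the data above is separable, the mistake count reaches zero and the loop exits early; swapping in overlapping classes would keep `mistakes` positive on every epoch, which is exactly the failure mode the demo shows.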

Current Step

  1. Activation: compute w·x + b
  2. Prediction: classify the sample
  3. Update: adjust weights if needed
  4. Redraw: update boundary and metrics
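One full step can be traced with concrete numbers. The weights, sample, and learning rate below are hypothetical values chosen for illustration, not taken from the demo's state:

```python
import numpy as np

# Illustrative state for a single training step.
w, b, lr = np.array([0.5, -0.5]), 0.0, 0.1
x, y = np.array([1.0, 2.0]), 1          # true label +1

a = np.dot(w, x) + b                    # activation: 0.5 - 1.0 = -0.5
pred = 1 if a >= 0 else -1              # prediction: -1, a mistake
if pred != y:
    w = w + lr * (y - pred) * x         # w becomes [0.7, -0.1]
    b = b + lr * (y - pred)             # b becomes 0.2
```

After the update, the activation on the same sample is 0.7 - 0.2 + 0.2 = 0.7 > 0, so the boundary has moved far enough to classify this point correctly.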

Decision Boundary

The boundary is the line where w·x + b = 0. The weight vector w is normal to that line. The distance from a point x to the boundary is |w·x + b| / ||w||, so for a fixed activation magnitude the margin scales as 1 / ||w||.
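The geometry above reduces to one formula. A small sketch, with made-up weights chosen so the arithmetic is easy to check by hand:

```python
import numpy as np

def signed_distance(w, b, x):
    """Signed perpendicular distance from x to the boundary w.x + b = 0."""
    return (np.dot(w, x) + b) / np.linalg.norm(w)

w = np.array([3.0, 4.0])   # ||w|| = 5
b = -5.0
x = np.array([1.0, 2.0])   # w.x + b = 3 + 8 - 5 = 6
print(signed_distance(w, b, x))  # 6 / 5 = 1.2
```

The sign tells you which side of the boundary the point is on, matching the perceptron's prediction sign(w·x + b).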
