Classification Space

[Interactive canvas: click to add data points; sliders set the training parameters]

Support Vector Machines

SVMs find the hyperplane that maximizes the margin between classes. The points that lie on the margin boundaries (or violate them, in the soft-margin case) are the support vectors; they alone determine the decision boundary.
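
The same behavior is easy to reproduce outside the demo. A minimal sketch using scikit-learn (an assumption; the demo itself runs in the browser):

  import numpy as np
  from sklearn.svm import SVC

  # Two linearly separable classes in the plane.
  X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 4.0]])
  y = np.array([0, 0, 1, 1])

  clf = SVC(kernel="linear", C=1.0).fit(X, y)

  print(clf.support_vectors_)       # the points that pin down the boundary
  print(clf.predict([[2.0, 2.5]]))  # classify a new point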

How to Use

  • Click canvas to add points (select class first)
  • Choose a dataset to explore different patterns
  • Press Train to fit the SVM
  • Adjust C to control soft margin trade-off
  • Switch kernels for non-linear boundaries

SVM Optimization

SVM solves a constrained optimization problem to find the maximum-margin hyperplane. The steps below outline the procedure; a code sketch follows the list.

  1. Map data to feature space (via kernel)
  2. Find hyperplane maximizing margin
  3. Identify support vectors (on margin boundary)
  4. Classify new points by which side they fall on
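
A hedged sketch of steps 3 and 4, reusing clf, X, and y from the sketch above (those names are assumptions, not the demo's own code):

  # Step 3: the support vectors are the training points that end up with
  # nonzero dual weights; SVC exposes their indices directly.
  print(clf.support_)            # indices into X
  print(clf.support_vectors_)    # their coordinates

  # Step 4: with a linear kernel, the side of the hyperplane is the
  # sign of w.x + b.
  w, b = clf.coef_[0], clf.intercept_[0]
  X_new = np.array([[2.0, 2.5], [0.5, 0.2]])
  print(np.sign(X_new @ w + b))        # which side each point falls on
  print(clf.decision_function(X_new))  # the same quantity via the API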

Kernel Trick

Kernels compute inner products in a high-dimensional feature space without ever constructing the transformation explicitly; the sketch after this list spells out all three.

  • Linear: K(x,y) = x·y
  • RBF: K(x,y) = exp(−γ||x−y||²)
  • Polynomial: K(x,y) = (x·y + 1)ᵈ
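
Each kernel above is a one-liner in NumPy. In this sketch, gamma and degree stand for the γ and d of the formulas, and the +1 is the polynomial's fixed offset:

  import numpy as np

  def k_linear(x, y):
      return x @ y

  def k_rbf(x, y, gamma=1.0):
      return np.exp(-gamma * np.sum((x - y) ** 2))

  def k_poly(x, y, degree=3):
      return (x @ y + 1) ** degree

  x, z = np.array([1.0, 2.0]), np.array([0.5, -1.0])
  print(k_linear(x, z), k_rbf(x, z), k_poly(x, z))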

Primal Formulation

min ½||w||² + C Σᵢ ξᵢ

Minimize weight norm (maximize margin) with slack variables ξᵢ for soft margin.

Constraints

yᵢ(w·xᵢ + b) ≥ 1 − ξᵢ,   ξᵢ ≥ 0

Each point must lie on the correct side of the margin, up to its slack ξᵢ; the slacks themselves must be non-negative.
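
Substituting the smallest feasible slack, ξᵢ = max(0, 1 − yᵢ(w·xᵢ + b)), turns the constrained primal into an equivalent unconstrained hinge-loss objective:

  min ½||w||² + C Σᵢ max(0, 1 − yᵢ(w·xᵢ + b))

As a sketch, the slacks and the primal objective can be recovered from the fitted linear model above (clf, w, b, X, y are the names assumed in the earlier sketches):

  y_pm = 2 * y - 1                          # map 0/1 labels to -1/+1
  margins = y_pm * (X @ w + b)              # y_i (w.x_i + b)
  slack = np.maximum(0.0, 1.0 - margins)    # xi_i is zero for well-placed points
  print(0.5 * w @ w + clf.C * slack.sum())  # primal objective value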

C Parameter

Controls the trade-off between margin width and misclassification:

  • Large C: Narrow margin, fewer errors (overfit risk)
  • Small C: Wide margin, more errors (underfit risk)
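
One way to see the trade-off, as a sketch on the toy data from above (the exact numbers depend on the data):

  for C in (0.01, 1.0, 100.0):
      m = SVC(kernel="linear", C=C).fit(X, y)
      print(C, len(m.support_), 2.0 / np.linalg.norm(m.coef_[0]))
  # Smaller C generally yields a wider margin and more support vectors;
  # larger C narrows the margin to reduce training errors.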

Margin Width

margin = 2 / ||w||

Distance between the two margin boundaries.
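
This follows because the two margin boundaries are the hyperplanes w·x + b = +1 and w·x + b = −1, and the distance between parallel hyperplanes w·x + b = c₁ and w·x + b = c₂ is |c₁ − c₂| / ||w||, which gives 2 / ||w|| here.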

SVM Metrics

  • Points: 0
  • Support Vectors: -
  • Margin Width: -
  • Accuracy: -
  • Kernel: Linear
  • Status: Add points to begin