
Support Vector Machines

Support Vector Machines (SVMs) find the optimal boundary (hyperplane) that separates classes with the maximum margin. The training points closest to the boundary are called support vectors.

By maximizing the margin between classes, an SVM is more likely to generalize well to new data.

Linear SVM

When data is linearly separable, SVM finds a straight line (2D) or plane (3D) that separates classes with the widest possible margin.
from sklearn.svm import SVC

# Linear kernel; C=1.0 is the default regularization strength.
svm = SVC(kernel='linear', C=1.0)
svm.fit(X_train, y_train)  # X_train, y_train: training features and labels
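The snippet above assumes `X_train` and `y_train` already exist. As a self-contained sketch, here is a linear SVC trained on a toy linearly separable dataset (the dataset and split are illustrative choices, not part of the original example):

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two well-separated clusters -> a linearly separable problem.
X, y = make_blobs(n_samples=100, centers=2, cluster_std=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

svm = SVC(kernel='linear', C=1.0)
svm.fit(X_train, y_train)

# The support vectors are the training points closest to the hyperplane.
print(svm.support_vectors_.shape)   # (n_support_vectors, 2 features)
print(svm.score(X_test, y_test))
```

`support_vectors_` lets you inspect exactly which training points define the margin.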

Non‑linear SVM (Kernel Trick)

When data is not linearly separable, SVM uses kernels to map data into higher dimensions where separation becomes possible. Common kernels: polynomial, RBF (radial basis function).
# RBF kernel; gamma='scale' sets gamma from the data (1 / (n_features * X.var())).
svm = SVC(kernel='rbf', gamma='scale', C=1.0)
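To see why the kernel trick matters, one can compare a linear and an RBF SVM on data that no straight line can separate. This sketch uses scikit-learn's `make_circles` toy dataset (an illustrative choice, not from the original tutorial):

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles: no straight line separates the two classes.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel='linear').fit(X, y)
rbf = SVC(kernel='rbf', gamma='scale', C=1.0).fit(X, y)

print(linear.score(X, y))  # roughly chance level
print(rbf.score(X, y))     # close to 1.0
```

The RBF kernel implicitly maps the points into a higher-dimensional space where the inner circle becomes separable from the outer ring.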

Hyperparameters

  • C: Regularization – a small C tolerates misclassifications (softer margin, more regularization), while a large C penalizes them heavily, approaching a hard margin (risk of overfitting).
  • gamma (for RBF): the influence of a single training example – a large gamma captures fine detail (risk of overfitting), while a small gamma gives smoother decision boundaries.
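In practice, C and gamma are usually chosen by cross-validated search rather than by hand. A minimal sketch with `GridSearchCV` (the dataset and grid values here are illustrative assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic binary classification problem for illustration.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# Search a small grid of C and gamma values with 5-fold cross-validation.
param_grid = {'C': [0.1, 1, 10], 'gamma': [0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)   # e.g. {'C': ..., 'gamma': ...}
print(search.best_score_)    # mean cross-validated accuracy
```

Because C and gamma interact (a larger C can partially compensate for a smaller gamma and vice versa), they are best tuned jointly rather than one at a time.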

When to Use SVM

SVMs are effective in high-dimensional spaces, are memory efficient (only the support vectors are stored), and work well when there is a clear margin of separation between classes. They are less suitable for very large datasets, since training time grows quickly with the number of samples.


Two Minute Drill
  • SVM finds the maximum margin hyperplane.
  • Kernel trick handles non‑linear data.
  • C controls margin hardness; gamma controls influence.
  • Use SVC from sklearn.svm.

Need more clarification?

Drop us an email at career@quipoinfotech.com