Some popular types of kernels used in SVMs:
1. Linear Kernel
- Formula: $K(x, x') = x^\top x'$
- Use When:
The data is linearly separable.
The number of features is very high (e.g., text classification problems).
You need a simple and fast model.
- Advantages:
Simplicity.
Efficiency for large datasets.
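Because the linear kernel is just a dot product, you can verify the formula directly. The toy vectors below are made-up values for illustration; scikit-learn's linear_kernel helper computes the same quantity.
import numpy as np
from sklearn.metrics.pairwise import linear_kernel

# Two hypothetical feature vectors, just to check the formula
x = np.array([[1.0, 2.0, 3.0]])
y = np.array([[0.5, -1.0, 2.0]])

manual = x @ y.T                 # K(x, y) = x . y
sk = linear_kernel(x, y)         # scikit-learn's pairwise linear kernel
print(manual, sk)                # both print [[4.5]]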
2. Polynomial Kernel
- Formula: $K(x, x') = (\gamma\, x^\top x' + r)^d$
$\gamma$ is a free parameter (default is 1).
$r$ is a coefficient to trade off the influence of higher-order versus lower-order terms (default is 0).
$d$ is the degree of the polynomial.
- Use When:
The data has interactions between features (not just linear relationships).
You want to model higher-dimensional feature interactions.
- Parameters to Tune:
Degree of the polynomial ($d$).
Coefficients ($\gamma$ and $r$).
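To make the roles of $\gamma$, $r$, and $d$ concrete, here is a minimal sketch that evaluates the polynomial kernel by hand and compares it with scikit-learn's polynomial_kernel; the input vectors and parameter values are arbitrary examples. In SVC, these parameters are exposed as gamma, coef0, and degree.
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel

x = np.array([[1.0, 2.0]])
y = np.array([[3.0, 0.5]])
gamma, coef0, degree = 1.0, 0.0, 3                       # example values for the parameters above

manual = (gamma * (x @ y.T) + coef0) ** degree           # (gamma * x.y + r)^d
sk = polynomial_kernel(x, y, degree=degree, gamma=gamma, coef0=coef0)
print(manual, sk)                                        # both print [[64.]]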
3. Radial Basis Function (RBF) Kernel (Gaussian Kernel)
- Formula: $K(x, x') = \exp(-\gamma \lVert x - x' \rVert^2)$
Here, $\gamma$ is a free parameter (in scikit-learn, gamma='auto' sets it to 1/n_features).
- Use When:
The data is not linearly separable.
You need a flexible and powerful kernel that can handle complex relationships.
- Parameters to Tune:
Gamma ($\gamma$): Defines how far the influence of a single training example reaches. Low values mean ‘far’ and high values mean ‘close’.
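The same kind of check works for the RBF kernel: compute $\exp(-\gamma \lVert x - x' \rVert^2)$ by hand and compare it with scikit-learn's rbf_kernel. The vectors and the gamma value below are arbitrary illustration values.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

x = np.array([[1.0, 2.0]])
y = np.array([[2.0, 4.0]])
gamma = 0.5                                   # example value

sq_dist = np.sum((x - y) ** 2)                # ||x - y||^2 = 1 + 4 = 5
manual = np.exp(-gamma * sq_dist)             # exp(-2.5)
sk = rbf_kernel(x, y, gamma=gamma)
print(manual, sk)                             # both are approximately 0.0821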
4. Sigmoid Kernel
- Formula: $K(x, x') = \tanh(\gamma\, x^\top x' + r)$
Here, $\gamma$ is a free parameter (again 1/n_features with gamma='auto' in scikit-learn).
$r$ is a coefficient (default is 0).
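The sigmoid kernel can be checked the same way against scikit-learn's sigmoid_kernel; again, the vectors and parameter values are made up for illustration.
import numpy as np
from sklearn.metrics.pairwise import sigmoid_kernel

x = np.array([[1.0, 2.0]])
y = np.array([[0.5, 0.25]])
gamma, coef0 = 1.0, 0.0                       # example values for gamma and r

manual = np.tanh(gamma * (x @ y.T) + coef0)   # tanh(gamma * x.y + r) = tanh(1.0)
sk = sigmoid_kernel(x, y, gamma=gamma, coef0=coef0)
print(manual, sk)                             # both are approximately 0.7616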
Visualizing the Decision Boundaries
To visualize the decision boundaries, we’ll use a two-feature subset of the Iris dataset for simplicity. We’ll plot the decision boundaries for each kernel: linear, polynomial, RBF, and sigmoid. Below is the complete code to achieve this.
Importing Necessary Libraries
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from matplotlib.colors import ListedColormap
Loading the Dataset
We use only the first two features of the Iris dataset for visualization.
# Load the iris dataset
iris = datasets.load_iris()
X = iris.data[:, :2] # Use only the first two features for visualization
y = iris.target
# Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
Function to Plot Decision Boundaries
The following plot_decision_boundaries function creates a mesh grid over the feature space, predicts a class for each grid point, and plots the resulting decision boundaries.
def plot_decision_boundaries(X, y, classifier, ax, title):
    # Define the min and max for the two feature axes
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    # Create a mesh grid covering the feature space
    xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.01),
                         np.arange(y_min, y_max, 0.01))
    # Predict a class for each point in the mesh grid
    Z = classifier.predict(np.c_[xx.ravel(), yy.ravel()])
    Z = Z.reshape(xx.shape)
    # Plot the decision regions and the training points
    ax.contourf(xx, yy, Z, alpha=0.3, cmap=ListedColormap(('red', 'green', 'blue')))
    ax.scatter(X[:, 0], X[:, 1], c=y, s=20, edgecolor='k',
               cmap=ListedColormap(('red', 'green', 'blue')))
    ax.set_title(title)
Training and Plotting for Each Kernel
Now, for each kernel, we train an SVM classifier and plot its decision boundary.
# Initialize the plot
fig, axs = plt.subplots(2, 2, figsize=(12, 10))
# Linear Kernel
linear_svm = SVC(kernel='linear')
linear_svm.fit(X_train, y_train)
plot_decision_boundaries(X_train, y_train, linear_svm, axs[0, 0], "Linear Kernel")
# Polynomial Kernel
poly_svm = SVC(kernel='poly', degree=3, gamma='auto')
poly_svm.fit(X_train, y_train)
plot_decision_boundaries(X_train, y_train, poly_svm, axs[0, 1], "Polynomial Kernel")
# RBF Kernel
rbf_svm = SVC(kernel='rbf', gamma='auto')
rbf_svm.fit(X_train, y_train)
plot_decision_boundaries(X_train, y_train, rbf_svm, axs[1, 0], "RBF Kernel")
# Sigmoid Kernel
sigmoid_svm = SVC(kernel='sigmoid', gamma='auto')
sigmoid_svm.fit(X_train, y_train)
plot_decision_boundaries(X_train, y_train, sigmoid_svm, axs[1, 1], "Sigmoid Kernel")
# Adjust layout
plt.tight_layout()
plt.show()
Output:
The result helps visualize how each kernel separates the data: the linear kernel draws straight decision boundaries, while the polynomial, RBF, and sigmoid kernels can produce curved ones.
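If you want to tune the parameters mentioned above (for example C and gamma for the RBF kernel), cross-validated grid search is a common approach. Below is a minimal sketch reusing the X_train/y_train and X_test/y_test splits from the code above; the grid values are illustrative choices, not recommended defaults.
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Example grid over C and gamma for an RBF-kernel SVM (values are illustrative)
param_grid = {
    'C': [0.1, 1, 10],
    'gamma': [0.01, 0.1, 1],
}
grid = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=5)
grid.fit(X_train, y_train)

print(grid.best_params_)            # best C / gamma found by cross-validation
print(grid.score(X_test, y_test))   # accuracy on the held-out test set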
