Classifying data with a linear SVM
In the first chapter, we saw some examples of classification with SVMs. We focused on SVMs' slightly superior classification performance compared to logistic regression, but for the most part, we left SVMs alone.
Here, we will look at them more closely. While SVMs do not have an easy probabilistic interpretation, they do have an intuitive visual-geometric one: the main idea behind linear SVMs is to separate two classes with the best possible hyperplane, the one that maximizes the margin between them.
Let's linearly separate two classes with an SVM.
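To make the geometric idea concrete, here is a minimal sketch (not this recipe's own code) that fits scikit-learn's SVC with a linear kernel on two invented, linearly separable point clouds; the toy arrays exist only for illustration:

import numpy as np
from sklearn.svm import SVC

#two small, linearly separable clusters (toy data for illustration only)
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],
              [5.0, 5.0], [5.5, 6.0], [6.0, 5.5]])
y = np.array([0, 0, 0, 1, 1, 1])

#fit a linear SVM; it finds the separating line with the widest margin
clf = SVC(kernel='linear')
clf.fit(X, y)

#for a linear kernel, coef_ and intercept_ describe the plane w.x + b = 0
print(clf.coef_, clf.intercept_)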
Getting ready
Let us start by loading and visualizing the iris dataset available in scikit-learn:
Load the data
Load part of the iris dataset. This will allow for easy comparison with the first chapter:
#load the libraries we have been using
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt   #Library for visualization
from sklearn import datasets

iris = datasets.load_iris()
X_w = iris.data[:, :2]   #load the first two features of the iris...
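Since the goal stated above is to load and visualize the data, one way to plot it (a sketch, assuming the class labels are loaded into a variable y_w, a name not shown in the snippet above) is a scatter plot of the two features colored by class:

y_w = iris.target   #assumed here: the class labels, loaded alongside X_w

plt.scatter(X_w[:, 0], X_w[:, 1], c=y_w)
plt.xlabel(iris.feature_names[0])
plt.ylabel(iris.feature_names[1])
plt.show()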