
Support Vector Machine

  • Author: Shin Yoonah, Yoonah
  • Aug 3, 2022
  • 2 min read

Last updated: Aug 8, 2022


A support vector machine (SVM) is used for classification


Lessons

  1. Kernels

  2. Maximum Margin


Kernel

A kernel is used when the data set is not linearly separable, that is, when no straight line can divide the two classes


Data transformation

Transforming the data creates a space in which it is linearly separable


Imagine our data set is 1-dimensional (it has only one feature)

Transform it into a 2-dimensional space

This increases the dimension of the data by mapping x into a new space using a function

Then, the data set is linearly separable

In a 2-dimensional space, the hyperplane is a line dividing the plane into two parts, where each class lies on either side
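
As a small illustration (the numbers and the squaring map phi(x) = (x, x^2) are my own choice, not from the lesson), here is how a 1-dimensional data set becomes linearly separable after the mapping:

import numpy as np

# 1-dimensional data: class 1 sits at the extremes, class 0 in the
# middle, so no single threshold on x can separate the classes
x = np.array([-3.0, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = np.array([1, 1, 0, 0, 0, 1, 1])

# Map each point into 2-dimensional space: phi(x) = (x, x^2)
X_mapped = np.column_stack([x, x ** 2])

# In the new space the line x2 = 2.5 (a hyperplane) separates the
# classes: every class-1 point has x^2 >= 4, every class-0 point <= 1
print(X_mapped)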


Non-Linear Mapping

- Sometimes it is difficult to calculate the mapping explicitly

- Instead, use a shortcut called a kernel

Kernel types: Linear, Polynomial, Radial Basis Function (RBF)
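
In scikit-learn, which this sketch assumes (the make_moons data set is just a placeholder), each of these kernel types is selected with the kernel argument of SVC:

from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Toy 2-class data set that a linear kernel cannot separate well
X, y = make_moons(noise=0.1, random_state=0)

for kernel in ["linear", "poly", "rbf"]:
    clf = SVC(kernel=kernel)
    clf.fit(X, y)
    print(kernel, "training accuracy:", clf.score(X, y))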


RBF kernel - Gamma controls the shape of the kernel

It is computed from the difference between the two inputs x and x', where x' is a support vector
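
Concretely, the RBF kernel is K(x, x') = exp(-gamma * ||x - x'||^2); a quick NumPy check with made-up inputs shows how gamma changes the kernel's shape:

import numpy as np

def rbf_kernel(x, x_prime, gamma):
    # K(x, x') = exp(-gamma * ||x - x'||^2)
    return np.exp(-gamma * np.sum((x - x_prime) ** 2))

x = np.array([1.0, 2.0])
x_prime = np.array([2.0, 0.0])   # e.g. a support vector
print(rbf_kernel(x, x_prime, gamma=0.01))  # near 1: wide, smooth kernel
print(rbf_kernel(x, x_prime, gamma=10.0))  # near 0: narrow, local kernel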


How do we select gamma?

Increasing the value of gamma increases the flexibility of the classifier

-> Therefore, the higher the gamma, the more likely we are to overfit


If a picture is mislabeled, its image point will appear in the incorrect region

Fitting the model with a high value of gamma performs almost perfectly on the training points, bending the boundary to fit even the mislabeled ones


Find the best value of gamma by using validation data

- Split the data into training and validation sets, using the validation samples to find the hyperparameters

- Try several different values of gamma and select the value that does best on the validation data (see the sketch below)
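
A minimal sketch of that procedure with scikit-learn; the candidate gamma values and the synthetic data set are assumptions for illustration:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)

# Hold out part of the data as a validation set
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0)

best_gamma, best_score = None, 0.0
for gamma in [0.001, 0.01, 0.1, 1.0, 10.0]:
    clf = SVC(kernel="rbf", gamma=gamma).fit(X_train, y_train)
    score = clf.score(X_val, y_val)   # accuracy on validation data
    if score > best_score:
        best_gamma, best_score = gamma, score

print("best gamma:", best_gamma, "validation accuracy:", best_score)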


Maximum Margin

SVM works by finding the maximum margin


SVM: Best Line

How do we find the best line?

- Basically, SVMs are based on the idea of finding a plane that best divides a dataset into two classes

The best hyperplane is the one that represents the largest separation, or margin, between the two classes

Goal: Choose a hyperplane with as big a margin as possible
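
In symbols (a standard formulation the lesson leaves implicit): for a hyperplane w·x + b = 0 with labels yᵢ ∈ {−1, +1}, the margin is 2/‖w‖, so maximizing it amounts to solving

    minimize (1/2)‖w‖²  subject to  yᵢ(w·xᵢ + b) ≥ 1 for every training example (xᵢ, yᵢ)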


Support vectors: examples closest to the hyperplane

Only these matter for achieving our goal; thus, the other training examples can be ignored


Try to find the hyperplane in such a way that it has the maximum distance to the support vectors
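
As a sketch, a fitted scikit-learn SVC exposes exactly these points through its support_vectors_ attribute (the blob data set here is a placeholder):

from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=50, centers=2, random_state=0)
clf = SVC(kernel="linear").fit(X, y)

# The fitted hyperplane is determined only by these points; removing
# any other training example would not change the boundary
print(clf.support_vectors_)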


Soft Margin SVM

- When the classes are not separable, the soft margin SVM can be used

- Usually controlled by the regularization parameter C (see the sketch below)
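
A minimal sketch with scikit-learn's C argument; the data set and the C values are illustrative assumptions:

from sklearn.datasets import make_classification
from sklearn.svm import SVC

# flip_y adds label noise, so the classes are not cleanly separable
X, y = make_classification(n_samples=200, flip_y=0.1, random_state=0)

for C in [0.01, 1.0, 100.0]:
    clf = SVC(kernel="rbf", C=C).fit(X, y)
    # A small C allows more margin violations (wider, softer margin);
    # a large C penalizes violations and fits the training data tightly
    print("C =", C, "training accuracy:", clf.score(X, y))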


Copyright Coursera. All rights reserved.


