Brainmaker

Dwarfs standing on the shoulders of giants!

Support Vector Machines — Different Methods

  • May 31, 2010 11:22 pm

Digested from: http://www.dtreg.com/svm.htm

A Support Vector Machine (SVM) performs classification by constructing an N-dimensional hyperplane that optimally separates the data into two categories. Before considering N-dimensional hyperplanes, let’s look at a simple 2-dimensional example. Assume we wish to perform a classification task, and our data has a categorical target variable with two categories.
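
To make the two-category case concrete, here is a minimal sketch that fits a linear SVM to two toy clusters and reads back the separating hyperplane. The choice of scikit-learn’s SVC and the synthetic data are my own assumptions for illustration; the article itself names no library.

    # A minimal linear-SVM sketch on two toy clusters; scikit-learn's SVC
    # and the synthetic data are illustrative assumptions, not from DTREG.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, size=(20, 2)),   # category A
                   rng.normal(+2.0, 1.0, size=(20, 2))])  # category B
    y = np.array([0] * 20 + [1] * 20)

    clf = SVC(kernel="linear").fit(X, y)

    # For a linear kernel the separating hyperplane is w.x + b = 0.
    w, b = clf.coef_[0], clf.intercept_[0]
    print("w =", w, "b =", b)
    print("predicted category for (3, 3):", clf.predict([[3.0, 3.0]]))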

The simplest way to divide two groups is with a straight line, flat plane, or N-dimensional hyperplane. But what if the points are separated by a nonlinear region? Rather than fitting nonlinear curves to the data, SVM handles this by using a kernel function to map the data into a different space where a hyperplane can be used to do the separation. The concept of a kernel mapping function is very powerful: it allows SVM models to perform separations even with very complex boundaries, such as the one shown below.


[Figure: The Kernel Trick]

Many kernel mapping functions can be used – probably an infinite number – but a few have been found to work well for a wide variety of applications. The default and recommended kernel function is the Radial Basis Function (RBF).
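
To see why RBF is a sensible default, here is a short sketch contrasting a linear kernel with an RBF kernel on data no straight line can separate. scikit-learn and its make_circles toy data are my own stand-ins for illustration; the article names neither.

    # The kernel trick in practice: concentric circles defeat a linear
    # hyperplane, but the RBF kernel separates them by implicitly mapping
    # the points into a higher-dimensional space. scikit-learn is an
    # assumed stand-in for DTREG here.
    from sklearn.datasets import make_circles
    from sklearn.svm import SVC

    X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

    print("linear:", SVC(kernel="linear").fit(X, y).score(X, y))  # ~0.5: a line cannot split rings
    print("rbf:   ", SVC(kernel="rbf").fit(X, y).score(X, y))     # ~1.0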


Kernel functions supported by DTREG:

  • Linear: u'*v
  • Polynomial: (gamma*u'*v + coef0)^degree
  • Radial basis function: exp(-gamma*|u-v|^2)
  • Sigmoid (feed-forward neural network): tanh(gamma*u'*v + coef0)
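
For concreteness, here are the four kernels written out with NumPy, reading u'*v as the dot product of vectors u and v. The default values chosen for gamma, coef0, and degree are illustrative assumptions, not DTREG’s.

    # The four DTREG kernel formulas written out with NumPy; u'*v is read
    # as the dot product u.v. Parameter defaults are illustrative
    # assumptions, not DTREG's.
    import numpy as np

    def linear(u, v):
        return u @ v

    def polynomial(u, v, gamma=1.0, coef0=0.0, degree=3):
        return (gamma * (u @ v) + coef0) ** degree

    def rbf(u, v, gamma=1.0):
        return np.exp(-gamma * np.sum((u - v) ** 2))

    def sigmoid(u, v, gamma=1.0, coef0=0.0):
        return np.tanh(gamma * (u @ v) + coef0)

    u, v = np.array([1.0, 2.0]), np.array([0.5, -1.0])
    for kernel in (linear, polynomial, rbf, sigmoid):
        print(kernel.__name__, "=", kernel(u, v))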




