File Exchange

Support Vector Machine

version (204 KB) by Bhartendu
SVM (Linearly Separable Data) using a linear kernel with gradient ascent


Updated 28 May 2017

Refer: An Introduction to Support Vector Machines and Other Kernel-based Learning Methods by Nello Cristianini and John Shawe-Taylor.
This demo covers training and cross-validation of a support vector machine (SVM) model for two-class (binary) classification on a low-dimensional data set.

The training algorithm depends on the data only through dot products in H, i.e. through functions of the form Φ(x_i)·Φ(x_j). Now if there were a “kernel function” K such that
K(x_i, x_j) = Φ(x_i)·Φ(x_j),
we would only need to use K in the training algorithm and would never need to know Φ explicitly. One example is the radial basis function (RBF) or Gaussian kernel, for which H is infinite-dimensional, so it would not be easy to work with Φ explicitly.

Training the model requires the choice of:
• the kernel function, which determines the shape of the decision surface
• the parameters of the kernel function (e.g. the variance of the Gaussian for a Gaussian kernel, or the degree of the polynomial for a polynomial kernel)
• the regularization parameter λ.
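The submission's training loop is in MATLAB; a minimal Python sketch of the same idea — gradient ascent on the SVM dual objective with a linear kernel — might look like the following. The function name and toy data are illustrative, and the box constraint C here plays the role of the regularization parameter:

```python
import numpy as np

def svm_dual_gradient_ascent(X, y, C=10.0, lr=0.001, epochs=2000):
    """Maximise the dual W(a) = sum_i a_i - 1/2 sum_ij a_i a_j y_i y_j K_ij
    by plain gradient ascent, clipping to the box constraint 0 <= a_i <= C."""
    K = X @ X.T                                # linear-kernel Gram matrix
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        grad = 1.0 - y * (K @ (alpha * y))     # dW/da_i
        alpha = np.clip(alpha + lr * grad, 0.0, C)
    w = (alpha * y) @ X                        # recover the primal weight vector
    sv = alpha > 1e-6                          # support vectors have alpha_i > 0
    b = np.mean(y[sv] - X[sv] @ w)             # bias from y_i (w . x_i + b) = 1
    return w, b

# Toy linearly separable data
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = svm_dual_gradient_ascent(X, y)
print(np.sign(X @ w + b))  # -> [ 1.  1. -1. -1.]
```

Swapping the Gram matrix for any other kernel (RBF, polynomial) leaves the rest of the loop unchanged, which is the point of the kernel formulation above.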

Related Examples:
1. AdaBoost

2. SVM using various kernels

3. SVM for nonlinear classification

4. SMO

Cite As

Bhartendu (2021). Support Vector Machine, MATLAB Central File Exchange. Retrieved .

Comments and Ratings (21)

Thang Bui Quy

Jose Lopes de Jesus


What are y_actual and y_predicted?

Amaresh Singh

Mohamed Farchi


@Matthys: Holdout is a method of cross-validation (CV) partition.
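(In MATLAB this partition is typically produced by cvpartition with the 'HoldOut' option. A minimal Python sketch of the idea, with illustrative names: the data are shuffled once and a fixed fraction is held out for testing while the rest is used for training.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10                                # number of examples
idx = rng.permutation(n)              # shuffle the example indices once
n_test = int(round(0.3 * n))          # hold out 30% for testing
test_idx, train_idx = idx[:n_test], idx[n_test:]
print(len(train_idx), len(test_idx))  # -> 7 3
```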

joachim Matthys

Can someone tell me what the holdout function does?

fatima farooq

Inturi srivani

Victor Paduani

ammar noori

Dear Bhartendu
Do you have documentation that describes your work? Your input is highly appreciated.




Thanks @Yeonjong, the two errors are probably due to a mismatch of MATLAB versions.


I ran into two errors while running this code.
For me, the following changes work very well.

1. In grad-Ascend:
w1=(alp_old.*Y).*X; ==> w1=(alp_old.*Y)'*X;
w2=(alpha.*Y).*X; ==> w2=(alpha.*Y)'*X;

2. In the plotting code, instead of syms x, I changed it to the following and it works for me:
xItv = linspace(-5,5,1000);
fn = @(x) vpa((-bias-W(1)*x)/W(2),4);
fn1 = @(x) vpa((1-bias-W(1)*x)/W(2),4);
fn2 = @(x) vpa((-1-bias-W(1)*x)/W(2),4);

John Martin


What is the reason for your poor rating, nhat truong?

nhat truong

sagar kumar dash

sagar kumar dash

MATLAB Release Compatibility
Created with R2015a
Compatible with any release
Platform Compatibility
Windows macOS Linux
