Dimensionality Reduction in MATLAB using PCA

Hi;
I have a matrix with 35 columns, and I'm trying to reduce its dimensionality using PCA.
I run PCA on my data:
[coeff,score,latent,tsquared,explained,mu] = pca(data);
explained =
99.9955
0.0022
0.0007
0.0003
0.0002
0.0001
0.0001
0.0001
Then, looking at the explained vector, I noticed that the first value is about 99.99%, so based on this I decided to keep only the first component. So I did the following:
k=1;
X = bsxfun(@minus, data, mean(data)) * coeff(:, 1:k);
Then I used X for SVM training:
svmStruct = fitcsvm(X,Y,'Standardize',true, 'Prior','uniform','KernelFunction','linear','KernelScale','auto','Verbose',0,'IterationLimit', 1000000);
However, when I ran predict and calculated the misclassification rate:
[label,score,cost] = predict(svmStruct, X);
The result was disappointing. When I select only one component (k = 1), every sample is misclassified; however, as I increase the number of included components k, the result improves, as you can see in the diagram below, which plots the classification error against the number of included eigenvectors. This doesn't make sense to me: according to explained, the first eigenvector alone should be enough.
Did I make a mistake somewhere?
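For reference, the sweep behind that diagram can be reproduced with a loop like the one below. This is only a minimal sketch: it assumes the matrix data and a numeric label vector Y from above are in the workspace, and it measures resubstitution error (predicting on the training set), as in the original code.
[coeff, score] = pca(data);            % score = centered data projected onto the PCs
maxK = size(coeff, 2);
err = zeros(maxK, 1);
for k = 1:maxK
    Xk = score(:, 1:k);                % keep the first k principal components
    mdl = fitcsvm(Xk, Y, 'Standardize', true, 'Prior', 'uniform', ...
        'KernelFunction', 'linear', 'KernelScale', 'auto');
    err(k) = mean(predict(mdl, Xk) ~= Y);   % resubstitution misclassification rate
end
plot(1:maxK, err);
xlabel('Number of included components k');
ylabel('Misclassification rate');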

Accepted Answer

Tom Lane on 11 Nov 2015
The first component explains most of the variation in the columns of DATA, but Y is not involved in that. Of course I don't understand your data. But it is certainly mathematically possible for Y to depend more strongly on a component that explains a lower amount of variance.
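A toy example (purely illustrative, not based on the asker's data) shows how this can happen: the direction that dominates the variance carries no class information, while a low-variance direction determines the labels.
rng(0);
n = 200;
x1 = 100*randn(n, 1);                  % dominates the total variance
x2 = randn(n, 1);                      % tiny variance by comparison
Y  = double(x2 > 0);                   % the class depends only on x2
data = [x1 x2];
[coeff, score, ~, ~, explained] = pca(data);
disp(explained')                       % first PC explains ~99.99% of the variance
err1 = mean(predict(fitcsvm(score(:,1), Y), score(:,1)) ~= Y);   % roughly 0.5 (chance level)
err2 = mean(predict(fitcsvm(score(:,2), Y), score(:,2)) ~= Y);   % close to 0
fprintf('error with PC1 only: %.2f, with PC2 only: %.2f\n', err1, err2);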
Also, for simplicity you may find it convenient to use the SCORE output from PCA instead of computing it yourself.
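In other words, the SCORE output already contains the centered data projected onto the principal components, so the manual projection should match it up to floating-point round-off. A quick check, assuming the variables from the question:
k = 1;
X_manual = bsxfun(@minus, data, mean(data)) * coeff(:, 1:k);
X_score  = score(:, 1:k);
max(abs(X_manual(:) - X_score(:)))     % should be ~0, since pca centers the data by default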
