How to predict future data after training an SVM?

I have trained and exported an SVM model using the Classification Learner app, but when I try to predict labels on new data it fails. My code for training is below:
function obj = trainClassifier(obj,datasetFolder)
    training = fullfile(datasetFolder, 'train.mat');
    meta = fullfile(datasetFolder, 'meta.mat');
    save('meta.mat', 'meta');
    dataset = load(training);
    table = array2table(dataset.data);
    variableNames = table.Properties.VariableNames;
    table.class = dataset.labels;
    predictorNames = cellstr("data" + (1:144));   % {'data1', 'data2', ..., 'data144'}
    predictors = table(:, variableNames);
    response = table.class;
    isCategoricalPredictor = false(1, 144);
    % Train a classifier
    % This code specifies all the classifier options and trains the classifier.
    template = templateSVM(...
        'KernelFunction', 'polynomial', ...
        'PolynomialOrder', 2, ...
        'KernelScale', 'auto', ...
        'BoxConstraint', 1, ...
        'Standardize', true);
    options = statset('UseParallel', true);
    classificationSVM = fitcecoc(...
        predictors, ...
        response, ...
        'Learners', template, ...
        'Coding', 'onevsone', ...
        'Options', options, ...
        'ClassNames', (1:37)', ...
        'Verbose', 2);
    % Create the result struct with predict function
    predictorExtractionFcn = @(t) t(:, predictorNames);
    svmPredictFcn = @(x) predict(classificationSVM, x);
    obj.network.predictFcn = @(x) svmPredictFcn(predictorExtractionFcn(x));
    % Add additional fields to the result struct
    obj.network.RequiredVariables = predictorNames;   % {'data1', 'data2', ..., 'data144'}
    obj.network.ClassificationSVM = classificationSVM;
    obj.network.About = 'This struct is a trained model exported from Classification Learner R2018a.';
    obj.network.HowToPredict = sprintf(['To make predictions on a new table, T, use: \n yfit = c.predictFcn(T) \n' ...
        'replacing ''c'' with the name of the variable that is this struct, e.g. ''trainedModel''. \n \n' ...
        'The table, T, must contain the variables returned by: \n c.RequiredVariables \n' ...
        'Variable formats (e.g. matrix/vector, datatype) must match the original training data. \n' ...
        'Additional variables are ignored. \n \nFor more information, see ' ...
        '<a href="matlab:helpview(fullfile(docroot, ''stats'', ''stats.map''), ''appclassification_exportmodeltoworkspace'')">How to predict using an exported model</a>.']);
    % Perform cross-validation
    partitionedModel = crossval(obj.network.ClassificationSVM, 'KFold', 10, 'Options', options);
    % Compute validation predictions
    [validationPredictions, validationScores] = kfoldPredict(partitionedModel, 'Options', options);
    % Compute validation accuracy
    validationAccuracy = 1 - kfoldLoss(partitionedModel, 'LossFun', 'ClassifError');
    waitbar(0.75, f, sprintf('Classifier has validation accuracy of %2.2f.', validationAccuracy));
    set(f, 'Name', 'Training');
    pause(0.5)
    % Store the trained network for use across the GUI app
    setappdata(f, 'modal', obj.network);
    % Save the model in the current directory for later use
    obj.saveModal('modal', obj.network);
    delete(f);
end
And to predict new data I am doing this:
function annotations = annotate(obj,featuresArray)
    f = figure;
    annotations = [];
    tf = isappdata(f, 'modal');
    if tf == 0
        obj.network = load('modal.mat');
    else
        obj.network = getappdata(f, 'modal');
    end
    delete(f);
    table = array2table(featuresArray);
    variableNames = table.Properties.VariableNames;
    f = waitbar(0, '1', 'Name', 'Annotating Frames', ...
        'CreateCancelBtn', 'setappdata(gcbf,''canceling'',1)');
    s = size(featuresArray, 1);
    for i = 1:s
        flag_cancel = getappdata(f, 'canceling');
        if flag_cancel
            waitbar(1, f, 'Canceled');
            F = findall(0, 'type', 'figure', 'tag', 'TMWWaitbar');
            delete(F);
            return
        end
        annotations = [annotations ; obj.network.classifier.predictFcn(table(i, variableNames))];
        value = i/s;
        waitbar(value, f, sprintf('%3.1f percent completed ', value*100));
    end
    delete(f);
end
Kindly help me, please!
UPDATE
I solved the previous problem. One more question: after the statement
partitionedModel = crossval(obj.network.ClassificationSVM, 'KFold', 10, 'Options', options);
should I save the partitioned model for further use, or the obj.network.ClassificationSVM object in my code? Please clarify.
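For reference, a minimal runnable sketch of the difference between the two objects, using fisheriris as a stand-in dataset (variable names below are illustrative, not taken from the code above): the partitioned model returned by crossval only supports the kfold* validation functions, while the fitted ECOC model (optionally compacted) is what predict works on and is the one to save.
% Minimal illustration with fisheriris as a stand-in for the thread's data
load fisheriris
Mdl = fitcecoc(meas, species);                       % full fitted ECOC model
cvMdl = crossval(Mdl, 'KFold', 10);                  % partitioned copy, for validation only
fprintf('Validation accuracy: %.3f\n', 1 - kfoldLoss(cvMdl));
% Save the fitted model (compact() drops the training data to shrink the file)
trainedModel = compact(Mdl);
save('modal.mat', 'trainedModel');
% Later session: load it and predict on new observations
S = load('modal.mat');
yfit = predict(S.trainedModel, meas(1:5, :));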

Accepted Answer

Muhammad Usama Sharaf SAAFI
[imagepred, probabilities] = predict(trainedModel.ClassificationSVM,imagefeatures3);
Use this command for prediction.
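For context, the exported struct's HowToPredict text (see the training code above) documents the table-based pattern sketched below; newData here is a placeholder N-by-144 numeric feature matrix, not a variable from the thread.
% 'trainedModel' is the struct exported from Classification Learner;
% newData is a placeholder N-by-144 feature matrix, columns in training order.
T = array2table(newData, 'VariableNames', trainedModel.RequiredVariables);
yfit = trainedModel.predictFcn(T);                   % predicted class labels
% Equivalent call on the underlying ECOC model:
% [yfit, scores] = predict(trainedModel.ClassificationSVM, T);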
  1 Comment
Ali Yar Khan on 5 Feb 2020
Thanks for this. I have an update about the question; kindly check it again, a short note is written at the bottom. Also, for each fold I want the number of misclassified samples, the number of correctly classified samples, and the sample error.
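One possible way to get those per-fold numbers from the partitioned model (a sketch reusing partitionedModel and the numeric label vector response from the training code above):
% Error rate of each fold (one entry per fold)
foldError = kfoldLoss(partitionedModel, 'LossFun', 'ClassifError', 'Mode', 'individual');
% Correct / misclassified counts per fold, using the fold assignments
validationPredictions = kfoldPredict(partitionedModel);
for k = 1:partitionedModel.KFold
    testIdx   = test(partitionedModel.Partition, k);                    % observations held out in fold k
    isCorrect = validationPredictions(testIdx) == response(testIdx);    % assumes numeric class labels
    fprintf('Fold %d: %d correct, %d misclassified, error %.3f\n', ...
        k, sum(isCorrect), sum(~isCorrect), foldError(k));
end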


Release

R2018a
