This question is closed.

These two pieces of code are connected to each other. When I run the first one it gives me an error: "Error using KNN_: Too few input arguments." What should I do?

global A trn vald ;
SearchAgents_no=10; % Number of search agents
Max_iteration=100; % Maximum number of iterations
% A=load ('C:\Users\d\Downloads\archive (1).dat');
%digitDatasetPath = fullfile('C:\Users\d\Downloads\archive (1)');
load('f')
A=featuresall;
nVar=size(featuresall,2)-1;
r=randperm(size(featuresall,1));
trn=r(1:floor(length(r)/2));
vald=r(floor(length(r)/2)+1:end);
tic
[Best_score,Best_pos,Convergence_curve]=BGWOPSO(SearchAgents_no,(Max_iteration),0,1,size(A,2)-1,'AccSz');
time = toc;
acc = Acc(Best_pos);
fprintf('hybrid Acc %f\thybrid Fitness: %f\thybrid Solution: %s\thybrid Dimension: %d\thybrid Time: %f\n',acc,Best_score,num2str(Best_pos,'%1d'),sum(Best_pos(:)),time);
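The "Too few input arguments" error means `KNN_` is being invoked somewhere (presumably inside `Acc`, which is not shown) with fewer than the four arguments it requires. A minimal sketch of a correct four/five-argument call, assuming the last column of `featuresall` holds the class labels (that layout is an assumption, not confirmed by the script):

```matlab
% Illustrative only: split featuresall using the trn/vald index sets
% from the script above, assuming labels are in the last column.
training   = featuresall(trn,1:end-1);   % training features
trn_labels = featuresall(trn,end);       % training labels (assumed layout)
testing    = featuresall(vald,1:end-1);  % validation features
tst_labels = featuresall(vald,end);      % validation labels (assumed layout)
[pred,~,acc_knn] = KNN_(3,training,trn_labels,testing,tst_labels);
```

If `Acc` passes fewer arguments than this, or omits one of the data matrices, the `nargin < 4` check inside `KNN_` raises exactly the error reported.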
%second code
function [predicted_labels,nn_index,accuracy] = KNN_(k,data,labels,t_data,t_labels)
%KNN_: classifying using k-nearest neighbors algorithm. The nearest neighbors
%search method is euclidean distance
%Usage:
% [~,~,accuracy] = KNN_(3,training,training_labels,testing,testing_labels);
% predicted_labels = KNN_(3,training,training_labels,testing)
%Input:
% - k: number of nearest neighbors
% - data: (NxD) training data; N is the number of samples and D is the
% dimensionality of each data point
% - labels: training labels
% - t_data: (MxD) testing data; M is the number of data points and D
% is the dimensionality of each data point
% - t_labels: testing labels (default = [])
%Output:
% - predicted_labels: the predicted labels based on the k-NN
% algorithm
% - nn_index: the index of the nearest training data point for each testing sample (Mx1).
% - accuracy: if the testing labels are supplied, the accuracy of
% the classification is returned; otherwise it is zero.
%Author: Mahmoud Afifi - York University
%checks
if nargin < 4
error('Too few input arguments.')
elseif nargin < 5
t_labels=[];
end
accuracy=0; %default when no testing labels are supplied
if size(data,2)~=size(t_data,2)
error('data and t_data must have the same dimensionality');
end
if mod(k,2)==0
error('to reduce the chance of ties, please choose an odd k');
end
%initialization
predicted_labels=zeros(size(t_data,1),1);
ed=zeros(size(t_data,1),size(data,1)); %ed: (MxN) euclidean distances
ind=zeros(size(t_data,1),size(data,1)); %corresponding indices (MxN)
k_nn=zeros(size(t_data,1),k); %k-nearest neighbors for testing sample (Mxk)
%calc euclidean distances between each testing data point and the training
%data samples
for test_point=1:size(t_data,1)
for train_point=1:size(data,1)
%calc and store sorted euclidean distances with corresponding indices
ed(test_point,train_point)=sqrt(...
sum((t_data(test_point,:)-data(train_point,:)).^2));
end
[ed(test_point,:),ind(test_point,:)]=sort(ed(test_point,:));
end
%find the nearest k for each data point of the testing data
k_nn=ind(:,1:k);
nn_index=k_nn(:,1);
%get the majority vote
for i=1:size(k_nn,1)
options=unique(labels(k_nn(i,:)'));
max_count=0;
max_label=0;
for j=1:length(options)
L=length(find(labels(k_nn(i,:)')==options(j)));
if L>max_count
max_label=options(j);
max_count=L;
end
end
predicted_labels(i)=max_label;
end
%calculate the classification accuracy
if isempty(t_labels)==0
accuracy=length(find(predicted_labels==t_labels))/size(t_data,1);
end
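A self-contained sanity check of `KNN_` on a small synthetic dataset (the data here is made up for illustration; the point is simply that all five arguments are passed):

```matlab
% Two well-separated clusters of three points each.
data     = [0 0; 0 1; 1 0; 5 5; 5 6; 6 5]; % (6x2) training data
labels   = [1; 1; 1; 2; 2; 2];             % training labels
t_data   = [0.2 0.1; 5.5 5.5];             % (2x2) testing data
t_labels = [1; 2];                          % testing labels
[pred,nn,acc] = KNN_(3,data,labels,t_data,t_labels);
% pred is [1; 2] (each test point sits inside one cluster) and acc is 1.
```

Calling `KNN_(3,data,labels)` instead, i.e. without `t_data`, reproduces the "Too few input arguments" error from the question.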
2 Comments
Christopher McCausland on 13 Mar 2023
Essrra,
Please edit this to ask a question. No one can help if they don't know what you want/need help with.
Christopher
Rik on 14 Mar 2023 (edited 14 Mar 2023)
Have a read here and here. It will greatly improve your chances of getting an answer.
Also, did Mahmoud Afifi agree to have his code posted here?
If you have trouble with MATLAB basics you may consider doing the Onramp tutorial (which is provided for free by MathWorks).

Answers (0)
