Neural Network: How can I plot the classification Lines?
Hello everybody,
I'm a beginner in MATLAB and neural networks. I tried to build a neural network for classification (at the end of the post you can find the architecture).
You can see the code below.
My question is: how can I plot the lines separating the data in the plot? (A rough, untested sketch of what I have in mind is at the end of the post, after the results.)
Additionally: if I change, for example, the learning rate (coeff), the output is always around 0.5. Do you know the reason for that? Why is it the mean of the (0,1) output range?
Any mistakes you spot or improvements you can suggest for my code are highly appreciated.
% Example output if I change coeff
out:
0.459
0.567
0.563
0.478
...
% The code
clear
clc
%% Input
x1=0.1.*rand(10,2)+0.6;
x2=0.1.*rand(10,2);
x31=0.1.*rand(10,1);
x32=0.1.*rand(10,1)+0.6;
x3=[x31,x32];
x41=0.1.*rand(10,1)+0.6;
x42=0.1.*rand(10,1);
x4=[x41,x42];
x0=-ones(40,1); % bias input of -1 for every sample
in=[x1;x2;x3;x4];
in=[x0,in]'; % complete input matrix (bias in the first row)
clear x1 x2 x0
o1=ones(10,1);
o2=zeros(10,1);
output=[o1;o1;o2;o2]'; % target (correct) outputs
clear o1 o2
weights=2*rand(3,3)-1; % random initial values for the weight matrix
iterations=100000;
coeff=0.9;
%% Training
for i=1:iterations
out=zeros(40,1);
numIn=size(in,2); % number of training samples
e=0;
for j=1:numIn
input=in(:,j);
desired_out=output(:,j);
% Forward-Propagation
% Hidden_Layer
H1=dot(input,weights(:,1)); % hidden-layer neuron 1
H2=dot(input,weights(:,2)); % hidden-layer neuron 2
HL(1)=sigmoid(H1); % activation function (first input to the output layer)
HL(2)=sigmoid(H2); % activation function (second input to the output layer)
% Output-Layer
input_OL=[input(1);HL(1);HL(2)]; % input vector for the output layer (bias and the outputs of the hidden layer)
O3=dot(input_OL,weights(:,3)); % weighted sum of the output layer
OL=sigmoid(O3); % output value of the neural network
out(j)=OL;
% Backward-Propagation
% Output-Layer
Hout=[-1,HL]'; % outputs of the hidden layer plus the bias input
delta=desired_out-OL; % error term, from the error function MSE 0.5*delta^2
dsigmoid_OL=OL*(1-OL); % derivative of the sigmoid (activation function)
% dO3/dw = Hout, derivative of the weighted sum w.r.t. the weights
weights(:,3)=weights(:,3)+coeff*delta*dsigmoid_OL.*Hout;
% Hidden-Layer
dsigmoid_HL=Hout(2:3).*(1-Hout(2:3)); % derivative of the sigmoid for both hidden neurons
% weights(input->hidden) += coeff * deriv(error fct.) * deriv(act. fct. output layer) * input to output layer * weights(hidden->output) * deriv(act. fct. hidden layer) * input
weights(:,1)=weights(:,1)+coeff*delta*dsigmoid_OL*HL(1)*weights(2,3)*dsigmoid_HL(1).*input; % weights of neuron 1 in the hidden layer
weights(:,2)=weights(:,2)+coeff*delta*dsigmoid_OL*HL(2)*weights(3,3)*dsigmoid_HL(2).*input; % weights of neuron 2 in the hidden layer
e=e+abs(delta);
end
end
weights
out
e
%% Plot
plotpv(in(2:3,:),output) % plot the input points with their classes
%% Sigmoid
function s=sigmoid(x)
s=1/(1+exp(-x));
end
The adjusted weights:
weights =
1.8707 2.2326 -10.9875
-9.5857 9.4768 -18.1220
9.0824 -8.8898 -18.2335
The error:
e =
0.0546
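Below is a rough, untested sketch of what I have in mind for the plot. The idea is to evaluate the trained network on a grid of points, draw the 0.5 contour of the output as the decision boundary, and also draw the line where each hidden neuron's weighted sum is zero. It assumes the 3x3 weights matrix from the training run above is still in the workspace; please tell me if this is the wrong approach.
%% Sketch: plot the decision boundary (untested)
[X,Y]=meshgrid(linspace(-0.2,0.9,200),linspace(-0.2,0.9,200));
Z=zeros(size(X));
for k=1:numel(X)
p=[-1;X(k);Y(k)]; % bias plus the two input features
h1=1/(1+exp(-dot(p,weights(:,1)))); % hidden neuron 1
h2=1/(1+exp(-dot(p,weights(:,2)))); % hidden neuron 2
Z(k)=1/(1+exp(-dot([-1;h1;h2],weights(:,3)))); % network output for this grid point
end
plotpv(in(2:3,:),output) % data points as before
hold on
contour(X,Y,Z,[0.5 0.5],'k','LineWidth',1.5) % 0.5 level of the output = decision boundary
xs=linspace(-0.2,0.9,2);
for n=1:2 % straight line where each hidden neuron's weighted sum is zero: -w(1)+w(2)*x+w(3)*y=0
w=weights(:,n);
plot(xs,(w(1)-w(2).*xs)./w(3),'--')
end
hold off
If I understand it correctly, the dashed lines should be the contributions of the individual hidden neurons and the solid contour the combined boundary, but I'm not sure.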
This is the architecture of my neural network: