Understanding neural networks with patternnet
Manolis Michailidis on 27 May 2016 (edited 27 May 2016)
Hello, I am new to neural networks and find it difficult to understand a few things about them. First of all, I have created with patternnet a network with 4 inputs, 3 hidden layers (4 if we also count the output layer), and 3 outputs. I have several questions:
1) Is there some formula for choosing the number of nodes in each hidden layer? Also, can I plot the network so I can view its structure?
2) For each hidden layer, can I extract vectors/structs with the weights and biases? Note that I do not want only the final values but all the values for each iteration. I found a previous post suggesting a for loop that extracts them at each iteration, but how would I know when my network will stop? For example, sometimes it stops at the 30th iteration and sometimes at the 50th.
3) For each iteration, can I see the progression of the MSE? I mean extracting the actual numbers for each iteration, not just viewing the graph with plotperform. (A sketch of what I have in mind for questions 2 and 3 follows this list.)
4) How can I save my network so that when I open it again I also have the MSE, confusion, and ROC plots and can view the structure of the network? When I save the object 'net' it saves only the matrices, and after I restart MATLAB I cannot get any of the above. (See the sketch after my code below.)
5) Is the correct approach to calculate the number of nodes, weights, and biases according to this post (calculating biased MSE etc.)? http://www.mathworks.com/matlabcentral/answers/78809-how-to-load-own-data-set-into-neural-network
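For questions 2 and 3, this is the kind of per-epoch bookkeeping I have in mind. It is only a sketch based on my assumptions: training one epoch per call to train, using getwb to snapshot all weights and biases as one vector, and an arbitrary maxEpochs upper bound. Is this a reasonable approach?
    % Sketch only (my assumptions): train one epoch per call so the weights
    % and performance can be recorded after every iteration.
    maxEpochs = 100;                        % arbitrary upper bound on iterations
    net = patternnet(3);
    net = configure(net,X,Target);          % fix the architecture before the loop
    net.trainParam.epochs = 1;              % one epoch per call to train
    net.trainParam.showWindow = false;
    wbHistory  = zeros(length(getwb(net)),maxEpochs);  % column k = weights/biases after epoch k
    mseHistory = zeros(1,maxEpochs);                   % performance after each epoch
    for k = 1:maxEpochs
        net = train(net,X,Target);          % note: data division is redone on every call
        wbHistory(:,k) = getwb(net);
        mseHistory(k)  = perform(net,Target,net(X));
    end
    % Alternatively, a single call [net,tr] = train(net,X,Target) already returns the
    % per-epoch performance in tr.perf/tr.vperf/tr.tperf, and tr.num_epochs says at
    % which iteration training actually stopped.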
Here is the network I created:
And here is the code I have written so far:
    net = patternnet(3);  % originally tried patternnet(3,'trainscg','mse')
    net.trainFcn = 'trainlm';
    net.trainParam.epochs = 2*50;           % Maximum number of epochs to train
    net.trainParam.goal = 0;                % Performance goal
    net.trainParam.max_fail = 150;          % Maximum validation failures
    net.trainParam.min_grad = 1e-70;        % Minimum performance gradient
    net.trainParam.mu = 1e-4;               % Initial mu
    net.trainParam.mu_dec = 0.01;           % mu decrease factor
    net.trainParam.mu_inc = 10;             % mu increase factor
    net.trainParam.mu_max = 1e10;           % Maximum mu
    net.trainParam.show = 25;               % Epochs between displays (NaN for no displays)
    net.trainParam.showCommandLine = true;  % Generate command-line output
    net.trainParam.showWindow = true;       % Show training GUI
    net.trainParam.time = inf;              % Maximum training time
    net = train(net,X,Target);
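For question 4, is something like this the intended way to keep the plots? Again only a sketch, assuming the training record tr returned by train can be saved next to net and used to regenerate the figures after a restart (the file name mynet.mat is just a placeholder):
    % Sketch: save the training record together with the trained network.
    [net,tr] = train(net,X,Target);              % tr is the training record
    save('mynet.mat','net','tr','X','Target');   % 'mynet.mat' is only a placeholder name

    % ... after restarting MATLAB ...
    load('mynet.mat');                 % restores net, tr, X and Target
    view(net);                         % network structure diagram
    outputs = net(X);                  % recompute outputs from the saved network
    plotperform(tr);                   % performance (MSE) vs. epoch
    plotconfusion(Target,outputs);     % confusion matrix
    plotroc(Target,outputs);           % ROC curves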
Thanks in advance.
1 Comment
Greg Heath on 27 May 2016
In ANSWERS thread 78809 the poster had nowhere near enough data to consider a practical NN design (one 8-dimensional input from each of four classes).
Concentrate on understanding the advice given in my answer below.
The key is to accept as many defaults as possible and to use no more hidden nodes than you need.
More MATLAB example data can be obtained by using the documentation commands
 help nndatasets
 doc nndatasets
However, the SIMPLECLASS & IRIS datasets should be sufficient for now.
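For example, a minimal default-settings run on the IRIS data might look like the sketch below (the hidden layer size of 10 is simply the patternnet default, not a tuned choice, and tr.num_epochs is assumed to report where training stopped):
    % Minimal sketch using defaults on the IRIS example data.
    [x,t] = iris_dataset;            % 4 inputs, 3 classes (ships with the toolbox)
    net = patternnet(10);            % 10 hidden nodes is simply the default
    [net,tr] = train(net,x,t);       % accept the default training settings
    y = net(x);
    plotconfusion(t,y);              % check the classification result
    fprintf('Training stopped after %d epochs\n', tr.num_epochs);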
Hope this helps.
Greg
Accepted Answer
Greg Heath on 27 May 2016
        I have contributed to 57 NEWSGROUP threads about PATTERNNET (14 within the past year). Many are of a tutorial nature. I recommend first searching the NEWSGROUP using
 greg patternnet
and concentrating on the most recent 14 posts.
I have also contributed to 381 ANSWERS threads about PATTERNNET (85 within the past year). Most are helping to solve problems encountered by others.
So, dig in. If you have specific questions about any post, please pose your questions in that same thread.
Hope this helps.
Thank you for formally accepting my answer
Greg