newff Create a feed-forward backpropagation network. - Obsoleted in R2010b NNET 7.0.

I have this documentation (below). However, since newff is obsolete, I was wondering how I can create the example feedforward neural network from the documentation with the new commands.
To create a feedforward backpropagation network we can use NEWFF
net = newff(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF)
NEWFF(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF) takes,
PR - Rx2 matrix of min and max values for R input elements.
Si - Size of ith layer, for Nl layers.
TFi - Transfer function of ith layer, default = 'tansig'.
BTF - Backprop network training function, default = 'trainlm'.
BLF - Backprop weight/bias learning function, default = 'learngdm'.
PF - Performance function, default = 'mse'.
and returns an N layer feed-forward backprop network.
Consider this set of data:
p=[-1 -1 2 2;0 5 0 5]
t =[-1 -1 1 1]
where p is input vector and t is target.
Suppose we want to create a feed-forward neural net with one hidden layer of 3 nodes, tangent sigmoid as the transfer function in the hidden layer, a linear function for the output layer, and gradient descent with momentum as the backpropagation training function. Simply use the following command:
» net=newff([-1 2;0 5],[3 1],{'tansig' 'purelin'},'traingdm');
Note that the first input [-1 2;0 5] contains the minimum and maximum values of vector p. We might use minmax(p) instead, especially for a large data set, in which case the command becomes:
» net=newff(minmax(p),[3 1],{'tansig' 'purelin'},'traingdm');
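For completeness, a sketch of training and simulating this network in the same (now obsolete) newff syntax, using the standard companion calls train and sim:

```matlab
% Worked example in the old NNToolbox syntax (sketch; assumes a toolbox
% version where newff, train, and sim are still available).
p = [-1 -1 2 2; 0 5 0 5];   % inputs: 2 elements, 4 samples
t = [-1 -1 1 1];            % targets: 1 element, 4 samples
net = newff(minmax(p), [3 1], {'tansig' 'purelin'}, 'traingdm');
net = train(net, p, t);     % train with gradient descent with momentum
a = sim(net, p);            % simulate the trained network on the inputs
```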

Accepted Answer

Greg Heath
Greg Heath on 3 Jun 2013
There are two obsolete versions of newff. Using defaults for an I-H-O MLP,
net1 = newff(minmax(p),[ H O ]); %very obsolete
net2 = newff( p, t, H ); % obsolete
The latter version has the additional defaults of
1. Removal of constant (zero variance) rows
2. Mapminmax [-1,1] normalization of input and target
3. 70/15/15 train/val/test data division
It is not clear what versions of MATLAB and NNToolbox you have.
However, both obsolete versions should still run. In addition, newfit and newpr are versions of the latter newff that are specialized for regression/curve-fitting and classification/pattern-recognition, respectively.
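For the example in the question, a rough modern equivalent can be sketched with feedforwardnet, the current replacement for newff (the transfer-function assignments below only restate feedforwardnet's defaults):

```matlab
% Sketch: the question's newff example in current NNToolbox syntax.
p = [-1 -1 2 2; 0 5 0 5];
t = [-1 -1 1 1];
net = feedforwardnet(3, 'traingdm');   % one hidden layer with 3 nodes
net.layers{1}.transferFcn = 'tansig';  % hidden layer (default)
net.layers{2}.transferFcn = 'purelin'; % output layer (default)
net = train(net, p, t);
y = net(p);                            % evaluate the trained network
```

Note that feedforwardnet infers the input and output sizes from p and t during training, so minmax(p) is no longer needed; input/target normalization and the 70/15/15 data division are applied by default.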
You can find the documentation for both old and new functions on the MATLAB website. However, the commands help, doc, and type will yield most of the information you need for most functions on your machine.
Hope this helps.
Thank you for formally accepting my answer


More Answers (4)

ben mus
ben mus on 25 Jan 2017
Edited: ben mus on 25 Jan 2017
Hi, I'm a beginner in MATLAB and I work with neural networks. My question is: how do I determine the number of input-layer nodes, hidden-layer nodes, and output-layer nodes?
net=newff(pr,[100 1],{'logsig' 'purelin'}, 'traingda', 'learngdm')
thank you.
  1 Comment
Greg Heath
Greg Heath on 25 Jan 2017
This is not an ANSWER.
Begin a new thread. However, before doing so, run the code in the documentation found at
help newff
doc newff
Other data for additional testing can be obtained from
help nndatasets
doc nndatasets
You can find previous examples by searching both
help greg newff
Hope this helps


raghad ali
raghad ali on 15 Feb 2017
Hello, I'm trying to create an ANN, but I get an error from this line:
net=newff(minmax(x),[10,1],{'tansig','pureline'} 'traingda', 'learngdm');
What is this error, and how can I fix it, please?
  1 Comment
Greg Heath
Greg Heath on 28 Apr 2017
1. Correct the spelling of purelin.
2. Better yet, use as many defaults as possible by starting with the code in the help and doc documentation.
help newff
doc newff
Hope this helps.


sehrish shah
sehrish shah on 7 Sep 2018
How do we create a feedforward network in MATLAB and train it on the KDD data set using the gradient descent algorithm?
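A minimal sketch of that setup, assuming the current-toolbox feedforwardnet with the plain gradient descent training function 'traingd' (the KDD data set itself must be loaded separately; x and t below are placeholders):

```matlab
% Sketch: feedforward net trained with plain gradient descent ('traingd').
% x and t stand in for the KDD features/targets, which you must load yourself.
x = rand(5, 100);                  % placeholder inputs: 5 features, 100 samples
t = rand(1, 100);                  % placeholder targets
net = feedforwardnet(10, 'traingd');
net.trainParam.epochs = 100;       % cap training iterations for the sketch
net = train(net, x, t);
y = net(x);                        % network output on the training inputs
```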

PIRC on 17 Jul 2020
