NN accuracy on test set low
I have implemented a neural network in MATLAB R2013a for character recognition. I used the trainbr function for training. 80% of the samples were used for training and the rest for testing. When I plot the confusion matrix, I get 100% accuracy on the training set, but on the test set the accuracy is very low (around 60%). What could possibly be wrong?
Greg Heath
on 13 Mar 2014
Insufficient info:
How many characters?
How many examples for each character?
What are the dimensions of the input and target matrices?
Are the summary statistics of the training and test subsets sufficiently similar?
How many input, hidden and output nodes?
What values of hidden nodes did you try?
How many random weight initializations for each value?
Although trainbr should mitigate the effect of using more hidden nodes than are needed, you still need many trials to establish sufficient confidence intervals.
Hope this helps.
Thank you for formally accepting my answer
Greg
Anitha
on 13 Mar 2014
Greg Heath
on 14 Mar 2014
Sorry, that does not make sense to me. Consider the following:
13 characters of the alphabet A-to-M
234 examples, 18 for each character
All characters are columnized 8x5 images
size(input) = [ 40 234]
size(target) = [ 13 234] % columns of eye(13)
Where did I go wrong?
Anitha
on 15 Mar 2014
Greg Heath
on 15 Mar 2014
[ I N ] = [ 18 234]
[ O N ] = [ 13 234]
The default trn/val/tst split for trainbr is 0.8/0.0/0.2. The resulting number of training equations is
Ntrn = N - round(0.2*N) % 187
Ntrneq = Ntrn*O % 2431
With H=30 hidden nodes, the number of unknown weights is
Nw = (I+1)*H+(H+1)*O % 973
The ratio is
r = Ntrneq/Nw % ~2.5
Which should be ok for trainbr.
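The arithmetic above can be reproduced directly in MATLAB (the dimensions and H = 30 are the values quoted in this thread):

```matlab
% Bookkeeping for the trainbr design discussed above (values from this thread).
I = 18;    % input dimension
N = 234;   % number of examples
O = 13;    % output dimension (one row of eye(13) per class)
H = 30;    % hidden nodes used in the design

Ntst   = round(0.2*N);        % default trainbr split is 0.8/0.0/0.2
Ntrn   = N - Ntst             % 187 training examples
Ntrneq = Ntrn*O               % 2431 training equations
Nw     = (I+1)*H + (H+1)*O    % 973 unknown weights
r      = Ntrneq/Nw            % ~2.5 training equations per weight
```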
I suggest making multiple designs (20?) in a loop with different mixes of training examples, testing examples and initial weights. For examples of multiple designs in a loop, search using
greg Ntrials
Post your code if you still have problems.
Greg Heath
on 19 Mar 2014
Two mistakes
1. No configure statement in the loop
2. Used net instead of bestnet in the last train statement
Greg Heath
on 16 Mar 2014
1. Not necessary to specify default process functions.
2. How did you know my birthdate is 4151941 ??
3. You are reusing the same net for each trial without using CONFIGURE.
Therefore, the initial weights of each trial are the final weights of the last trial.
I suspect that if the design results are not monotonically better, it is because TRAIN is using a new trn/tst division.
4. Use configure after the RNG initialization.
5. An alternate approach is to CONTINUALLY save one or all of
a. the best current RNG state
b. the best current net
c. the best current Wb = getwb(net)
6. I think you should do all three at the same time and compare results
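The advice above (configure after the RNG initialization, then save the best RNG state, net, and weight vector together) could be sketched roughly as follows. This is only an illustrative outline, not code from the thread: patternnet, the variable names (Ntrials, bestnet, x, t), and the use of cross-entropy-style perform on the test indices are my assumptions.

```matlab
% Illustrative multi-trial design loop (assumed setup, not from the thread):
% x is the input matrix, t the target matrix, H the number of hidden nodes.
Ntrials  = 20;
bestperf = Inf;
for trial = 1:Ntrials
    s = rng;                         % capture the RNG state for this trial
    net = patternnet(H, 'trainbr');  % fresh network object each trial
    net = configure(net, x, t);      % configure AFTER setting the RNG, so
                                     % initial weights differ every trial
    [net, tr] = train(net, x, t);
    ytst    = net(x(:, tr.testInd)); % outputs on this trial's test split
    tstperf = perform(net, t(:, tr.testInd), ytst);
    if tstperf < bestperf            % save all three "best" items together
        bestperf  = tstperf;
        beststate = s;               % a. best current RNG state
        bestnet   = net;             % b. best current net
        bestWb    = getwb(net);      % c. best current weight/bias vector
    end
end
```

The point of saving all three is that any one of them is enough to reproduce the best design later: restore the RNG state and retrain, reuse bestnet directly, or reload the weights with setwb.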
Hope this helps.
Thank you for formally accepting my answer
Greg
Anitha
on 17 Mar 2014
Greg Heath
on 19 Mar 2014
The second train statement contains net instead of bestnet
Anitha
on 19 Mar 2014
Greg Heath
on 20 Mar 2014
See my second post in the MATLAB Central thread.