Effective number of parameters in a neural network

Hello,
I'm training a neural network using the Bayesian approach. In the documentation, I read the following: "One feature of this algorithm is that it provides a measure of how many network parameters (weights and biases) are being effectively used by the network."
But I don't quite understand something: once I know the number of effective parameters, what can I do with this information? For starters, how come some of the parameters are not used? Why are some weights inactive? Secondly, can knowing that help me prune the network and reduce the number of neurons, for example? If yes, how? If no, then what is the practical use of that piece of information?
Thanks in advance for your help!
J

Accepted Answer

Greg Heath on 19 May 2013
TRAINBR automatically chooses the weighting ratio applied to the sum of squared weights, which is added to the sum of squared errors to form the objective function. The choice depends on the effective number of weights.
I don't recall the formula; however, you should be able to find it in the source code, its references, or online.
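For reference, here is a minimal sketch of how one might read the effective number of parameters after training with TRAINBR. It assumes the built-in simplefit_dataset sample data and that the training record tr exposes the per-epoch effective parameter count in tr.gamk (field names can vary by toolbox version, so check the fields of tr on your release):

[x, t] = simplefit_dataset;            % built-in sample regression data
net = feedforwardnet(10, 'trainbr');   % one hidden layer, Bayesian regularization
[net, tr] = train(net, x, t);
numTotal     = net.numWeightElements;  % total number of weights and biases
numEffective = tr.gamk(end);           % effective count at the final epoch (assumed field name)
fprintf('Effective parameters: %.1f of %d total\n', numEffective, numTotal);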
The only way I can see you using it is if you use TRAINLM with the regularization option of mse. In that case the user chooses the ratio. However, I don't know of a good reason to do that instead of using TRAINBR.
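A minimal sketch of that TRAINLM alternative, assuming the mse performance function's regularization parameter; the 0.1 ratio below is an arbitrary illustrative value, not a recommendation:

[x, t] = simplefit_dataset;
net = feedforwardnet(10, 'trainlm');     % Levenberg-Marquardt training
net.performFcn = 'mse';
net.performParam.regularization = 0.1;   % user-chosen ratio between 0 and 1 (illustrative)
[net, tr] = train(net, x, t);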
Hope this helps.
Thank you for formally accepting my answer.
Greg
