Backpropagation neural network
I learned that the activation functions logsig and tansig return values in the ranges [0, 1] and [-1, 1], respectively. What happens if the target values lie outside these ranges?
2 Comments
Mohammad Sami
on 8 Jun 2020
One of the reasons is that target values far outside the activation range can lead to the exploding-gradient problem when training the network.
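Beyond gradient issues, there is a more basic problem: a saturating output unit can never actually reach a target outside its range, so the training error stays bounded away from zero no matter how large the weights grow. A minimal Python sketch (assuming the standard definitions: logsig is the logistic sigmoid 1/(1+exp(-x)), and tansig is equivalent to tanh) illustrates this:

```python
import math

def logsig(x):
    """Logistic sigmoid, 1/(1+exp(-x)); output strictly inside (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Evaluate both activations over a wide input range.
inputs = range(-10, 11)
outputs_logsig = [logsig(x) for x in inputs]
outputs_tansig = [math.tanh(x) for x in inputs]  # tansig == tanh

# The outputs never leave their open intervals, even for large inputs.
assert all(0.0 < y < 1.0 for y in outputs_logsig)
assert all(-1.0 < y < 1.0 for y in outputs_tansig)

# A target of 2.0 is unreachable for a logsig output neuron: the error
# |2.0 - y| is bounded below by 1.0 however large the weights become,
# so training drives the weights toward saturation instead of converging.
target = 2.0
min_error = target - max(outputs_logsig)
```

The usual remedies are to normalize the targets into the activation's range (e.g. with mapminmax in MATLAB) before training, or to use a linear (purelin) output layer for regression targets.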
Answers (0)