Feedforward net - how to use Leaky ReLU or scaled exponential linear unit for the hidden layers?
In a multi-layer shallow network built with feedforwardnet, how can I use different activation functions, such as Leaky ReLU or the scaled exponential linear unit (SELU), in the hidden layers? The only transfer function supported by default for the hidden layers seems to be tansig.
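For plain ReLU there is in fact a built-in option: the shallow-network transfer function poslin, which computes max(0, n), can be assigned to each hidden layer through net.layers{i}.transferFcn. A minimal sketch, assuming inputs x and targets t are already defined (the layer sizes here are arbitrary):

net = feedforwardnet([20 20]);          % two hidden layers of 20 neurons each
net.layers{1}.transferFcn = 'poslin';   % poslin(n) = max(0, n), i.e. standard ReLU
net.layers{2}.transferFcn = 'poslin';   % same for the second hidden layer
net = train(net, x, t);                 % x, t: your inputs and targets

Leaky ReLU and SELU have no built-in shallow-network equivalents, so they need a custom transfer function (see the sketch in the comments below).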
2 Comments
Ihsan Ullah
on 3 Apr 2019
Did you get an answer to your question? If you have sorted this out, would you please post the code in the comment section?
Thank you
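Not an official answer, but the commonly suggested workaround is to write a custom transfer function. In recent releases each shallow-network transfer function lives in a package folder (+tansig, +poslin, ...) under toolbox/nnet/nnet/nntransfer/. Copying +poslin and poslin.m to a folder on your MATLAB path, renaming everything (here to the hypothetical name leakyrelu), and editing apply.m and da_dn.m is the usual route. A rough sketch of the two edited files, with the exact file layout depending on your release and the leaky slope 0.01 an arbitrary choice:

% +leakyrelu/apply.m  -- forward pass
function a = apply(n, param)
a = max(0.01*n, n);            % leaky ReLU: n for n >= 0, 0.01*n otherwise
end

% +leakyrelu/da_dn.m  -- derivative used during backpropagation
function d = da_dn(n, a, param)
d = ones(size(n));
d(n < 0) = 0.01;               % slope on the negative side
end

You would then assign it with net.layers{1}.transferFcn = 'leakyrelu';. A SELU would follow the same pattern, with lambda*n for n > 0 and lambda*alpha*(exp(n)-1) otherwise in apply.m and the matching derivative in da_dn.m. Alternatively, if you can move from feedforwardnet to the deep-learning layer API, leakyReluLayer is built in; as far as I know there is no built-in SELU layer there either.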
Answers (0)