How can I turn off the activation of a single unit in deep neural networks?

In the Deep Learning Toolbox, how can I turn off the activation of a single unit in one layer and then compute the activations of the whole network?
As an indirect method, I think this could be done by setting all the convolutional weights between that unit and the next layer to 0; a minimal sketch of this idea is shown below.
However, I wonder whether there is a dedicated function for this kind of ablation test.
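For reference, here is a rough sketch of that indirect approach, assuming a trained DAGNetwork called net in which the unit corresponds to one input channel of the following convolution layer; the layer name 'conv_next', the channel index 5, and the input X are placeholders, not part of any real workflow:

% Assumed: net is a trained DAGNetwork, 'conv_next' is the convolution
% layer that follows the unit, and the unit feeds input channel 5 of it.
lgraph = layerGraph(net);
convNext = lgraph.Layers(strcmp({lgraph.Layers.Name}, 'conv_next'));

W = convNext.Weights;             % filterH-by-filterW-by-numChannels-by-numFilters
W(:, :, 5, :) = 0;                % cut every connection leaving the ablated unit
convNext.Weights = W;

lgraph = replaceLayer(lgraph, 'conv_next', convNext);
netAblated = assembleNetwork(lgraph);

% Activations of the whole (ablated) network for some input X
acts = activations(netAblated, X, 'conv_next');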

Answers (1)

Mahesh Taparia on 13 Jul 2020
Hi
You can create a custom layer in MATLAB: in its predict function, handle the particular unit as required for the ablation (for example, set its activation to zero) and apply the usual activation to the rest of the nodes. You can refer to the documentation on defining custom deep learning layers for more information. Change the predict function in that class according to your requirement. Hope it will help!
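A minimal sketch of such a layer, assuming it is inserted right after the layer whose unit you want to ablate and that the unit is one channel of a convolutional feature map (the class name, property name, and channel index are just examples):

classdef zeroUnitLayer < nnet.layer.Layer
    % Custom layer that sets the activation of one channel to zero and
    % passes all other activations through unchanged.

    properties
        ChannelToZero   % index of the channel (unit) to ablate
    end

    methods
        function layer = zeroUnitLayer(name, channelToZero)
            layer.Name = name;
            layer.ChannelToZero = channelToZero;
            layer.Description = "Zero activation of channel " + channelToZero;
        end

        function Z = predict(layer, X)
            % X is H-by-W-by-C-by-N for a convolutional feature map;
            % zero the selected channel, keep everything else unchanged.
            Z = X;
            Z(:, :, layer.ChannelToZero, :) = 0;
        end
    end
end

Inserting it into a layer array could then look like this (layer sizes are placeholders):

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 8, 'Padding', 'same')
    reluLayer
    zeroUnitLayer('ablate5', 5)   % ablate the 5th feature map
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];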

Release

R2020a
