How do I train only one layer of a neural network?

I have a pre-trained neural network, and I want to apply temperature scaling calibration to it.
This means adding one layer in front of the softmax layer and training only the weights of that layer on the validation set.
However, I cannot seem to edit the network so that only this layer trains. I have tried setting all the other learning rates to zero, but even then some things change: the batch normalization layers and the input layers still update.
I even tried setting the overall learning rate to a very small value (1e-12) and the learning rate of my layer to a very large one (1e8), but this does not work either; other parts of the model still change.
This should be a relatively easy task, but I cannot find a way to solve it. Any tips/ideas?

Answers (1)

Shreyansh Mehra
Shreyansh Mehra on 20 Sep 2022
Hello,
The MATLAB Answers link given below explains that to freeze some weights of a neural network, you can set the learning parameters 'BiasLearnRateFactor' and/or 'WeightLearnRateFactor' to zero; it might be of help. Please have a look.
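As a rough illustration of that approach, the sketch below sets every learn-rate factor in the pre-trained layers to zero and then inserts a temperature layer before the softmax layer, so that only the new layer has non-zero learn-rate factors. It assumes a simple series (non-DAG) network; the custom layer temperatureLayer, the data variables XValidation/YValidation, and the training options are placeholders you would replace with your own.

% Minimal sketch: freeze all pre-trained parameters, then train only the
% new temperature layer. Assumes "net" is the pre-trained network and
% "temperatureLayer" is a hypothetical custom layer with one learnable
% scaling parameter.
layers = net.Layers;

% Set every learn-rate factor to zero so the existing weights stay fixed.
factorProps = {'WeightLearnRateFactor','BiasLearnRateFactor', ...
               'ScaleLearnRateFactor','OffsetLearnRateFactor'};
for i = 1:numel(layers)
    for k = 1:numel(factorProps)
        if isprop(layers(i), factorProps{k})
            layers(i).(factorProps{k}) = 0;   % freeze this parameter
        end
    end
end

% Insert the temperature layer just before the softmax layer and leave its
% learn-rate factors at their defaults, so it is the only layer that trains.
idxSoftmax = find(arrayfun(@(l) isa(l,'nnet.cnn.layer.SoftmaxLayer'), layers));
newLayers = [layers(1:idxSoftmax-1); temperatureLayer; layers(idxSoftmax:end)];

% Retrain on the validation set; only the temperature parameter updates.
opts = trainingOptions('adam', 'MaxEpochs', 10, 'InitialLearnRate', 1e-2);
calibratedNet = trainNetwork(XValidation, YValidation, newLayers, opts);

One caveat that may explain what you observed: batch normalization layers recompute their running statistics (TrainedMean and TrainedVariance) during training regardless of the learn-rate factors; setting ScaleLearnRateFactor and OffsetLearnRateFactor to zero only freezes the learnable Scale and Offset parameters.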
