setLearnRateFactor

Set learn rate factor of layer learnable parameter

Description

layerUpdated = setLearnRateFactor(layer,parameterName,factor) sets the learn rate factor of the parameter with the name parameterName in layer to factor.

For built-in layers, you can set the learn rate factor directly by using the corresponding property. For example, for a convolution2dLayer layer, the syntax layer = setLearnRateFactor(layer,'Weights',factor) is equivalent to layer.WeightLearnRateFactor = factor.
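
As a quick check, the following sketch shows the two equivalent approaches on a convolution layer; after either assignment, getLearnRateFactor returns the same value.

% Two equivalent ways to set the learn rate factor of a built-in layer's weights
layer = convolution2dLayer(3,16);

% Using setLearnRateFactor
layer = setLearnRateFactor(layer,'Weights',2);

% Using the layer property directly
layer.WeightLearnRateFactor = 2;

% Both approaches yield the same factor
factor = getLearnRateFactor(layer,'Weights')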

layerUpdated = setLearnRateFactor(layer,parameterPath,factor) sets the learn rate factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a dlnetwork object in a custom layer.

dlnetUpdated = setLearnRateFactor(dlnet,layerName,parameterName,factor) sets the learn rate factor of the parameter with the name parameterName in the layer with name layerName for the specified dlnetwork object.

dlnetUpdated = setLearnRateFactor(dlnet,parameterPath,factor) sets the learn rate factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a nested layer.

Examples

Set and get the learning rate factor of a learnable parameter of a custom PReLU layer.

Define a custom PReLU layer. To create this layer, save the file preluLayer.m in the current folder.
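
If you do not have preluLayer.m, a minimal sketch of such a layer is shown below. It assumes a custom layer with a single learnable parameter Alpha and the constructor signature preluLayer(numChannels,name) used in this example; it is an illustration, not the supporting file itself.

classdef preluLayer < nnet.layer.Layer
    % Minimal PReLU layer sketch with a learnable scaling parameter Alpha.
    properties (Learnable)
        Alpha
    end
    methods
        function layer = preluLayer(numChannels,name)
            % Set the layer name and description, and initialize Alpha.
            layer.Name = name;
            layer.Description = "PReLU with " + numChannels + " channels";
            layer.Alpha = rand([1 1 numChannels]);
        end
        function Z = predict(layer,X)
            % Pass positive values through and scale negative values by Alpha.
            Z = max(X,0) + layer.Alpha .* min(0,X);
        end
    end
end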

Create a layer array including the custom layer preluLayer.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    batchNormalizationLayer
    preluLayer(20,'prelu')
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

Set the learn rate factor of the 'Alpha' learnable parameter of the preluLayer to 2.

layers(4) = setLearnRateFactor(layers(4),'Alpha',2);

View the updated learn rate factor.

factor = getLearnRateFactor(layers(4),'Alpha')
factor = 2

Set and get the learning rate factor of a learnable parameter of a nested layer.

Create a residual block layer using the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a Live Script.

inputSize = [224 224 64];
numFilters = 64;
layer = residualBlockLayer(inputSize,numFilters)
layer = 
  residualBlockLayer with properties:

       Name: ''

   Learnable Parameters
    Network: [1x1 dlnetwork]

View the layers of the nested network.

layer.Network.Layers
ans = 
  8x1 Layer array with layers:

     1   'in'      Image Input           224x224x64 images
     2   'conv1'   Convolution           64 3x3x64 convolutions with stride [1  1] and padding 'same'
     3   'gn1'     Group Normalization   Group normalization with 64 channels split into 1 groups
     4   'relu1'   ReLU                  ReLU
     5   'conv2'   Convolution           64 3x3x64 convolutions with stride [1  1] and padding 'same'
     6   'gn2'     Group Normalization   Group normalization with 64 channels split into 64 groups
     7   'add'     Addition              Element-wise addition of 2 inputs
     8   'relu2'   ReLU                  ReLU

Set the learning rate factor of the learnable parameter 'Weights' of the layer 'conv1' to 2 using the setLearnRateFactor function.

factor = 2;
layer = setLearnRateFactor(layer,'Network/conv1/Weights',factor);

Get the updated learning rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(layer,'Network/conv1/Weights')
factor = 2

Set and get the learning rate factor of a learnable parameter of a dlnetwork object.

Create a dlnetwork object.

layers = [
    imageInputLayer([28 28 1],'Normalization','none','Name','in')
    convolution2dLayer(5,20,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','sm')];

lgraph = layerGraph(layers);

dlnet = dlnetwork(lgraph);

Set the learn rate factor of the 'Weights' learnable parameter of the convolution layer to 2 using the setLearnRateFactor function.

factor = 2;
dlnet = setLearnRateFactor(dlnet,'conv','Weights',factor);

Get the updated learn rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(dlnet,'conv','Weights')
factor = 2

Set and get the learning rate factor of a learnable parameter of a nested layer in a dlnetwork object.

Create a dlnetwork object containing the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a Live Script.

inputSize = [224 224 3];
numFilters = 32;
numClasses = 5;

layers = [
    imageInputLayer(inputSize,'Normalization','none','Name','in')
    convolution2dLayer(7,numFilters,'Stride',2,'Padding','same','Name','conv')
    groupNormalizationLayer('all-channels','Name','gn')
    reluLayer('Name','relu')
    maxPooling2dLayer(3,'Stride',2,'Name','max')
    residualBlockLayer([56 56 numFilters],numFilters,'Name','res1')
    residualBlockLayer([56 56 numFilters],numFilters,'Name','res2')
    residualBlockLayer([56 56 numFilters],2*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res3')
    residualBlockLayer([28 28 2*numFilters],2*numFilters,'Name','res4')
    residualBlockLayer([28 28 2*numFilters],4*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res5')
    residualBlockLayer([14 14 4*numFilters],4*numFilters,'Name','res6')
    globalAveragePooling2dLayer('Name','gap')
    fullyConnectedLayer(numClasses,'Name','fc')
    softmaxLayer('Name','sm')];

lgraph = layerGraph(layers);
dlnet = dlnetwork(lgraph);

View the layers of the nested network in the layer 'res1'.

dlnet.Layers(6).Network.Layers
ans = 
  8x1 Layer array with layers:

     1   'in'      Image Input           56x56x32 images
     2   'conv1'   Convolution           32 3x3x32 convolutions with stride [1  1] and padding 'same'
     3   'gn1'     Group Normalization   Group normalization with 32 channels split into 1 groups
     4   'relu1'   ReLU                  ReLU
     5   'conv2'   Convolution           32 3x3x32 convolutions with stride [1  1] and padding 'same'
     6   'gn2'     Group Normalization   Group normalization with 32 channels split into 32 groups
     7   'add'     Addition              Element-wise addition of 2 inputs
     8   'relu2'   ReLU                  ReLU

Set the learning rate factor of the learnable parameter 'Weights' of the layer 'conv1' to 2 using the setLearnRateFactor function.

factor = 2;
dlnet = setLearnRateFactor(dlnet,'res1/Network/conv1/Weights',factor);

Get the updated learning rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(dlnet,'res1/Network/conv1/Weights')
factor = 2

Load a pretrained network.

net = squeezenet;

Convert the network to a layer graph, remove the output layer, and convert it to a dlnetwork object.

lgraph = layerGraph(net);
lgraph = removeLayers(lgraph,'ClassificationLayer_predictions');
dlnet = dlnetwork(lgraph);

The Learnables property of the dlnetwork object is a table that contains the learnable parameters of the network. The table includes parameters of nested layers in separate rows. View the first few rows of the learnables table.

learnables = dlnet.Learnables;
head(learnables)
ans=8×3 table
          Layer           Parameter           Value       
    __________________    _________    ___________________

    "conv1"               "Weights"    {3x3x3x64  dlarray}
    "conv1"               "Bias"       {1x1x64    dlarray}
    "fire2-squeeze1x1"    "Weights"    {1x1x64x16 dlarray}
    "fire2-squeeze1x1"    "Bias"       {1x1x16    dlarray}
    "fire2-expand1x1"     "Weights"    {1x1x16x64 dlarray}
    "fire2-expand1x1"     "Bias"       {1x1x64    dlarray}
    "fire2-expand3x3"     "Weights"    {3x3x16x64 dlarray}
    "fire2-expand3x3"     "Bias"       {1x1x64    dlarray}

To freeze the learnable parameters of the network, loop over the learnable parameters and set the learn rate factor to 0 using the setLearnRateFactor function.

factor = 0;

numLearnables = size(learnables,1);
for i = 1:numLearnables
    layerName = learnables.Layer(i);
    parameterName = learnables.Parameter(i);
    
    dlnet = setLearnRateFactor(dlnet,layerName,parameterName,factor);
end

To use the updated learn rate factors when training, you must pass the dlnetwork object to the update function in the custom training loop. For example, use the command

[dlnet,velocity] = sgdmupdate(dlnet,gradients,velocity);
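
For context, a minimal sketch of one iteration of such a loop follows. It assumes a model gradients function named modelGradients (not defined here), dlarray inputs dlX and dlY, and an initialized velocity variable. Because the learn rate factors are stored in dlnet, the update applies a factor of 0 to the frozen parameters and leaves them unchanged.

% One iteration of a custom training loop (sketch; modelGradients is assumed)
[gradients,loss] = dlfeval(@modelGradients,dlnet,dlX,dlY);
[dlnet,velocity] = sgdmupdate(dlnet,gradients,velocity);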

Input Arguments

Input layer, specified as a scalar Layer object.

Parameter name, specified as a character vector or a string scalar.

Learning rate factor for the parameter, specified as a nonnegative scalar.

The software multiplies this factor by the global learning rate to determine the learning rate for the specified parameter. For example, if factor is 2, then the learning rate for the specified parameter is twice the current global learning rate. The software determines the global learning rate based on the settings specified with the trainingOptions function.

Example: 2
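
For instance, if the global learning rate is 0.01 and the factor is 2, the parameter trains with an effective learning rate of 0.02. A hypothetical sketch:

% With an initial learning rate of 0.01 and a factor of 2, the 'Weights'
% parameter of this layer trains at an effective rate of 2 * 0.01 = 0.02.
options = trainingOptions('sgdm','InitialLearnRate',0.01);
layer = setLearnRateFactor(convolution2dLayer(3,16),'Weights',2);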

Path to parameter in nested layer, specified as a string scalar or a character vector. A nested layer is a custom layer that itself defines a layer graph as a learnable parameter.

If the input to setLearnRateFactor is a nested layer, then the parameter path has the form "propertyName/layerName/parameterName", where:

  • propertyName is the name of the property containing a dlnetwork object

  • layerName is the name of the layer in the dlnetwork object

  • parameterName is the name of the parameter

If there are multiple levels of nested layers, then specify each level using the form "propertyName1/layerName1/.../propertyNameN/layerNameN/parameterName", where propertyName1 and layerName1 correspond to the layer in the input to the setLearnRateFactor function, and the subsequent parts correspond to the deeper levels.

Example: For layer input to setLearnRateFactor, the path "Network/conv1/Weights" specifies the "Weights" parameter of the layer with name "conv1" in the dlnetwork object given by layer.Network.

If the input to setLearnRateFactor is a dlnetwork object and the desired parameter is in a nested layer, then the parameter path has the form "layerName1/propertyName/layerName/parameterName", where:

  • layerName1 is the name of the layer in the input dlnetwork object

  • propertyName is the property of the layer containing a dlnetwork object

  • layerName is the name of the layer in the dlnetwork object

  • parameterName is the name of the parameter

If there are multiple levels of nested layers, then specify each level using the form "layerName1/propertyName1/.../layerNameN/propertyNameN/layerName/parameterName", where layerName1 and propertyName1 correspond to the layer in the input to the setLearnRateFactor function, and the subsequent parts correspond to the deeper levels.

Example: For dlnetwork input to setLearnRateFactor, the path "res1/Network/conv1/Weights" specifies the "Weights" parameter of the layer with name "conv1" in the dlnetwork object given by layer.Network, where layer is the layer with name "res1" in the input network dlnet.

Data Types: char | string

Network for custom training loops, specified as a dlnetwork object.

Layer name, specified as a string scalar or a character vector.

Data Types: char | string

Output Arguments

Updated layer, returned as a Layer.

Updated network, returned as a dlnetwork.

Introduced in R2017b