Batch normalization layer
A batch normalization layer normalizes a mini-batch of data across all observations for each channel independently. To speed up training of convolutional neural networks and reduce the sensitivity to network initialization, use batch normalization layers between convolutional layers and nonlinearities, such as ReLU layers.
After normalization, the layer scales the input with a learnable scale factor γ and shifts it by a learnable offset β, as illustrated in the sketch below.
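For example, this layer array (the input size and layer parameters are chosen only for illustration) places a batch normalization layer between a convolutional layer and a ReLU layer:

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];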
layer = batchNormalizationLayer creates a batch normalization layer.

layer = batchNormalizationLayer(Name,Value) creates a batch normalization layer and sets the optional TrainedMean, TrainedVariance, Epsilon, Parameters and Initialization, Learn Rate and Regularization, and Name properties using one or more name-value pairs. For example, batchNormalizationLayer('Name','batchnorm') creates a batch normalization layer with the name 'batchnorm'.
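As a further sketch (the name and epsilon value here are illustrative only), a single call can set several properties at once:

layer = batchNormalizationLayer('Name','batchnorm','Epsilon',1e-4);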
The batch normalization operation normalizes the elements xᵢ of the input by first calculating the mean μB and variance σB² over the spatial, time, and observation dimensions for each channel independently. Then, it calculates the normalized activations as

x̂ᵢ = (xᵢ − μB) / √(σB² + ϵ),

where ϵ is a constant that improves numerical stability when the variance is very small.
To allow for the possibility that inputs with zero mean and unit variance are not optimal for the operations that follow batch normalization, the batch normalization operation further shifts and scales the activations using the transformation

yᵢ = γ x̂ᵢ + β,

where the offset β and scale factor γ are learnable parameters that are updated during network training.
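As a concrete illustration, the following sketch (all variable names and sizes are assumptions for illustration, not part of the layer's API) applies these two equations directly to a height-by-width-by-channel-by-observation array:

X = randn(28,28,16,128);     % example mini-batch: spatial x spatial x channel x observation
gamma = ones(1,1,16);        % learnable per-channel scale factor
beta = zeros(1,1,16);        % learnable per-channel offset
epsilon = 1e-5;              % stability constant

mu = mean(X,[1 2 4]);                        % per-channel mean over spatial and observation dimensions
sigmaSq = var(X,1,[1 2 4]);                  % per-channel variance, normalized by the number of elements
Xhat = (X - mu)./sqrt(sigmaSq + epsilon);    % normalized activations
Y = gamma.*Xhat + beta;                      % scale and shift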
To make predictions with the network after training, batch normalization requires a fixed mean and variance to normalize the data. This fixed mean and variance can be calculated from the training data after training, or approximated during training using a running estimate of the statistics.
If the 'BatchNormalizationStatistics' training option is 'moving', then the software approximates the batch normalization statistics during training using a running estimate and, after training, sets the TrainedMean and TrainedVariance properties to the latest values of the moving estimates of the mean and variance, respectively.
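For example (the solver choice here is illustrative only):

options = trainingOptions('sgdm', ...
    'BatchNormalizationStatistics','moving');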
If the 'BatchNormalizationStatistics' training option is 'population', then after network training finishes, the software passes through the data once more and sets the TrainedMean and TrainedVariance properties to the mean and variance computed from the entire training data set, respectively.
The layer uses the TrainedMean and TrainedVariance properties to normalize the input during prediction.
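As a sketch (XTrain, YTrain, layers, and the layer index are assumptions for illustration), you can inspect these properties after training:

options = trainingOptions('sgdm', ...
    'BatchNormalizationStatistics','population');
net = trainNetwork(XTrain,YTrain,layers,options);   % assumes XTrain and YTrain are defined
bn = net.Layers(3);          % index assumes the batch normalization layer is third
bn.TrainedMean               % fixed mean used at prediction time
bn.TrainedVariance           % fixed variance used at prediction time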
[1] Ioffe, Sergey, and Christian Szegedy. "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift." Preprint, arXiv:1502.03167 (2015).
convolution2dLayer | fullyConnectedLayer | groupNormalizationLayer | layerNormalizationLayer | reluLayer | trainingOptions | trainNetwork