Custom Training Loops

Customize deep learning training loops and loss functions for image networks

If the trainingOptions function does not provide the training options that you need for your task, or custom output layers do not support the loss functions that you need, then you can define a custom training loop. For networks that cannot be created using layer graphs, you can define custom networks as functions. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
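As a minimal sketch of how these pieces fit together, the loop below trains a dlnetwork with dlfeval, dlgradient, and the Adam solver. The layer array layers, the training data XTrain and TTrain, and the hyperparameter values are illustrative assumptions, not part of this page.

```matlab
% Minimal custom training loop sketch. Assumes a layer array "layers"
% and training data "XTrain"/"TTrain" exist; values are illustrative.
net = dlnetwork(layers);

averageGrad = [];
averageSqGrad = [];
learnRate = 0.001;
numIterations = 100;

for iteration = 1:numIterations
    % Convert the data to a formatted dlarray
    % (SSCB: spatial, spatial, channel, batch).
    X = dlarray(single(XTrain), "SSCB");
    T = TTrain;

    % Evaluate the model loss and gradients. Calling the model function
    % through dlfeval enables automatic differentiation inside it.
    [loss, gradients] = dlfeval(@modelLoss, net, X, T);

    % Update the learnable parameters using the Adam solver.
    [net, averageGrad, averageSqGrad] = adamupdate(net, gradients, ...
        averageGrad, averageSqGrad, iteration, learnRate);
end

function [loss, gradients] = modelLoss(net, X, T)
    % Forward pass, loss, and gradients of the loss with respect
    % to the learnable parameters.
    Y = forward(net, X);
    loss = crossentropy(Y, T);
    gradients = dlgradient(loss, net.Learnables);
end
```

In practice you would iterate over mini-batches and epochs rather than passing the full data set each iteration; minibatchqueue, listed below, handles that batching.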

Functions

dlnetwork - Deep learning network for custom training loops
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops
minibatchqueue - Create mini-batches for deep learning
dlarray - Deep learning array for customization
dlgradient - Compute gradients for custom training loops using automatic differentiation
dlfeval - Evaluate deep learning model for custom training loops
crossentropy - Cross-entropy loss for classification tasks
l1loss - L1 loss for regression tasks
l2loss - L2 loss for regression tasks
huber - Huber loss for regression tasks
mse - Half mean squared error
dlconv - Deep learning convolution
dltranspconv - Deep learning transposed convolution
fullyconnect - Sum all weighted input data and apply a bias
batchnorm - Normalize data across all observations for each channel independently
crosschannelnorm - Cross-channel square-normalize using local responses
groupnorm - Normalize data across grouped subsets of channels for each observation independently
instancenorm - Normalize across each channel for each observation independently
layernorm - Normalize data across all channels for each observation independently
avgpool - Pool data to average values over spatial dimensions
maxpool - Pool data to maximum value
maxunpool - Unpool the output of a maximum pooling operation
relu - Apply rectified linear unit activation
leakyrelu - Apply leaky rectified linear unit activation
gelu - Apply Gaussian error linear unit (GELU) activation
softmax - Apply softmax activation to channel dimension
sigmoid - Apply sigmoid activation
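The minibatchqueue function listed above drives the mini-batch iteration inside a custom loop. The sketch below shows the typical epoch/batch pattern; the datastore ds and the epoch count are illustrative assumptions.

```matlab
% Mini-batch iteration sketch with minibatchqueue. Assumes a datastore
% "ds" of image/label pairs exists; names and values are illustrative.
mbq = minibatchqueue(ds, ...
    MiniBatchSize=128, ...
    MiniBatchFormat=["SSCB" "CB"]);   % formats for images and labels

maxEpochs = 5;
for epoch = 1:maxEpochs
    shuffle(mbq);                     % reshuffle the data each epoch
    while hasdata(mbq)
        [X, T] = next(mbq);           % formatted dlarray mini-batch
        % ... evaluate loss and gradients with dlfeval, then update
        % the network parameters, as in a custom training loop ...
    end
end
```

Because next returns formatted dlarray objects, the outputs can be passed directly to functions such as forward, crossentropy, and dlgradient without further conversion.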

Topics

Custom Training Loops

Automatic Differentiation