Custom Training Loops

Customize deep learning training loops and loss functions

If the trainingOptions function does not provide the training options that you need for your task, or custom output layers do not support the loss functions that you need, then you can define a custom training loop. For models that cannot be specified as networks of layers, you can define the model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
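The pattern these functions support can be sketched in a few lines. The following is a minimal illustrative loop, not a complete recipe: the two-layer network, random data, and `modelLoss` function are assumptions chosen only to show how `dlfeval`, `dlgradient`, and `sgdmupdate` fit together.

```matlab
% Minimal custom training loop sketch (illustrative network and random data).
layers = [featureInputLayer(10); fullyConnectedLayer(1)];
net = dlnetwork(layers);

X = dlarray(rand(10,32,"single"),"CB");   % 32 observations, "CB" format
T = dlarray(rand(1,32,"single"),"CB");    % targets

vel = [];                                 % SGDM velocity state
learnRate = 0.01;
momentum = 0.9;

for iteration = 1:50
    % Evaluate the loss and gradients inside dlfeval so that
    % dlgradient can use automatic differentiation.
    [loss,gradients] = dlfeval(@modelLoss,net,X,T);

    % Update the learnable parameters using SGDM.
    [net,vel] = sgdmupdate(net,gradients,vel,learnRate,momentum);
end

function [loss,gradients] = modelLoss(net,X,T)
    Y = forward(net,X);                   % training-mode network output
    loss = mse(Y,T);                      % half mean squared error
    gradients = dlgradient(loss,net.Learnables);
end
```

To use a different solver, swap `sgdmupdate` for `adamupdate`, `rmspropupdate`, or `lbfgsupdate` and carry the corresponding state variables through the loop.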

Functions

dlnetwork - Deep learning neural network
imagePretrainedNetwork - Pretrained neural network for images (Since R2024a)
resnetNetwork - 2-D residual neural network (Since R2024a)
resnet3dNetwork - 3-D residual neural network (Since R2024a)
addLayers - Add layers to neural network
removeLayers - Remove layers from neural network
replaceLayer - Replace layer in neural network
connectLayers - Connect layers in neural network
disconnectLayers - Disconnect layers in neural network
addInputLayer - Add input layer to network (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
plot - Plot neural network architecture
summary - Print network summary (Since R2022b)
analyzeNetwork - Analyze deep learning network architecture
checkLayer - Check validity of custom or function layer
isequal - Check equality of neural networks (Since R2021a)
isequaln - Check equality of neural networks ignoring NaN values (Since R2021a)
forward - Compute deep learning network output for training
predict - Compute deep learning network output for inference
adamupdate - Update parameters using adaptive moment estimation (Adam)
rmspropupdate - Update parameters using root mean squared propagation (RMSProp)
sgdmupdate - Update parameters using stochastic gradient descent with momentum (SGDM)
lbfgsupdate - Update parameters using limited-memory BFGS (L-BFGS) (Since R2023a)
lbfgsState - State of limited-memory BFGS (L-BFGS) solver (Since R2023a)
dlupdate - Update parameters using custom function
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops (Since R2022b)
updateInfo - Update information values for custom training loops (Since R2022b)
recordMetrics - Record metric values for custom training loops (Since R2022b)
groupSubPlot - Group metrics in training plot (Since R2022b)
deep.gpu.deterministicAlgorithms - Set determinism of deep learning operations on the GPU to get reproducible results (Since R2024b)
padsequences - Pad or truncate sequence data to same length (Since R2021a)
minibatchqueue - Create mini-batches for deep learning (Since R2020b)
onehotencode - Encode data labels into one-hot vectors (Since R2020b)
onehotdecode - Decode probability vectors into class labels (Since R2020b)
next - Obtain next mini-batch of data from minibatchqueue (Since R2020b)
reset - Reset minibatchqueue to start of data (Since R2020b)
shuffle - Shuffle data in minibatchqueue (Since R2020b)
hasdata - Determine if minibatchqueue can return mini-batch (Since R2020b)
partition - Partition minibatchqueue (Since R2020b)
dlarray - Deep learning array for customization
dlgradient - Compute gradients for custom training loops using automatic differentiation
dljacobian - Jacobian matrix deep learning operation (Since R2024b)
dldivergence - Divergence of deep learning data (Since R2024b)
dllaplacian - Laplacian of deep learning data (Since R2024b)
dlfeval - Evaluate deep learning model for custom training loops
dims - Data format of dlarray object
finddim - Find dimensions with specified label
stripdims - Remove dlarray data format
extractdata - Extract data from dlarray
isdlarray - Check if object is dlarray (Since R2020b)
crossentropy - Cross-entropy loss for classification tasks
indexcrossentropy - Index cross-entropy loss for classification tasks (Since R2024b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
mse - Half mean squared error
dlaccelerate - Accelerate deep learning function for custom training loops (Since R2021a)
AcceleratedFunction - Accelerated deep learning function (Since R2021a)
clearCache - Clear accelerated deep learning function trace cache (Since R2021a)
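In a full training loop, the data-handling functions above typically feed the solver functions. The following sketch shows how `minibatchqueue` streams formatted `dlarray` mini-batches from a datastore; the random feature matrix and batch settings are illustrative assumptions, not requirements.

```matlab
% Sketch: stream mini-batches from a datastore (random illustrative data).
XData = rand(10,100,"single");            % 10 features, 100 observations
ds = arrayDatastore(XData,IterationDimension=2);

mbq = minibatchqueue(ds, ...
    MiniBatchSize=16, ...
    MiniBatchFormat="CB");                % return formatted dlarray objects

while hasdata(mbq)
    X = next(mbq);                        % one mini-batch as a dlarray
    % ... evaluate the model with dlfeval and update parameters here ...
end
```

Call `shuffle(mbq)` at the start of each epoch to revisit the data in random order, and `reset(mbq)` to restart iteration without shuffling.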

Topics

Custom Training Loops

Automatic Differentiation

Deep Learning Function Acceleration

Related Information

Featured Examples