Deep Network Customization for Images

Customize deep learning training loops and loss functions

If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss functions that you need, then you can define a custom training loop using automatic differentiation. For networks that cannot be created using layer graphs, you can define a custom network as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
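
A minimal sketch of such a loop follows. It assumes XTrain (a 28-by-28-by-1-by-N single array) and TTrain (an N-by-1 categorical vector) already exist; the architecture and hyperparameters are illustrative placeholders, not recommendations.

% Define a small classification network as a dlnetwork.
layers = [
    imageInputLayer([28 28 1],Normalization="none")
    convolution2dLayer(3,16,Padding="same")
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer];
net = dlnetwork(layers);

numEpochs = 5;                 % illustrative values
miniBatchSize = 128;
averageGrad = [];
averageSqGrad = [];
iteration = 0;
numObservations = size(XTrain,4);

for epoch = 1:numEpochs
    idx = randperm(numObservations);
    for i = 1:floor(numObservations/miniBatchSize)
        iteration = iteration + 1;
        batch = idx((i-1)*miniBatchSize+1:i*miniBatchSize);

        % Format the batch as a dlarray: spatial, spatial, channel, batch.
        X = dlarray(single(XTrain(:,:,:,batch)),"SSCB");
        T = onehotencode(TTrain(batch),2)';   % classes-by-batch targets

        % Evaluate the loss and gradients inside dlfeval so that
        % dlgradient can trace the computation.
        [loss,gradients] = dlfeval(@modelLoss,net,X,T);

        % Update the learnable parameters with Adam.
        [net,averageGrad,averageSqGrad] = ...
            adamupdate(net,gradients,averageGrad,averageSqGrad,iteration);
    end
end

function [loss,gradients] = modelLoss(net,X,T)
    Y = forward(net,X);           % training-mode forward pass
    loss = crossentropy(Y,T);     % classification loss
    gradients = dlgradient(loss,net.Learnables);
end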

Functions

dlnetwork - Deep learning network for custom training loops
forward - Compute deep learning network output for training
predict - Compute deep learning network output for inference
adamupdate - Update parameters using adaptive moment estimation (Adam)
rmspropupdate - Update parameters using root mean squared propagation (RMSProp)
sgdmupdate - Update parameters using stochastic gradient descent with momentum (SGDM)
dlupdate - Update parameters using custom function
minibatchqueue - Create mini-batches for deep learning
onehotencode - Encode data labels into one-hot vectors
onehotdecode - Decode probability vectors into class labels
initialize - Initialize learnable and state parameters of a dlnetwork
plot - Plot neural network architecture
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
replaceLayer - Replace layer in layer graph or network
summary - Print network summary
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops
dlarray - Deep learning array for custom training loops
dlgradient - Compute gradients for custom training loops using automatic differentiation
dlfeval - Evaluate deep learning model for custom training loops
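
As one example of how these functions fit together, the sketch below uses minibatchqueue to batch and format data for a loop like the one above. The variable names (XTrain, TTrain) and the preprocessing function preprocessBatch are assumptions for illustration.

% Build datastores over the images and labels, then combine them.
dsX = arrayDatastore(XTrain,IterationDimension=4);
dsT = arrayDatastore(TTrain);
ds  = combine(dsX,dsT);

% minibatchqueue handles batching, preprocessing, and formatting.
mbq = minibatchqueue(ds, ...
    MiniBatchSize=128, ...
    MiniBatchFcn=@preprocessBatch, ...
    MiniBatchFormat=["SSCB" ""]);

while hasdata(mbq)
    [X,T] = next(mbq);
    % ... evaluate loss and gradients with dlfeval, update with adamupdate ...
end

function [X,T] = preprocessBatch(XCell,TCell)
    X = cat(4,XCell{:});                   % stack images along the batch dimension
    T = onehotencode(cat(1,TCell{:}),2)';  % classes-by-batch one-hot targets
end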

Input Layers

imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer
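
Input layers declare the expected observation size; the sizes below are illustrative.

layer2d = imageInputLayer([224 224 3]);      % height-by-width-by-channels images
layer3d = image3dInputLayer([64 64 64 1]);   % height-by-width-by-depth-by-channels volumes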

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer
groupedConvolution2dLayer - 2-D grouped convolutional layer
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer
fullyConnectedLayer - Fully connected layer
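
A short sketch, with illustrative sizes, of how these layers combine: strided convolution downsamples, grouped convolution splits channels into groups, and transposed convolution upsamples.

layers = [
    imageInputLayer([64 64 3])
    convolution2dLayer(3,16,Stride=2,Padding="same")      % downsample by 2
    reluLayer
    groupedConvolution2dLayer(3,4,4,Padding="same")       % 4 groups, 4 filters per group
    reluLayer
    transposedConv2dLayer(4,3,Stride=2,Cropping="same")]; % upsample by 2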

Activation Layers

reluLayer - Rectified Linear Unit (ReLU) layer
leakyReluLayer - Leaky Rectified Linear Unit (ReLU) layer
clippedReluLayer - Clipped Rectified Linear Unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer
geluLayer - Gaussian error linear unit (GELU) layer
functionLayer - Function layer
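
Most of these are drop-in activation layers; functionLayer additionally wraps an arbitrary operation as a layer. A sketch using a hypothetical scaled-tanh activation:

layer = functionLayer(@(X) 1.7159*tanh((2/3)*X),Name="scaled_tanh");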

Normalization, Dropout, and Cropping Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer
instanceNormalizationLayer - Instance normalization layer
layerNormalizationLayer - Layer normalization layer
crossChannelNormalizationLayer - Channel-wise local response normalization layer
dropoutLayer - Dropout layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer
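
Normalization layers typically sit between convolution and activation, with dropout added for regularization. A common block pattern, with illustrative sizes and probability:

block = [
    convolution2dLayer(3,32,Padding="same")
    batchNormalizationLayer
    reluLayer
    dropoutLayer(0.2)];   % drop 20% of activations during training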

Pooling and Unpooling Layers

averagePooling2dLayer - 2-D average pooling layer
averagePooling3dLayer - 3-D average pooling layer
globalAveragePooling2dLayer - 2-D global average pooling layer
globalAveragePooling3dLayer - 3-D global average pooling layer
globalMaxPooling2dLayer - 2-D global max pooling layer
globalMaxPooling3dLayer - 3-D global max pooling layer
maxPooling2dLayer - 2-D max pooling layer
maxPooling3dLayer - 3-D max pooling layer
maxUnpooling2dLayer - 2-D max unpooling layer
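
A sketch, with illustrative sizes, of max pooling for downsampling and global average pooling to collapse the spatial dimensions before a classifier:

layers = [
    maxPooling2dLayer(2,Stride=2)            % halve the spatial size
    convolution2dLayer(3,64,Padding="same")
    reluLayer
    globalAveragePooling2dLayer              % 1-by-1-by-64 output
    fullyConnectedLayer(10)];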

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer
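
Combination layers take multiple inputs, so they are wired up with connectLayers. A minimal residual-style block (layer names are illustrative):

lgraph = layerGraph([
    imageInputLayer([32 32 3],Normalization="none",Name="in")
    convolution2dLayer(3,16,Padding="same",Name="conv1")
    reluLayer(Name="relu1")
    convolution2dLayer(3,16,Padding="same",Name="conv2")
    additionLayer(2,Name="add")
    reluLayer(Name="relu2")]);

% Route the relu1 output around conv2 into the addition layer's second input.
lgraph = connectLayers(lgraph,"relu1","add/in2");
net = dlnetwork(lgraph);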
