
Build Deep Neural Networks

Build networks programmatically using command-line functions, or interactively using the Deep Network Designer app

Build networks from scratch using MATLAB® code or interactively using the Deep Network Designer app. Use built-in layers to construct networks for tasks such as classification and regression. To see a list of built-in layers, see List of Deep Learning Layers. You can then analyze your network to understand the network architecture and check for problems before training.
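For example, a small image classification network can be assembled from built-in layers, converted to a dlnetwork object, and inspected with analyzeNetwork. This is a minimal sketch; the input size and layer parameters are illustrative, not prescribed:

```matlab
% Assemble a simple classification network from built-in layers.
layers = [
    imageInputLayer([28 28 1])              % grayscale 28-by-28 input
    convolution2dLayer(3,16,Padding="same") % 16 filters of size 3-by-3
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)                 % one output per class
    softmaxLayer];

net = dlnetwork(layers);   % create and initialize the network
analyzeNetwork(net)        % visualize the architecture and check for problems
```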

If the built-in layers do not provide the layer that you need for your task, then you can define your own custom deep learning layer. You can define custom layers with learnable and state parameters. After you define a custom layer, you can check that the layer is valid, GPU compatible, and outputs correctly defined gradients.
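As a sketch of what a custom layer looks like, the hypothetical class below applies a shifted ReLU with one learnable parameter. The layer name and operation are invented for illustration; automatic differentiation supplies the gradients, and checkLayer validates the definition:

```matlab
classdef sreluLayer < nnet.layer.Layer
    % sreluLayer  Hypothetical custom layer: shifted ReLU, max(X,0) + Shift.

    properties (Learnable)
        Shift   % learnable scalar offset
    end

    methods
        function layer = sreluLayer(name)
            layer.Name = name;
            layer.Shift = 0;   % initial value of the learnable parameter
        end

        function Z = predict(layer, X)
            % Forward pass; gradients are computed automatically.
            Z = max(X,0) + layer.Shift;
        end
    end
end
```

You can then validate the layer for an example input size, for instance `checkLayer(sreluLayer("srelu"),[24 24 20],ObservationDimension=4)`, which also tests GPU compatibility and gradient correctness where applicable.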

For models that cannot be specified as networks of layers, you can define the model as a function. For an example showing how to train a deep learning model defined as a function, see Train Network Using Model Function.
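A model function takes the learnable parameters and the input data as arguments and returns the output. The sketch below assumes `parameters` is a struct of dlarray weights created by the caller; the field names are illustrative:

```matlab
function Y = model(parameters, X)
    % Sketch: a two-layer model defined as a function rather than a layer array.
    % parameters is assumed to be a struct of dlarray weights and biases.
    Y = fullyconnect(X, parameters.fc1.Weights, parameters.fc1.Bias);
    Y = relu(Y);
    Y = fullyconnect(Y, parameters.fc2.Weights, parameters.fc2.Bias);
    Y = softmax(Y);
end
```

In a custom training loop, you would evaluate this function and its gradients with dlfeval and dlgradient.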


Apps

Deep Network Designer - Design and visualize deep learning networks



Input Layers

inputLayer - Input layer (Since R2023b)
imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer
sequenceInputLayer - Sequence input layer
featureInputLayer - Feature input layer (Since R2020b)

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer
groupedConvolution2dLayer - 2-D grouped convolutional layer
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer
fullyConnectedLayer - Fully connected layer

Recurrent Layers

lstmLayer - Long short-term memory (LSTM) layer for recurrent neural network (RNN)
bilstmLayer - Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN)
gruLayer - Gated recurrent unit (GRU) layer for recurrent neural network (RNN) (Since R2020a)
lstmProjectedLayer - Long short-term memory (LSTM) projected layer for recurrent neural network (RNN) (Since R2022b)
gruProjectedLayer - Gated recurrent unit (GRU) projected layer for recurrent neural network (RNN) (Since R2023b)

Transformer Layers

selfAttentionLayer - Self-attention layer (Since R2023a)
attentionLayer - Dot-product attention layer (Since R2024a)
positionEmbeddingLayer - Position embedding layer (Since R2023b)
sinusoidalPositionEncodingLayer - Sinusoidal position encoding layer (Since R2023b)
embeddingConcatenationLayer - Embedding concatenation layer (Since R2023b)
indexing1dLayer - 1-D indexing layer (Since R2023b)

Neural ODE Layers

neuralODELayer - Neural ODE layer (Since R2023b)

Activation Layers

reluLayer - Rectified Linear Unit (ReLU) layer
leakyReluLayer - Leaky Rectified Linear Unit (ReLU) layer
preluLayer - Parametrized Rectified Linear Unit (PReLU) layer (Since R2024a)
clippedReluLayer - Clipped Rectified Linear Unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer (Since R2021a)
geluLayer - Gaussian error linear unit (GELU) layer (Since R2022b)
softmaxLayer - Softmax layer
sigmoidLayer - Sigmoid layer (Since R2020b)
functionLayer - Function layer (Since R2021b)

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer (Since R2020b)
instanceNormalizationLayer - Instance normalization layer (Since R2021a)
layerNormalizationLayer - Layer normalization layer (Since R2021a)
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Utility Layers

dropoutLayer - Dropout layer
spatialDropoutLayer - Spatial dropout layer (Since R2024a)
flattenLayer - Flatten layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer (Since R2019b)
networkLayer - Network layer (Since R2024a)

Pooling and Unpooling Layers

averagePooling2dLayer - Average pooling layer
averagePooling3dLayer - 3-D average pooling layer
adaptiveAveragePooling2dLayer - Adaptive average pooling 2-D layer (Since R2024a)
globalAveragePooling2dLayer - 2-D global average pooling layer (Since R2019b)
globalAveragePooling3dLayer - 3-D global average pooling layer (Since R2019b)
globalMaxPooling2dLayer - Global max pooling layer (Since R2020a)
globalMaxPooling3dLayer - 3-D global max pooling layer (Since R2020a)
maxPooling2dLayer - Max pooling layer
maxPooling3dLayer - 3-D max pooling layer
maxUnpooling2dLayer - Max unpooling layer

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer (Since R2020b)
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer
Networks

dlnetwork - Deep learning neural network (Since R2019b)
imagePretrainedNetwork - Pretrained neural network for images (Since R2024a)
resnetNetwork - 2-D residual neural network (Since R2024a)
resnet3dNetwork - 3-D residual neural network (Since R2024a)
addLayers - Add layers to neural network
removeLayers - Remove layers from neural network
replaceLayer - Replace layer in neural network
getLayer - Look up a layer by name or path (Since R2024a)
connectLayers - Connect layers in neural network
disconnectLayers - Disconnect layers in neural network
expandLayers - Expand network layers (Since R2024a)
groupLayers - Group layers into network layers (Since R2024a)
analyzeNetwork - Analyze deep learning network architecture
addInputLayer - Add input layer to network (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
dag2dlnetwork - Convert SeriesNetwork and DAGNetwork to dlnetwork (Since R2024a)
plot - Plot neural network architecture
summary - Print network summary (Since R2022b)
checkLayer - Check validity of custom or function layer
isequal - Check equality of neural networks (Since R2021a)
isequaln - Check equality of neural networks ignoring NaN values (Since R2021a)
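The network construction functions above compose naturally. The sketch below, with illustrative layer names and sizes, adds a skip connection around a convolution by connecting the input layer to the second input of an addition layer, then initializes and summarizes the result:

```matlab
% Build a network with a skip connection using connectLayers.
layers = [
    imageInputLayer([32 32 3], Name="in")
    convolution2dLayer(3, 3, Padding="same", Name="conv")  % 3 filters so sizes match
    additionLayer(2, Name="add")];

% Initialize=false defers initialization until all inputs are connected.
net = dlnetwork(layers, Initialize=false);
net = connectLayers(net, "in", "add/in2");  % skip connection: input -> addition
net = initialize(net);                      % initialize learnable parameters
summary(net)                                % print a summary of the network
```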

