How many LSTM blocks are there in bidirectional LSTM layers?
Hi,
How can I relate hidden layers to the number of LSTM blocks?
inputSize = 17;
numHiddenUnits = 50;
numClasses = 2;
maxEpochs = 15;
miniBatchSize = 1;

layers = [ ...
    sequenceInputLayer(inputSize)
    bilstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'ExecutionEnvironment','auto', ...
    'GradientThreshold',1, ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'SequenceLength','longest', ...
    'Shuffle','never', ...
    'Verbose',0, ...
    'Plots','training-progress');
Answers (1)
Shantanu Dixit
on 20 Jun 2023
Hi Shweta,
Assuming that by "hidden layers" you mean numHiddenUnits: numHiddenUnits specifies the number of hidden units (often informally called LSTM blocks or cells) per direction, and it has the same meaning for both lstmLayer and bilstmLayer. So here there are 50 LSTM blocks in each direction (numHiddenUnits = 50). Because a BiLSTM concatenates the forward and backward outputs, the layer's output has 2*numHiddenUnits = 100 features.
Refer to the documentation for lstmLayer and bilstmLayer for more details.
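As a quick check, here is a minimal sketch (assuming Deep Learning Toolbox is available) that inspects the layer and confirms the sizes involved; the specific inputSize and numClasses values are just copied from your example:

numHiddenUnits = 50;
layer = bilstmLayer(numHiddenUnits,'OutputMode','last');
disp(layer.NumHiddenUnits)   % 50 hidden units in each direction

% The forward and backward outputs are concatenated, so the
% fullyConnectedLayer that follows the BiLSTM receives
% 2*numHiddenUnits = 100 features.
layers = [ ...
    sequenceInputLayer(17)
    bilstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];
analyzeNetwork(layers)   % activation size after the BiLSTM is 2*numHiddenUnits = 100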