How to connect a 1-D convolution layer after an LSTM layer with output mode "last"?
I am working on classifying sEMG data, and I want to build an LSTM-CNN network. I want to first apply an LSTM layer and feed its last output into a 1-D convolution layer.
layers = [ ...
sequenceInputLayer(sEMG_channels)
lstmLayer(numHiddenUnits,'OutputMode','sequence')
lstmLayer(numHiddenUnits,'OutputMode','last')
convolution1dLayer(3,25,'Padding','same')
batchNormalizationLayer
reluLayer
dropoutLayer(0.4)
convolution1dLayer(3,10,'Padding','same')
batchNormalizationLayer
reluLayer
fullyConnectedLayer(23)
softmaxLayer
classificationLayer];
However, when I run it, it returns an error saying that the convolution1dLayer input has 0 temporal dimensions and 0 spatial dimensions.
My input data XTrain is a 1000×1 cell array, and each cell contains a 48×30000 double matrix, meaning my sEMG data has 48 channels and 30000 time points.
What does the output of lstmLayer(numHiddenUnits,'OutputMode','last') look like?
How can I connect the LSTM to the CNN?
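For context, a minimal sketch (the hidden-unit count and layer names here are illustrative assumptions, not from the thread) showing what 'OutputMode','last' actually emits: one numHiddenUnits-by-1 vector per observation, with format 'CB' and no 'T' dimension, which is why convolution1dLayer reports zero temporal and spatial dimensions:

```matlab
% Sketch: inspect the activation after an LSTM with 'OutputMode','last'.
% numHiddenUnits = 100 is an assumed value for illustration.
numHiddenUnits = 100;
layers = [ ...
    sequenceInputLayer(48, 'Name', 'in')
    lstmLayer(numHiddenUnits, 'OutputMode', 'last', 'Name', 'lstm')];
net = dlnetwork(layers);

% One observation: 48 channels, batch of 1, 30000 time steps.
X = dlarray(rand(48, 1, 30000), 'CBT');
Y = predict(net, X);

size(Y)   % numHiddenUnits-by-1
dims(Y)   % 'CB' -- only channel and batch remain; the 'T' dimension is gone,
          % so a following convolution1dLayer has nothing to slide over
```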
3 Comments
Manikanta Aditya
on 8 Apr 2024
Note that there is no built-in reshapeLayer in Deep Learning Toolbox. One way to reintroduce a dimension for the convolution to slide over is a functionLayer (R2021b+) that relabels the LSTM's hidden-unit dimension as time:
layers = [ ...
sequenceInputLayer(48) % Assuming sEMG_channels = 48
lstmLayer(numHiddenUnits, 'OutputMode', 'last')
% 'last' output is C-by-B with no time dimension; relabel the hidden-unit
% dimension as 'T' so the 1-D convolution slides over the LSTM features
functionLayer(@(X) dlarray(stripdims(X), 'TBC'), ...
    'Formattable', true, 'Name', 'to_time')
convolution1dLayer(3, 25, 'Padding', 'same')
batchNormalizationLayer
reluLayer
dropoutLayer(0.4)
convolution1dLayer(3, 10, 'Padding', 'same')
batchNormalizationLayer
reluLayer
globalAveragePooling1dLayer % collapse the relabeled time dimension again
fullyConnectedLayer(23)
softmaxLayer
classificationLayer];
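An alternative worth considering (a sketch, not an answer given in the thread) avoids reshaping altogether: keep 'OutputMode','sequence' so the LSTM output retains its time dimension, let the 1-D convolutions slide over time, and collapse time with a global pooling layer just before the classifier. The hidden-unit count is an assumed value; the channel and class counts follow the question.

```matlab
% Sketch: LSTM -> CNN with the time dimension preserved throughout.
% numHiddenUnits = 100 is an assumption; 48 channels / 23 classes follow the question.
numHiddenUnits = 100;
layers = [ ...
    sequenceInputLayer(48)
    lstmLayer(numHiddenUnits, 'OutputMode', 'sequence')  % keep 'T' for conv1d
    convolution1dLayer(3, 25, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    dropoutLayer(0.4)
    convolution1dLayer(3, 10, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    globalAveragePooling1dLayer  % collapse time before classification
    fullyConnectedLayer(23)
    softmaxLayer
    classificationLayer];
```

With this ordering the convolutions see all 30000 time steps rather than a single summary vector, which is usually what is wanted when mixing recurrent and convolutional feature extraction.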
Answers (0)