LSTM multi-input time series
Hello, I have a problem using an LSTM for multi-input single-output (MISO) regression: I do not know how to set it up. I have four input time series, each with 720 samples. Here are the approaches I tried and the errors I get:
1. I tried the default LSTM regression in MATLAB R2018a, but the outputs are all equal.
2. I tried the default LSTM for sequence regression, putting the time series into cells with four features and 720 time steps, but I get the following error:
Error using trainNetwork (line 154) Invalid training data. Predictors must be a N-by-1 cell array of sequences, where N is the number of sequences. All sequences must have the same feature dimension and at least one time step.
Error in Untitled (line 55) net = trainNetwork(XTrain,YTrain,layers,options);
Caused by: Error using nnet.internal.cnn.util.NetworkDataValidator/assertValidSequenceInput (line 339) Invalid training data. Predictors must be a N-by-1 cell array of sequences, where N is the number of sequences. All sequences must have the same feature dimension and at least one time step.
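A minimal sketch of the layout that error message describes (x1..x4 and y are placeholder names for the four input series and the target, assuming they are treated as a single sequence with four features):
% Predictors: N-by-1 cell array, each cell is numFeatures-by-numTimeSteps
XTrain = { [x1; x2; x3; x4] };   % 1 sequence, 4 features, 720 time steps -> 4-by-720 matrix
% Responses for sequence-to-sequence regression: matching cell array of 1-by-720 targets
YTrain = { y };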
Any recommendations, please?
Answers (2)
J R on 6 Jul 2018 (edited 6 Jul 2018)
It looks like your first attempt was close; you just needed to change some parameters. As for your second attempt, it seems there is a dimension problem in one of your layers. To help with that I would need to see the code.
Supervised neural networks are designed to minimize the error. They do this by finding a local minimum using gradients. The problem is that a local minimum is generally not the global minimum, so your network can get stuck in a local minimum that fits the expected results very poorly. This is probably because the default pre-training weights and biases are too close to a bad local minimum. I think you should go back to the network design that gave you the poor fit; you can address this by trying different customized weights and biases before training in order to move away from the bad minimum point.
For example:
layers = [ ...
    sequenceInputLayer(n)
    fullyConnectedLayer(n)
    lstmLayer(n2,'OutputMode','sequence')
    lstmLayer(n2,'OutputMode','sequence')
    fullyConnectedLayer(1)
    regressionLayer];
% Custom initialization of the two fully connected layers (layers 2 and 5)
layers(2).Weights = randn([n n])*5;
layers(2).Bias    = randn([n 1])*5 + 0.5;
layers(5).Weights = randn([1 n2])*5;
layers(5).Bias    = randn([1 1])*5 + 0.5;
You should also try changing the training algorithm, using trainingOptions:
options = trainingOptions('sgdm', ...);   % the first argument selects the solver, e.g. 'sgdm', 'rmsprop', or 'adam'
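For instance, a minimal sketch of swapping in the 'adam' solver; the specific option values below are illustrative, not tuned for this problem:
options = trainingOptions('adam', ...          % solver: 'sgdm', 'rmsprop', or 'adam'
    'MaxEpochs', 250, ...                      % illustrative values only
    'InitialLearnRate', 0.005, ...
    'GradientThreshold', 1, ...
    'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, layers, options);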