Multivariate Regression (in time and features) Using LSTM
I am trying to feed an LSTM with different streamflow time series and their delayed sequences for gap filling. Let x be the initial matrix of selected predictors, one per row, with size(x,2) being the number of samples. To introduce time dependence, the predictors are interleaved with their delayed versions (for delays dt = 1:ndt, where ndt is the maximum delay considered), as below:
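A minimal sketch of that construction (assuming x is numPredictors-by-numSamples and ndt is the maximum delay; x1 and the NaN padding at the start of each lagged row are illustrative choices, not code from the post):

```matlab
% Build x1 by interleaving each predictor in x with its delayed copies.
[nPred, nSamp] = size(x);
ndt = 5;                           % example maximum delay (assumption)
x1  = zeros(nPred*(ndt+1), nSamp); % each predictor plus its ndt lags
for p = 1:nPred
    for dt = 0:ndt
        row = (p-1)*(ndt+1) + dt + 1;
        % shift predictor p back by dt samples; pad the start with NaN
        x1(row, :) = [nan(1, dt), x(p, 1:nSamp-dt)];
    end
end
```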
with the respective LSTM:
numFeatures = size(xTrain,1);
numResponses = size(yTrain,1);
numHiddenUnits = 300;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)
    regressionLayer];
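For completeness, training such a network typically pairs those layers with trainingOptions and trainNetwork; the option values below are placeholders, not settings taken from the post:

```matlab
% Hypothetical training call (solver and hyperparameters are assumptions)
options = trainingOptions('adam', ...
    'MaxEpochs', 200, ...
    'InitialLearnRate', 0.005, ...
    'Plots', 'training-progress');
net = trainNetwork(xTrain, yTrain, layers, options);
```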
The target is a row vector y. Is there a more effective arrangement to introduce time dependencies in an LSTM? I mean, I have tried to associate every y instance with a 3D matrix x2 containing the values of x (not of x1) from (t-ndt) to (t):
But I don't know how to adapt the respective LSTM.
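One common way to express that windowed arrangement is sequence-to-one regression: each response y(t) is paired with the short sequence x(:, t-ndt:t) stored in a cell array, and the LSTM uses 'OutputMode','last'. This is a sketch under the assumptions above (x is numPredictors-by-numSamples, y is 1-by-numSamples), not the poster's code:

```matlab
ndt   = 5;                              % maximum delay (assumption)
nSamp = size(x, 2);
XTrain = cell(nSamp-ndt, 1);
YTrain = zeros(nSamp-ndt, 1);
for t = ndt+1:nSamp
    XTrain{t-ndt} = x(:, t-ndt:t);      % numPredictors-by-(ndt+1) window
    YTrain(t-ndt) = y(t);               % one response per window
end
layers = [ ...
    sequenceInputLayer(size(x,1))
    lstmLayer(300, 'OutputMode', 'last') % one output per input sequence
    fullyConnectedLayer(1)
    regressionLayer];
```

With this layout each training observation is its own short sequence, so the time dependence up to lag ndt is explicit in the input rather than encoded as extra interleaved feature rows.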
I know the "Sequence-to-Sequence Regression Using Deep Learning" example, but it does not include explicit time dependencies.