The basic multivariate time series models based on linear autoregressive, moving average models are:
Model Name  Abbreviation  Equation 

Vector Autoregression  VAR(p) 
$${y}_{t}=c+{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{y}_{t-i}}+{\epsilon}_{t}$$

Vector Moving Average  VMA(q) 
$${y}_{t}=c+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{\epsilon}_{t-j}}+{\epsilon}_{t}$$

Vector Autoregression Moving Average  VARMA(p, q) 
$${y}_{t}=c+{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{y}_{t-i}}+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{\epsilon}_{t-j}}+{\epsilon}_{t}$$

Vector Autoregression Moving Average with a linear time trend  VARMA(p, q) 
$${y}_{t}=c+\delta t+{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{y}_{t-i}}+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{\epsilon}_{t-j}}+{\epsilon}_{t}$$

Vector Autoregression Moving Average with eXogenous inputs  VARMAX(p, q) 
$${y}_{t}=c+\beta {x}_{t}+{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{y}_{t-i}}+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{\epsilon}_{t-j}}+{\epsilon}_{t}$$

Structural Vector Autoregression Moving Average  SVARMA(p, q) 
$${\Phi}_{0}{y}_{t}=c+{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{y}_{t-i}}+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{\epsilon}_{t-j}}+{\Theta}_{0}{\epsilon}_{t}$$

The following variables appear in the equations:
y_{t} is the vector of response time series variables at time t. y_{t} has n elements.
c is a constant vector of offsets, with n elements.
Φ_{i} are n-by-n matrices for each i. The Φ_{i} are autoregressive matrices. There are p autoregressive matrices, and some can be entirely composed of zeros.
ε_{t} is the vector of serially uncorrelated innovations at time t, with n elements. The ε_{t} are multivariate normal random vectors with covariance matrix Σ.
Θ_{j} are n-by-n matrices for each j. The Θ_{j} are moving average matrices. There are q moving average matrices, and some can be entirely composed of zeros.
δ is a constant vector of linear time trend coefficients, with n elements.
x_{t} is an r-by-1 vector representing exogenous terms at each time t. r is the number of exogenous series. Exogenous terms are data (or other unmodeled inputs) in addition to the response time series y_{t}. Each exogenous series appears in all response equations.
β is an n-by-r constant matrix of regression coefficients, so the product βx_{t} is a vector with n elements.
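The VAR(p) equation can be made concrete with a short simulation. This is an illustrative sketch in Python with NumPy, not Econometrics Toolbox code (the Toolbox route is varm in MATLAB), and every numeric parameter value below is invented for demonstration:

```python
import numpy as np

# Simulate a bivariate VAR(2): y_t = c + Phi_1 y_{t-1} + Phi_2 y_{t-2} + eps_t.
# All parameter values here are made up for illustration.
rng = np.random.default_rng(0)

n, p, T = 2, 2, 500                      # number of series, lag order, sample size
c = np.array([0.1, -0.2])                # constant offset vector (n elements)
Phi = [np.array([[0.5, 0.1],             # Phi_1: n-by-n autoregressive matrix
                 [0.0, 0.3]]),
       np.array([[0.2, 0.0],             # Phi_2
                 [0.1, 0.1]])]
Sigma = np.array([[1.0, 0.3],            # innovation covariance matrix
                  [0.3, 1.0]])

eps = rng.multivariate_normal(np.zeros(n), Sigma, size=T)
y = np.zeros((T, n))                     # first p rows serve as presample values
for t in range(p, T):
    y[t] = c + sum(Phi[i] @ y[t - 1 - i] for i in range(p)) + eps[t]
```

The loop body is a direct transcription of the VAR(p) equation above; because this parameterization is stable, the simulated series fluctuates around a fixed mean.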
Generally, the time series y_{t} and x_{t} are observable. In other words, if you have data, it represents one or both of these series. You do not always know the offset c, trend coefficient δ, coefficient β, autoregressive matrices Φ_{i}, and moving average matrices Θ_{j}. You typically want to fit these parameters to your data. See estimate for ways to estimate unknown parameters. The innovations ε_{t} are not observable, at least in data, though they can be observable in simulations.
Econometrics Toolbox™ supports the creation and analysis of the VAR(p) model using varm and associated methods.
There is an equivalent representation of the linear autoregressive equations in terms of lag operators. The lag operator L moves the time index back by one: Ly_{t} = y_{t–1}. The operator L^{m} moves the time index back by m: L^{m}y_{t} = y_{t–m}.
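On stored data, the lag operator is simply an index shift. A trivial sketch (illustrative only, not Toolbox code):

```python
import numpy as np

# The lag operator as an index shift on a stored sample:
# L^m applied to y_t returns y_{t-m}.
y = np.arange(10.0)      # pretend y[t] is the scalar observation at time t
t, m = 7, 3
lagged = y[t - m]        # L^m y_t = y_{t-m}
```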
In lag operator form, the equation for a SVARMAX(p, q) model becomes
$$\left({\Phi}_{0}-{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{L}^{i}}\right){y}_{t}=c+\beta {x}_{t}+\left({\Theta}_{0}+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{L}^{j}}\right){\epsilon}_{t}.$$
This equation can be written as
$$\Phi (L){y}_{t}=c+\beta {x}_{t}+\Theta (L){\epsilon}_{t},$$
where
$$\Phi (L)={\Phi}_{0}-{\displaystyle \sum _{i=1}^{p}{\Phi}_{i}{L}^{i}}$$
and
$$\Theta (L)={\Theta}_{0}+{\displaystyle \sum _{j=1}^{q}{\Theta}_{j}{L}^{j}}.$$
A VAR(p) model is stable if
$$\mathrm{det}\left({I}_{n}-{\Phi}_{1}z-{\Phi}_{2}{z}^{2}-\cdots -{\Phi}_{p}{z}^{p}\right)\ne 0\text{ for }\left|z\right|\le 1.$$
This condition implies that, with all innovations equal to zero, the VAR process converges to c as time goes on. See Lütkepohl [80] Chapter 2 for a discussion.
A VMA(q) model is invertible if
$$\mathrm{det}\left({I}_{n}+{\Theta}_{1}z+{\Theta}_{2}{z}^{2}+\cdots +{\Theta}_{q}{z}^{q}\right)\ne 0\text{ for }\left|z\right|\le 1.$$
This condition implies that the pure VAR representation of the process is stable. See Lütkepohl [80] Chapter 11 for a discussion of invertible VMA models.
A VARMA model is stable if its VAR polynomial is stable. Similarly, a VARMA model is invertible if its VMA polynomial is invertible.
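The stability condition can be checked numerically: the determinant condition holding for all |z| ≤ 1 is equivalent to every eigenvalue of the VAR companion matrix having modulus strictly less than 1. A minimal NumPy sketch of that check (illustrative only; with the Toolbox you would examine a varm model instead):

```python
import numpy as np

def var_is_stable(Phi):
    """Check VAR stability via companion-form eigenvalues.

    det(I - Phi_1 z - ... - Phi_p z^p) != 0 for |z| <= 1 is equivalent
    to all eigenvalues of the np-by-np companion matrix having modulus < 1.
    """
    p = len(Phi)
    n = Phi[0].shape[0]
    top = np.hstack(Phi)                    # first block row: [Phi_1 ... Phi_p]
    bottom = np.eye(n * (p - 1), n * p)     # [I 0]: shifts lagged states down
    companion = np.vstack([top, bottom])
    return bool(np.all(np.abs(np.linalg.eigvals(companion)) < 1))
```

For example, a VAR(1) with autoregressive matrix 0.5·I is stable, while one with 1.1·I is not.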
There is no well-defined notion of stability or invertibility for models with exogenous inputs (e.g., VARMAX models). An exogenous input can destabilize a model.
To understand a multiple time series model, or multiple time series data, you generally perform the following steps:
Import and preprocess data.
Specify a model.
See Creating VAR Models to set up a model using varm:
Model Objects with Known Parameters to specify a model with known parameters
Model Objects with No Parameter Values to specify a model when you want MATLAB® to estimate the parameters
Model Objects with Selected Parameter Values to specify a model where you know some parameters, and want MATLAB to estimate the others
Determining an Appropriate Number of Lags to determine an appropriate number of lags for your model
Fit the model to data. See Fitting Models to Data to use estimate to estimate the unknown parameters in your models.
Analyze and forecast using the fitted model. This can involve:
Examining the Stability of a Fitted Model to determine whether your model is stable.
VAR Model Forecasting to forecast directly from models or to forecast using a Monte Carlo simulation.
Calculating Impulse Responses to calculate impulse responses, which give forecasts based on an assumed change in an input to a time series.
Compare the results of your model's forecasts to data held out for forecasting. For an example, see VAR Model Case Study.
Your application need not involve all of the steps in this workflow. For example, you might not have any data, but want to simulate a parameterized model. In that case, you would perform only steps 2 and 4 of the generic workflow.
You might iterate through some of these steps.
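As a rough illustration of the simulate-fit-forecast portion of this workflow, the following NumPy sketch simulates a VAR(1), fits it by multivariate least squares, and produces a one-step forecast. This is not Econometrics Toolbox code (the analogous MATLAB calls are varm, estimate, and forecast), and all parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 2, 2000
c_true = np.array([0.5, -0.5])           # made-up "true" parameters
Phi_true = np.array([[0.6, 0.1],
                     [0.0, 0.4]])

# Simulate a VAR(1) with standard normal innovations.
y = np.zeros((T, n))
for t in range(1, T):
    y[t] = c_true + Phi_true @ y[t - 1] + rng.standard_normal(n)

# Fit y_t = c + Phi y_{t-1} + eps_t by multivariate least squares:
# regress y[1:] on a constant and y[:-1].
X = np.hstack([np.ones((T - 1, 1)), y[:-1]])
B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)   # B stacks [c'; Phi']
c_hat, Phi_hat = B[0], B[1:].T

# One-step-ahead forecast from the fitted model.
y_next = c_hat + Phi_hat @ y[-1]
```

With a sample this long, the least-squares estimates land close to the generating parameters; in practice you would also hold out data and compare forecasts against it, as in the case study referenced above.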